dD Spatial Searching

Authors
Hans Tangelder and Andreas Fabri

Introduction

The spatial searching package implements exact and approximate distance browsing by providing implementations of algorithms supporting

  • both nearest and furthest neighbor searching

  • both exact and approximate searching

  • (approximate) range searching

  • (approximate) \( k\)-nearest and \( k\)-furthest neighbor searching

  • (approximate) incremental nearest and incremental furthest neighbor searching

  • query items representing points and spatial objects.

In these searching problems a set of data points \( P\) in \( d\)-dimensional space is given. The points can be represented by Cartesian coordinates or homogeneous coordinates. These points are preprocessed into a tree data structure, so that given any query item \( q\) the points of \( P\) can be browsed efficiently. The approximate spatial searching package is designed for data sets that are small enough to store the search structure in main memory (in contrast to approaches from databases that assume that the data reside in secondary storage).

Neighbor Searching

Spatial searching supports browsing through a collection of \( d\)-dimensional spatial objects stored in a spatial data structure on the basis of their distances to a query object. The query object may be a point or an arbitrary spatial object, e.g., a \( d\)-dimensional sphere. The objects in the spatial data structure are \( d\)-dimensional points.

Often the number of neighbors to be computed is not known beforehand, e.g., because the number may depend on some properties of the neighbors (for example, when querying for the nearest city to Paris with population greater than a million) or on the distance to the query point. The conventional approach is \( k\)-nearest neighbor searching, which makes use of a \( k\)-nearest neighbor algorithm, where \( k\) is known prior to the invocation of the algorithm. Hence, the number of nearest neighbors has to be guessed. If the guess is too large, redundant computations are performed. If the number is too small, the computation has to be re-invoked for a larger number of neighbors, thereby performing redundant computations. Therefore, Hjaltason and Samet [5] introduced incremental nearest neighbor searching in the sense that having obtained the \( k\) nearest neighbors, the \( (k+1)\)-st neighbor can be obtained without having to calculate the \( k+1\) nearest neighbors from scratch.

Spatial searching typically consists of a preprocessing phase and a searching phase. In the preprocessing phase one builds a search structure and in the searching phase one makes the queries. In the preprocessing phase the user builds a tree data structure storing the spatial data. In the searching phase the user invokes a searching method to browse the spatial data.

With relatively minor modifications, nearest neighbor searching algorithms can be used to find the furthest object from the query object. Therefore, furthest neighbor searching is also supported by the spatial searching package.

The execution time for exact neighbor searching can be reduced by relaxing the requirement that the neighbors be computed exactly. If the distances of two objects to the query object are approximately the same, instead of computing the nearest/furthest neighbor exactly, one of these objects may be returned as the approximate nearest/furthest neighbor. That is, given some non-negative constant \( \epsilon\), the distance of an object returned as an approximate \( k\)-nearest neighbor must not be larger than \( (1+\epsilon)r\), where \( r\) denotes the distance to the real \( k\)-th nearest neighbor. Similarly, the distance of an approximate \( k\)-furthest neighbor must not be smaller than \( r/(1+\epsilon)\). Obviously, for \( \epsilon=0\) we get the exact result, and the larger \( \epsilon\) is, the less exact the result.
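The approximation parameter \( \epsilon\) is simply passed to the search object. The following is a minimal sketch assuming 2-dimensional Cartesian points; the traits class, the data, and the value \( \epsilon = 0.1\) are illustrative choices, not part of the package's own examples.

#include <CGAL/Simple_cartesian.h>
#include <CGAL/Search_traits_2.h>
#include <CGAL/Orthogonal_k_neighbor_search.h>
#include <vector>
#include <iostream>

typedef CGAL::Simple_cartesian<double> K;
typedef K::Point_2 Point_2;
typedef CGAL::Search_traits_2<K> Traits;
typedef CGAL::Orthogonal_k_neighbor_search<Traits> Neighbor_search;
typedef Neighbor_search::Tree Tree;

int main() {
  std::vector<Point_2> points;
  for (int i = 1; i <= 50; ++i) points.push_back(Point_2(i, 2*i));
  Tree tree(points.begin(), points.end());
  Point_2 query(0, 0);
  const unsigned int k = 3;
  const double eps = 0.1; // reported distances may be off by a factor of at most 1+eps
  Neighbor_search nearest(tree, query, k, eps);         // approximate k-nearest
  Neighbor_search furthest(tree, query, k, eps, false); // approximate k-furthest
  for (Neighbor_search::iterator it = nearest.begin(); it != nearest.end(); ++it)
    std::cout << it->first << " squared distance " << it->second << std::endl;
  return 0;
}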

While searching for the nearest neighbor the algorithm descends the kd-tree and has to decide two things for each node: which child node should be visited first, and whether the other child could contain nearest neighbors at all. This basically comes down to computing the distance to the further child, because the distance to the closer child is the same as the one to the parent. There are two options now:

  1. In general, we compute the distance with the given metric. This is the k-neighbor search with a general distance class.
  2. For point queries we can "update" the distance, because it is only changed in one dimension at a time. This is the orthogonal k-neighbor search with an orthogonal distance class. The following example shows the orthogonal distance computation in detail:

Figure 75.1 Orthogonal distance computation technique


Assume we are searching for the nearest neighbor, descending the kd-tree, with \( R_{p} \) as the parent rectangle and \( R_{lo} \) and \( R_{hi}\) as its children in the current step. Further assume \( R_{lo} \) is closer to the query point \(q\). Let \(cd\) denote the cutting dimension and let \(cv\) denote the cutting value. At this point we already know the distance \(rd_{p}\) to the parent rectangle and need to check if \(R_{hi}\) could contain nearest neighbors. Because \(R_{lo}\) is the closer rectangle, its distance to \(q\), \(rd_{lo}\), is the same as \(rd_{p}\). Notice that for each dimension \(i \neq cd \), \( \mathrm{dists}_{lo}[i] = \mathrm{dists}_{hi}[i]\), since these coordinates are not affected by the current cut. So the new distance along the cutting dimension is \( \mathrm{dists}_{hi}[cd] = cv - q[cd]\). Now we can compute \(rd_{hi}\) in constant time (independent of dimension) with \(rd_{hi} = rd_{p} - \mathrm{dists}_{lo}[cd]^2 + (cv - q[cd])^2\).
This strategy can be used if and only if the distance changes only in one dimension at a time, which is the case for point queries.
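As a standalone sketch (the function name and argument layout are illustrative and not part of the package), the constant-time update corresponds to the new_distance function of a distance class, shown later in the user defined distance example:

#include <vector>

// rd_p  : squared distance from the query point q to the parent rectangle
// dists : per-dimension offsets of q to the parent rectangle (0 if q lies inside
//         the rectangle's extent in that dimension)
// cd, cv: cutting dimension and cutting value of the current node
// returns the squared distance rd_hi from q to the further child rectangle
double updated_rectangle_distance(double rd_p,
                                  const std::vector<double>& dists,
                                  const std::vector<double>& q,
                                  int cd, double cv)
{
  const double new_off = cv - q[cd]; // offset along the cutting dimension
  const double old_off = dists[cd];  // previous offset along that dimension
  // only the cutting dimension changes, so the squared distance is updated in O(1)
  return rd_p - old_off * old_off + new_off * new_off;
}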

The following two classes, CGAL::Orthogonal_k_neighbor_search and CGAL::Orthogonal_incremental_neighbor_search, implement the standard search strategy for orthogonal distances like the weighted Minkowski distance. The second one is a specialization for incremental neighbor searching and distance browsing. Both require extended nodes.

The other two classes, CGAL::K_neighbor_search and CGAL::Incremental_neighbor_search, implement the standard search strategy for general distances like the Manhattan distance for iso-rectangle queries. Again, the second one is a specialization for incremental neighbor searching and distance browsing.
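For orientation, the four class templates can be instantiated for one and the same traits class roughly as follows; the kernel and traits chosen here, as well as the query data, are merely illustrative.

#include <CGAL/Simple_cartesian.h>
#include <CGAL/Search_traits_2.h>
#include <CGAL/Euclidean_distance.h>
#include <CGAL/Orthogonal_k_neighbor_search.h>
#include <CGAL/Orthogonal_incremental_neighbor_search.h>
#include <CGAL/K_neighbor_search.h>
#include <CGAL/Incremental_neighbor_search.h>
#include <vector>
#include <iostream>

typedef CGAL::Simple_cartesian<double> K;
typedef CGAL::Search_traits_2<K> Traits;
typedef CGAL::Euclidean_distance<Traits> Distance;

// orthogonal distances (point queries); both need a tree with extended nodes
typedef CGAL::Orthogonal_k_neighbor_search<Traits, Distance> Ortho_k_search;
typedef CGAL::Orthogonal_incremental_neighbor_search<Traits, Distance> Ortho_incremental_search;

// general distances (the query need not be a point, e.g. an iso-box)
typedef CGAL::K_neighbor_search<Traits, Distance> General_k_search;
typedef CGAL::Incremental_neighbor_search<Traits, Distance> General_incremental_search;

int main() {
  std::vector<K::Point_2> pts;
  pts.push_back(K::Point_2(0, 0));
  pts.push_back(K::Point_2(1, 1));
  Ortho_k_search::Tree tree(pts.begin(), pts.end());
  Ortho_k_search search(tree, K::Point_2(0.2, 0.3), 1);
  std::cout << search.begin()->first << std::endl;
  return 0;
}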

Range Searching

Exact range searching and approximate range searching are supported using exact or fuzzy \( d\)-dimensional objects enclosing a region. The fuzziness of the query object is specified by a parameter \( \epsilon\) used to define inner and outer approximations of the query object. Points in the interior of the inner approximation are always reported and points that are not in the closure of the outer approximation are never reported. Other points may or may not be reported. For exact range searching, the fuzziness parameter \( \epsilon\) is set to zero.

The class CGAL::Kd_tree implements range searching through its search() method, a template method taking an output iterator and a model of the concept FuzzyQueryItem, such as CGAL::Fuzzy_iso_box or CGAL::Fuzzy_sphere. For range searching of large data sets, the user may set the bucket size parameter used in building the tree to a large value (e.g. 100), because in general the query time will then be smaller than with the default value.
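The following sketch illustrates both points: an exact range query (fuzziness \( \epsilon = 0\), the default) on a tree built with a larger bucket size. The traits class, the point data, and the bucket size of 100 are illustrative values, not the package's own example.

#include <CGAL/Simple_cartesian.h>
#include <CGAL/Search_traits_2.h>
#include <CGAL/Kd_tree.h>
#include <CGAL/Fuzzy_sphere.h>
#include <vector>
#include <iterator>
#include <iostream>

typedef CGAL::Simple_cartesian<double> K;
typedef K::Point_2 Point_2;
typedef CGAL::Search_traits_2<K> Traits;
typedef CGAL::Kd_tree<Traits> Tree;
typedef CGAL::Fuzzy_sphere<Traits> Fuzzy_sphere;

int main() {
  std::vector<Point_2> points;
  for (int i = 0; i < 1000; ++i)
    points.push_back(Point_2(i * 0.001, i * 0.002));
  // the default splitter accepts a bucket size; 100 is the value suggested above
  Tree tree(points.begin(), points.end(), Tree::Splitter(100));
  // exact range query: the fuzziness parameter defaults to 0
  Fuzzy_sphere query(Point_2(0.5, 1.0), 0.25);
  tree.search(std::ostream_iterator<Point_2>(std::cout, "\n"), query);
  return 0;
}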

Splitting Rules

Instead of using the default splitting rule Sliding_midpoint described below, a user may, depending upon the data, select one of the following splitting rules, which determine how a separating hyperplane is computed. Every splitter has degenerate worst cases, which may lead to a linear tree and a stack overflow. Switching to a splitting rule of a different kind will solve the problem.

CGAL::Midpoint_of_rectangle: This splitting rule cuts a rectangle through its midpoint orthogonal to the longest side.

CGAL::Midpoint_of_max_spread: This splitting rule cuts a rectangle through \( (\mathrm{Mind}+\mathrm{Maxd})/2\) orthogonal to the dimension with the maximum point spread \( [\mathrm{Mind},\mathrm{Maxd}]\).

CGAL::Sliding_midpoint: This is a modification of the midpoint of rectangle splitting rule. It first attempts to perform a midpoint of rectangle split as described above. If data points lie on both sides of the separating plane, the sliding midpoint rule computes the same separator as the midpoint of rectangle rule. If the data points lie only on one side, it avoids producing an empty subtree by sliding the separator, computed by the midpoint of rectangle rule, to the nearest data point.

As all the midpoint rules cut the bounding box in the middle of the longest side, the tree will become linear for a data set with exponentially increasing distances in one dimension.

Figure 75.2 Midpoint worst case point set in 2d.


CGAL::Median_of_rectangle: The splitting dimension is the dimension of the longest side of the rectangle. The splitting value is defined by the median of the coordinates of the data points along this dimension.

CGAL::Median_of_max_spread: The splitting dimension is the dimension of the largest point spread. The splitting value is defined by the median of the coordinates of the data points along this dimension.

The tree can become linear for the median rules, if many points are collinear in a dimension which is not the cutting dimension.

Figure 75.3 Median worst case point set in 2d.
a is longer than b, so this will be the cutting dimension.


CGAL::Fair: This splitting rule is a compromise between the median of rectangle splitting rule and the midpoint of rectangle splitting rule. This splitting rule maintains an upper bound on the maximal allowed ratio of the longest and shortest side of a rectangle (the value of this upper bound is set in the constructor of the fair splitting rule). Among the splits that satisfy this bound, it selects the one in which the points have the largest spread. It then splits the points in the most even manner possible, subject to maintaining the bound on the ratio of the resulting rectangles.

CGAL::Sliding_fair: This splitting rule is a compromise between the fair splitting rule and the sliding midpoint rule. Sliding fair-split is based on the theory that there are two types of splits that are good: balanced splits that produce fat rectangles, and unbalanced splits provided the rectangle with fewer points is fat.

Also, this splitting rule maintains an upper bound on the maximal allowed ratio of the longest and shortest side of a rectangle (the value of this upper bound is set in the constructor of the sliding fair splitting rule). Among the splits that satisfy this bound, it selects the one in which the points have the largest spread. It then considers the most extreme cuts that would be allowed by the aspect ratio bound. This is done by dividing the longest side of the rectangle by the aspect ratio bound. If the median cut lies between these extreme cuts, then we use the median cut. If not, then we consider the extreme cut that is closer to the median. If all the points lie to one side of this cut, then we slide the cut until it hits the first point. This may violate the aspect ratio bound, but will never generate empty cells.
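A non-default splitting rule is selected by passing it as the splitter template argument of the search class and handing an instance of it to the tree. The following sketch uses CGAL::Median_of_max_spread; the traits class, the point data, and the bucket size are illustrative choices, not the package's own example.

#include <CGAL/Simple_cartesian.h>
#include <CGAL/Search_traits_2.h>
#include <CGAL/Splitters.h>
#include <CGAL/Euclidean_distance.h>
#include <CGAL/Orthogonal_k_neighbor_search.h>
#include <vector>
#include <iostream>

typedef CGAL::Simple_cartesian<double> K;
typedef K::Point_2 Point_2;
typedef CGAL::Search_traits_2<K> Traits;
typedef CGAL::Median_of_max_spread<Traits> Splitter;    // non-default rule
typedef CGAL::Euclidean_distance<Traits> Distance;
typedef CGAL::Orthogonal_k_neighbor_search<Traits, Distance, Splitter> Neighbor_search;
typedef Neighbor_search::Tree Tree;

int main() {
  std::vector<Point_2> points;
  for (int i = 0; i < 100; ++i)
    points.push_back(Point_2(i, i % 7));
  Splitter splitter(20);                                 // bucket size 20
  Tree tree(points.begin(), points.end(), splitter);
  Neighbor_search search(tree, Point_2(3, 4), 3);
  for (Neighbor_search::iterator it = search.begin(); it != search.end(); ++it)
    std::cout << it->first << " squared distance " << it->second << std::endl;
  return 0;
}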

Example Programs

We give seven examples. The first example illustrates k nearest neighbor searching, and the second example incremental neighbor searching. The third is an example of approximate furthest neighbor searching using a \( d\)-dimensional iso-rectangle as a query object. Approximate range searching is illustrated by the fourth example. The fifth example illustrates k neighbor searching for a user defined point class. The sixth example shows how to choose another splitting rule in the tree that is used as search tree. The last example shows two worst-case scenarios for different splitter types.

Example for K Neighbor Searching

The first example illustrates k neighbor searching with the Euclidean distance and 2-dimensional points. The generated random data points are inserted in a search tree. We then initialize the k neighbor search object with the origin as query. Finally, we obtain the result of the computation in the form of an iterator range. The value of the iterator is a pair of a point and its square distance to the query point. We use square distances, or transformed distances for other distance classes, as they are computationally cheaper.


File Spatial_searching/nearest_neighbor_searching.cpp

#include <CGAL/Simple_cartesian.h>

#include <CGAL/point_generators_2.h>

#include <CGAL/Orthogonal_k_neighbor_search.h>

#include <CGAL/Search_traits_2.h>

#include <list>

#include <cmath>

typedef CGAL::Simple_cartesian<double> K;

typedef K::Point_2 Point_d;

typedef CGAL::Search_traits_2<K> TreeTraits;

typedef CGAL::Orthogonal_k_neighbor_search<TreeTraits> Neighbor_search;

typedef Neighbor_search::Tree Tree;

int main() {

const unsigned int N = 100;

std::list<Point_d> points;

// generate N random data points in the unit square
CGAL::Random_points_in_square_2<Point_d> g(1.0);

for(unsigned int i = 0; i < N; ++i) points.push_back(*g++);

Tree tree(points.begin(), points.end());

Point_d query(0,0);

Neighbor_search search(tree, query, N);

for(Neighbor_search::iterator it = search.begin(); it != search.end(); ++it){

std::cout << it->first << " "<< std::sqrt(it->second) << std::endl;

}

return 0;

}

Example for Incremental Searching

This example program illustrates incremental searching for the closest point with a positive first coordinate. We can use the orthogonal incremental neighbor search class, as the query is also a point and as the distance is the Euclidean distance.

As for the neighbor search, we first initialize the search tree with the data. We then create the search object, and finally obtain the iterator with the begin() method. Note that the iterator is of the input iterator category, that is, one can make only one pass over the data.


File Spatial_searching/distance_browsing.cpp

#include <CGAL/Simple_cartesian.h>

#include <CGAL/Orthogonal_incremental_neighbor_search.h>

#include <CGAL/Search_traits_2.h>

typedef CGAL::Simple_cartesian<double> K;

typedef K::Point_2 Point_d;

typedef CGAL::Search_traits_2<K> TreeTraits;

typedef CGAL::Orthogonal_incremental_neighbor_search<TreeTraits> NN_incremental_search;

typedef NN_incremental_search::iterator NN_iterator;

typedef NN_incremental_search::Tree Tree;

struct X_not_positive {

bool operator()(const NN_iterator& it) { return ((*it).first)[0]<0; }

};

typedef CGAL::Filter_iterator<NN_iterator, X_not_positive> NN_positive_x_iterator;

int main() {

Tree tree;

tree.insert(Point_d(0,0));

tree.insert(Point_d(1,1));

tree.insert(Point_d(0,1));

tree.insert(Point_d(10,110));

tree.insert(Point_d(45,0));

tree.insert(Point_d(0,2340));

tree.insert(Point_d(0,30));

Point_d query(0,0);

NN_incremental_search NN(tree, query);

NN_positive_x_iterator it(NN.end(), X_not_positive(), NN.begin()), end(NN.end(), X_not_positive());

std::cout << "The first 5 nearest neighbours with positive x-coord are: " << std::endl;

for (int j=0; (j < 5)&&(it!=end); ++j,++it)

std::cout << (*it).first << " at squared distance = " << (*it).second << std::endl;

return 0;

}

Example for General Neighbor Searching

This example program illustrates approximate nearest and furthest neighbor searching using 4-dimensional Cartesian coordinates. Five approximate furthest neighbors of the query rectangle \( [0.1,0.2]^4\) are computed. Because the query object is a rectangle we cannot use the orthogonal neighbor search. As in the previous examples we first initialize a search tree, create the search object with the query, and obtain the result of the search as an iterator range.


File Spatial_searching/general_neighbor_searching.cpp

#include <CGAL/Epick_d.h>

#include <CGAL/point_generators_d.h>

#include <CGAL/Manhattan_distance_iso_box_point.h>

#include <CGAL/K_neighbor_search.h>

#include <CGAL/Search_traits_d.h>

typedef CGAL::Epick_d<CGAL::Dimension_tag<4> > Kernel;

typedef Kernel::Point_d Point_d;

typedef CGAL::Random_points_in_cube_d<Point_d> Random_points_iterator;

typedef Kernel::Iso_box_d Iso_box_d;

typedef Kernel TreeTraits;

typedef CGAL::Manhattan_distance_iso_box_point<TreeTraits> Distance;

typedef CGAL::K_neighbor_search<TreeTraits, Distance> Neighbor_search;

typedef Neighbor_search::Tree Tree;

int main() {

const int N = 1000;

const unsigned int K = 5;

Tree tree;

Random_points_iterator rpit(4,1000.0);

for(int i = 0; i < N; i++){

tree.insert(*rpit++);

}

Point_d pp(0.1,0.1,0.1,0.1);

Point_d qq(0.2,0.2,0.2,0.2);

Iso_box_d query(pp,qq);

Distance tr_dist;

Neighbor_search N1(tree, query, K, 10.0, false);

std::cout << "For query rectangle = [0.1, 0.2]^4 " << std::endl

<< "the " << K << " approximate furthest neighbors are: " << std::endl;

for (Neighbor_search::iterator it = N1.begin();it != N1.end();it++) {

std::cout << " Point " << it->first << " at distance " << tr_dist.inverse_of_transformed_distance(it->second) << std::endl;

}

return 0;

}

Example for a Range Query

This example program illustrates approximate range querying for 4-dimensional fuzzy iso-rectangles and spheres using the higher dimensional kernel CGAL::Epick_d. The range queries are member functions of the tree class.


File Spatial_searching/fuzzy_range_query.cpp

#include <CGAL/Epick_d.h>

#include <CGAL/point_generators_d.h>

#include <CGAL/Kd_tree.h>

#include <CGAL/Fuzzy_sphere.h>

#include <CGAL/Fuzzy_iso_box.h>

#include <CGAL/Search_traits_d.h>

const int D = 4;

typedef CGAL::Epick_d<CGAL::Dimension_tag<D> > K;

typedef K::Point_d Point_d;

typedef CGAL::Search_traits_d<K,CGAL::Dimension_tag<D> > Traits;

typedef CGAL::Random_points_in_cube_d<Point_d> Random_points_iterator;

typedef CGAL::Counting_iterator<Random_points_iterator> N_Random_points_iterator;

typedef CGAL::Kd_tree<Traits> Tree;

typedef CGAL::Fuzzy_sphere<Traits> Fuzzy_sphere;

typedef CGAL::Fuzzy_iso_box<Traits> Fuzzy_iso_box;

int main() {

const int N = 1000;

Random_points_iterator rpit(4, 1000.0);

Tree tree(N_Random_points_iterator(rpit,0),

N_Random_points_iterator(rpit,N));

double pcoord[D] = { 300, 300, 300, 300 };

double qcoord[D] = { 900.0, 900.0, 900.0, 900.0 };

Point_d p(D, pcoord+0, pcoord+D);

Point_d q(D, qcoord+0, qcoord+D);

Fuzzy_sphere fs(p, 700.0, 100.0);

Fuzzy_iso_box fib(p, q, 100.0);

std::cout << "points approximately in fuzzy spherical range query" << std::endl;

std::cout << "with center (300, 300, 300, 300)" << std::endl;

std::cout << "and fuzzy radius [600, 800] are:" << std::endl;

tree.search(std::ostream_iterator<Point_d>(std::cout, "\n"), fs);

std::cout << "points approximately in fuzzy rectangular range query ";

std::cout << "[[200, 400], [800,1000]]^4 are:" << std::endl;

tree.search(std::ostream_iterator<Point_d>(std::cout, "\n"), fib);

return 0;

}

Example for User Defined Point and Distance Class

The neighbor searching works with all CGAL kernels, as well as with user defined point and distance classes. In this example we assume that the user provides the following 3-dimensional point class.


File Spatial_searching/Point.h

struct Point {

double vec[3];

Point() { vec[0]= vec[1] = vec[2] = 0; }

Point (double x, double y, double z) { vec[0]=x; vec[1]=y; vec[2]=z; }

double x() const { return vec[ 0 ]; }

double y() const { return vec[ 1 ]; }

double z() const { return vec[ 2 ]; }

double& x() { return vec[ 0 ]; }

double& y() { return vec[ 1 ]; }

double& z() { return vec[ 2 ]; }

bool operator==(const Point& p) const

{

return (x() == p.x()) && (y() == p.y()) && (z() == p.z()) ;

}

bool operator!=(const Point& p) const { return ! (*this == p); }

};

struct Construct_coord_iterator {

typedef const double* result_type;

const double* operator()(const Point& p) const

{ return static_cast<const double*>(p.vec); }

const double* operator()(const Point& p, int) const

{ return static_cast<const double*>(p.vec+3); }

};

We have put the glue layer in this file as well, that is, a class that allows iterating over the Cartesian coordinates of the point, and a class to construct such an iterator for a point. We next need a distance class.
File Spatial_searching/Distance.h

struct Distance {

typedef Point Query_item;

typedef double FT;

typedef CGAL::Dimension_tag<3> D;

double transformed_distance(const Point& p1, const Point& p2) const {

double distx= p1.x()-p2.x();

double disty= p1.y()-p2.y();

double distz= p1.z()-p2.z();

return distx*distx+disty*disty+distz*distz;

}

double min_distance_to_rectangle(const Point& p,

const CGAL::Kd_tree_rectangle<FT,D>& b) const {

double distance(0.0), h = p.x();

if (h < b.min_coord(0)) distance += (b.min_coord(0)-h)*(b.min_coord(0)-h);

if (h > b.max_coord(0)) distance += (h-b.max_coord(0))*(h-b.max_coord(0));

h=p.y();

if (h < b.min_coord(1)) distance += (b.min_coord(1)-h)*(b.min_coord(1)-h);

if (h > b.max_coord(1)) distance += (h-b.max_coord(1))*(h-b.max_coord(1));

h=p.z();

if (h < b.min_coord(2)) distance += (b.min_coord(2)-h)*(b.min_coord(2)-h);

if (h > b.max_coord(2)) distance += (h-b.max_coord(2))*(h-b.max_coord(2));

return distance;

}

double min_distance_to_rectangle(const Point& p,

const CGAL::Kd_tree_rectangle<FT,D>& b, std::vector<double>& dists){

double distance(0.0), h = p.x();

if (h < b.min_coord(0)){

dists[0] = (b.min_coord(0)-h);

distance += dists[0]*dists[0];

}

if (h > b.max_coord(0)){

dists[0] = (h-b.max_coord(0));

distance += dists[0]*dists[0];

}

h=p.y();

if (h < b.min_coord(1)){

dists[1] = (b.min_coord(1)-h);

distance += dists[1]*dists[1];

}

if (h > b.max_coord(1)){

dists[1] = (h-b.max_coord(1));

distance += dists[1]*dists[1];

}

h=p.z();

if (h < b.min_coord(2)){

dists[2] = (b.min_coord(2)-h);

distance += dists[2]*dists[2];

}

if (h > b.max_coord(2)){

dists[2] = (h-b.max_coord(2));

distance += dists[2]*dists[2];

}

return distance;

}

double max_distance_to_rectangle(const Point& p,

const CGAL::Kd_tree_rectangle<FT,D>& b) const {

double h = p.x();

double d0 = (h >= (b.min_coord(0)+b.max_coord(0))/2.0) ?

(h-b.min_coord(0))*(h-b.min_coord(0)) : (b.max_coord(0)-h)*(b.max_coord(0)-h);

h=p.y();

double d1 = (h >= (b.min_coord(1)+b.max_coord(1))/2.0) ?

(h-b.min_coord(1))*(h-b.min_coord(1)) : (b.max_coord(1)-h)*(b.max_coord(1)-h);

h=p.z();

double d2 = (h >= (b.min_coord(2)+b.max_coord(2))/2.0) ?

(h-b.min_coord(2))*(h-b.min_coord(2)) : (b.max_coord(2)-h)*(b.max_coord(2)-h);

return d0 + d1 + d2;

}

double max_distance_to_rectangle(const Point& p,

const CGAL::Kd_tree_rectangle<FT,D>& b, std::vector<double>& dists){

double h = p.x();

dists[0] = (h >= (b.min_coord(0)+b.max_coord(0))/2.0) ?

(h-b.min_coord(0)) : (b.max_coord(0)-h);

h=p.y();

dists[1] = (h >= (b.min_coord(1)+b.max_coord(1))/2.0) ?

(h-b.min_coord(1)) : (b.max_coord(1)-h);

h=p.z();

dists[2] = (h >= (b.min_coord(2)+b.max_coord(2))/2.0) ?

(h-b.min_coord(2)) : (b.max_coord(2)-h);

return dists[0] * dists[0] + dists[1] * dists[1] + dists[2] * dists[2];

}

double new_distance(double& dist, double old_off, double new_off,

int) const {

return dist + new_off*new_off - old_off*old_off;

}

double transformed_distance(double d) const { return d*d; }

double inverse_of_transformed_distance(double d) { return std::sqrt(d); }

};

We are ready to put the pieces together. The class CGAL::Search_traits, which you see in the next file, is a mere wrapper for all our defined types. The searching itself works exactly as for CGAL kernels.


File Spatial_searching/user_defined_point_and_distance.cpp

#include <CGAL/Search_traits.h>

#include <CGAL/point_generators_3.h>

#include <CGAL/Orthogonal_k_neighbor_search.h>

#include "Point.h"

#include "Distance.h"

typedef CGAL::Creator_uniform_3<double,Point> Point_creator;

typedef CGAL::Random_points_in_cube_3<Point, Point_creator> Random_points_iterator;

typedef CGAL::Counting_iterator<Random_points_iterator> N_Random_points_iterator;

typedef CGAL::Dimension_tag<3> D;

typedef CGAL::Search_traits<double, Point, const double*, Construct_coord_iterator, D> Traits;

typedef CGAL::Orthogonal_k_neighbor_search<Traits, Distance> K_neighbor_search;

typedef K_neighbor_search::Tree Tree;

int main() {

const int N = 1000;

const unsigned int K = 5;

Random_points_iterator rpit( 1.0);

Tree tree(N_Random_points_iterator(rpit,0),

N_Random_points_iterator(N));

Point query(0.0, 0.0, 0.0);

Distance tr_dist;

K_neighbor_search search(tree, query, K);

for(K_neighbor_search::iterator it = search.begin(); it != search.end(); it++){

std::cout << " d(q, nearest neighbor)= "

<< tr_dist.inverse_of_transformed_distance(it->second) << std::endl;

}

K_neighbor_search search2(tree, query, K, 0.0, false);

for(K_neighbor_search::iterator it = search2.begin(); it != search2.end(); it++){

std::cout << " d(q, furthest neighbor)= "

<< tr_dist.inverse_of_transformed_distance(it->second) << std::endl;

}

return 0;

}

Examples for Using an Arbitrary Point Type with Point Property Maps

The following four example programs illustrate how to use the classes CGAL::Search_traits_adapter and CGAL::Distance_adapter to store objects of an arbitrary key type in the kd-tree. Points are accessed through a point property map. This makes it possible to associate information with a point, or to reduce the size of the search structure.

Using a Point and an Integer as Key Type

In this example program, the search tree stores tuples of point and integer. The value type of the iterator of the neighbor searching algorithm is this tuple type.


File Spatial_searching/searching_with_point_with_info.cpp

#include <CGAL/Exact_predicates_inexact_constructions_kernel.h>

#include <CGAL/Search_traits_3.h>

#include <CGAL/Search_traits_adapter.h>

#include <CGAL/point_generators_3.h>

#include <CGAL/Orthogonal_k_neighbor_search.h>

#include <CGAL/property_map.h>

#include <boost/iterator/zip_iterator.hpp>

#include <utility>

typedef CGAL::Exact_predicates_inexact_constructions_kernel Kernel;

typedef Kernel::Point_3 Point_3;

typedef boost::tuple<Point_3,int> Point_and_int;

typedef CGAL::Random_points_in_cube_3<Point_3> Random_points_iterator;

typedef CGAL::Search_traits_3<Kernel> Traits_base;

typedef CGAL::Search_traits_adapter<Point_and_int,
  CGAL::Nth_of_tuple_property_map<0, Point_and_int>,
  Traits_base> Traits;

typedef CGAL::Orthogonal_k_neighbor_search<Traits> K_neighbor_search;

typedef K_neighbor_search::Tree Tree;

typedef K_neighbor_search::Distance Distance;

int main() {

const unsigned int K = 5;

Random_points_iterator rpit( 1.0);

std::vector<Point_3> points;

std::vector<int> indices;

points.push_back(Point_3(*rpit++));

points.push_back(Point_3(*rpit++));

points.push_back(Point_3(*rpit++));

points.push_back(Point_3(*rpit++));

points.push_back(Point_3(*rpit++));

points.push_back(Point_3(*rpit++));

points.push_back(Point_3(*rpit++));

indices.push_back(0);

indices.push_back(1);

indices.push_back(2);

indices.push_back(3);

indices.push_back(4);

indices.push_back(5);

indices.push_back(6);

Tree tree(

boost::make_zip_iterator(boost::make_tuple( points.begin(),indices.begin() )),

boost::make_zip_iterator(boost::make_tuple( points.end(),indices.end() ) )

);

Point_3 query(0.0, 0.0, 0.0);

Distance tr_dist;

K_neighbor_search search(tree, query, K);

for(K_neighbor_search::iterator it = search.begin(); it != search.end(); it++){

std::cout << " d(q, nearest neighbor)= "

<< tr_dist.inverse_of_transformed_distance(it->second) << " "

<< boost::get<0>(it->first)<< " " << boost::get<1>(it->first) << std::endl;

}

return 0;

}

Using an Integer as Key Type

In this example program, the search tree stores only integers that refer to points stored within a user vector. The point type of the search traits is std::size_t.


File Spatial_searching/searching_with_point_with_info_inplace.cpp

#include <CGAL/Exact_predicates_inexact_constructions_kernel.h>

#include <CGAL/Search_traits_3.h>

#include <CGAL/Search_traits_adapter.h>

#include <CGAL/point_generators_3.h>

#include <CGAL/Orthogonal_k_neighbor_search.h>

#include <CGAL/boost/iterator/counting_iterator.hpp>

#include <utility>

typedef CGAL::Exact_predicates_inexact_constructions_kernel Kernel;

typedef Kernel::Point_3 Point_3;

class My_point_property_map{

const std::vector<Point_3>& points;

public:

typedef Point_3 value_type;

typedef const value_type& reference;

typedef std::size_t key_type;

typedef boost::lvalue_property_map_tag category;

My_point_property_map(const std::vector<Point_3>& pts):points(pts){}

reference operator[](key_type k) const {return points[k];}

friend reference get(const My_point_property_map& ppmap,key_type i)

{return ppmap[i];}

};

typedef CGAL::Random_points_in_cube_3<Point_3> Random_points_iterator;

typedef CGAL::Search_traits_3<Kernel> Traits_base;

typedef CGAL::Search_traits_adapter<std::size_t,My_point_property_map,Traits_base> Traits;

typedef CGAL::Orthogonal_k_neighbor_search<Traits> K_neighbor_search;

typedef K_neighbor_search::Tree Tree;

typedef Tree::Splitter Splitter;

typedef K_neighbor_search::Distance Distance;

int main() {

const unsigned int K = 5;

Random_points_iterator rpit( 1.0);

std::vector<Point_3> points;

points.push_back(Point_3(*rpit++));

points.push_back(Point_3(*rpit++));

points.push_back(Point_3(*rpit++));

points.push_back(Point_3(*rpit++));

points.push_back(Point_3(*rpit++));

points.push_back(Point_3(*rpit++));

points.push_back(Point_3(*rpit++));

My_point_property_map ppmap(points);

Tree tree(

boost::counting_iterator<std::size_t>(0),

boost::counting_iterator<std::size_t>(points.size()),

Splitter(),

Traits(ppmap)

);

Point_3 query(0.0, 0.0, 0.0);

Distance tr_dist(ppmap);

K_neighbor_search search(tree, query, K,0,true,tr_dist);

for(K_neighbor_search::iterator it = search.begin(); it != search.end(); it++){

std::cout << " d(q, nearest neighbor)= "

<< tr_dist.inverse_of_transformed_distance(it->second) << " "

<< points[it->first] << " " << it->first << std::endl;

}

return 0;

}

Using a Model of L-value Property Map Concept

This example program uses a model of the concept LvaluePropertyMap. Points are stored in a std::map. The search tree stores keys of type std::size_t. The value type of the iterator of the neighbor searching algorithm is std::size_t.


File Spatial_searching/searching_with_point_with_info_pmap.cpp

#include <CGAL/Exact_predicates_inexact_constructions_kernel.h>

#include <CGAL/Search_traits_3.h>

#include <CGAL/Search_traits_adapter.h>

#include <CGAL/point_generators_3.h>

#include <CGAL/Orthogonal_k_neighbor_search.h>

#include <CGAL/boost/iterator/counting_iterator.hpp>

#include <utility>

typedef CGAL::Exact_predicates_inexact_constructions_kernel Kernel;

typedef Kernel::Point_3 Point_3;

typedef boost::const_associative_property_map<std::map<std::size_t,Point_3> > My_point_property_map;

typedef CGAL::Random_points_in_cube_3<Point_3> Random_points_iterator;

typedef CGAL::Search_traits_3<Kernel> Traits_base;

typedef CGAL::Search_traits_adapter<std::size_t,My_point_property_map,Traits_base> Traits;

typedef CGAL::Orthogonal_k_neighbor_search<Traits> K_neighbor_search;

typedef K_neighbor_search::Tree Tree;

typedef Tree::Splitter Splitter;

typedef K_neighbor_search::Distance Distance;

int main() {

const unsigned int K = 5;

Random_points_iterator rpit( 1.0);

std::map<std::size_t,Point_3> points;

points[0]=Point_3(*rpit++);

points[1]=Point_3(*rpit++);

points[2]=Point_3(*rpit++);

points[3]=Point_3(*rpit++);

points[4]=Point_3(*rpit++);

points[5]=Point_3(*rpit++);

points[6]=Point_3(*rpit++);

My_point_property_map ppmap(points);

Tree tree(

boost::counting_iterator<std::size_t>(0),

boost::counting_iterator<std::size_t>(points.size()),

Splitter(),

Traits(ppmap)

);

Point_3 query(0.0, 0.0, 0.0);

Distance tr_dist(ppmap);

K_neighbor_search search(tree, query, K,0,true,tr_dist);

for(K_neighbor_search::iterator it = search.begin(); it != search.end(); it++){

std::cout << " d(q, nearest neighbor)= "

<< tr_dist.inverse_of_transformed_distance(it->second) << " "

<< points[it->first] << " " << it->first << std::endl;

}

return 0;

}

Using a Point Property Map of a Polygonal Mesh

This example program shows how to search for the closest vertices of a CGAL::Surface_mesh or, quite similarly, of a CGAL::Polyhedron_3. Points are stored in the polygonal mesh. The search tree stores vertex descriptors. The value type of the iterator of the neighbor searching algorithm is the vertex descriptor type.


File Spatial_searching/searching_surface_mesh_vertices.cpp
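The contents of that file are not reproduced here. The following is only a minimal sketch along the same lines, assuming a small CGAL::Surface_mesh built by hand and illustrative query values rather than the original example's input.

#include <CGAL/Exact_predicates_inexact_constructions_kernel.h>
#include <CGAL/Surface_mesh.h>
#include <CGAL/Search_traits_3.h>
#include <CGAL/Search_traits_adapter.h>
#include <CGAL/Orthogonal_k_neighbor_search.h>
#include <iostream>

typedef CGAL::Exact_predicates_inexact_constructions_kernel Kernel;
typedef Kernel::Point_3 Point_3;
typedef CGAL::Surface_mesh<Point_3> Mesh;
typedef boost::graph_traits<Mesh>::vertex_descriptor vertex_descriptor;
typedef boost::property_map<Mesh, CGAL::vertex_point_t>::type Vertex_point_pmap;

typedef CGAL::Search_traits_3<Kernel> Traits_base;
typedef CGAL::Search_traits_adapter<vertex_descriptor, Vertex_point_pmap, Traits_base> Traits;
typedef CGAL::Orthogonal_k_neighbor_search<Traits> K_neighbor_search;
typedef K_neighbor_search::Tree Tree;
typedef Tree::Splitter Splitter;
typedef K_neighbor_search::Distance Distance;

int main() {
  Mesh mesh;
  // a single triangle is enough to illustrate the mechanism
  vertex_descriptor v0 = mesh.add_vertex(Point_3(0, 0, 0));
  vertex_descriptor v1 = mesh.add_vertex(Point_3(1, 0, 0));
  vertex_descriptor v2 = mesh.add_vertex(Point_3(0, 1, 0));
  mesh.add_face(v0, v1, v2);

  Vertex_point_pmap vppmap = get(CGAL::vertex_point, mesh);

  // the tree stores vertex descriptors; points are accessed through the property map
  Tree tree(vertices(mesh).begin(), vertices(mesh).end(), Splitter(), Traits(vppmap));
  Distance tr_dist(vppmap);

  Point_3 query(0.1, 0.1, 0.1);
  K_neighbor_search search(tree, query, 2, 0, true, tr_dist);
  for (K_neighbor_search::iterator it = search.begin(); it != search.end(); ++it)
    std::cout << "vertex " << it->first << " at squared distance " << it->second << std::endl;
  return 0;
}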
