
Getting an AGSPoint from an OpenGL view

08-14-2012 09:49 AM
HamzaHaroon
Regular Contributor
I need to get the AGS point on the map (a point with longitude and latitude) from an OpenGL view. Right now I am getting the user's touch on the screen as x and y screen coordinates. How can this be done if my GL view's controlling class is separate from the map view controller?

This is how I get the touch input:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect bounds = [self bounds];
    UITouch *touch = [[event touchesForView:self] anyObject];
    firstTouch = YES;
    // Convert touch point from UIView referential to OpenGL one (upside-down flip)
    location = [touch locationInView:self];
    location.y = bounds.size.height - location.y;
}

// Handles the continuation of a touch.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect bounds = [self bounds];
    UITouch *touch = [[event touchesForView:self] anyObject];

    // Convert touch point from UIView referential to OpenGL one (upside-down flip)
    if (firstTouch) {
        firstTouch = NO;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
    } else {
        location = [touch locationInView:self];
        location.y = bounds.size.height - location.y;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
    }

    // Render the stroke
    [self renderLineFromPoint:previousLocation toPoint:location];
}

// Handles the end of a touch event when the touch is a tap.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGRect bounds = [self bounds];
    UITouch *touch = [[event touchesForView:self] anyObject];
    if (firstTouch) {
        firstTouch = NO;
        previousLocation = [touch previousLocationInView:self];
        previousLocation.y = bounds.size.height - previousLocation.y;
        [self renderLineFromPoint:previousLocation toPoint:location];
    }
}


Here is the code I have for drawing based on the user's touch:
// Draws a line onscreen based on where the user touches
- (void)renderLineFromPoint:(CGPoint)start toPoint:(CGPoint)end
{
    static GLfloat *vertexBuffer = NULL;
    static NSUInteger vertexMax = 64;
    NSUInteger vertexCount = 0, count, i;

    [EAGLContext setCurrentContext:context];
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);

    // Convert locations from Points to Pixels
    CGFloat scale = self.contentScaleFactor;
    start.x *= scale;
    start.y *= scale;
    end.x *= scale;
    end.y *= scale;

    // Allocate vertex array buffer
    if (vertexBuffer == NULL)
        vertexBuffer = malloc(vertexMax * 2 * sizeof(GLfloat));

    // Add points to the buffer so there are drawing points every X pixels
    count = MAX(ceilf(sqrtf((end.x - start.x) * (end.x - start.x) + (end.y - start.y) * (end.y - start.y)) / kBrushPixelStep), 1);
    for (i = 0; i < count; ++i) {
        if (vertexCount == vertexMax) {
            vertexMax = 2 * vertexMax;
            vertexBuffer = realloc(vertexBuffer, vertexMax * 2 * sizeof(GLfloat));
        }
        vertexBuffer[2 * vertexCount + 0] = start.x + (end.x - start.x) * ((GLfloat)i / (GLfloat)count);
        vertexBuffer[2 * vertexCount + 1] = start.y + (end.y - start.y) * ((GLfloat)i / (GLfloat)count);
        vertexCount += 1;
    }

    // Render the vertex array
    glVertexPointer(2, GL_FLOAT, 0, vertexBuffer);
    glDrawArrays(GL_POINTS, 0, vertexCount);

    // Display the buffer
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}
1 Solution

Accepted Solutions
HamzaHaroon
Regular Contributor
Solved this by giving my painting class a delegate method and adopting the delegate protocol in my main view controller.

The delegate method gets the map view from the main view controller.
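For reference, a minimal sketch of that delegate wiring could look like the following; the protocol, property, and method names here are illustrative assumptions rather than code from the actual project.

// PaintingView.h (sketch)
@protocol PaintingViewDelegate <NSObject>
// The delegate (the main view controller) hands back its live AGSMapView
- (AGSMapView *)mapViewForPaintingView:(UIView *)paintingView;
@end

@interface PaintingView : UIView
@property (nonatomic, assign) id<PaintingViewDelegate> delegate; // assign to avoid a retain cycle (MRC-era code)
@end

// PaintingView.m (sketch): ask the delegate for the map view when converting a touch
- (AGSPoint *)mapPointFromTouches:(NSSet *)touches {
    UITouch *touch = [[touches allObjects] objectAtIndex:0];
    CGPoint screenPoint = [touch locationInView:self];
    AGSMapView *mapView = [self.delegate mapViewForPaintingView:self];
    return [mapView toMapPoint:screenPoint];
}

// MainViewController.m (sketch): the controller adopts PaintingViewDelegate
- (AGSMapView *)mapViewForPaintingView:(UIView *)paintingView {
    return self.mapView; // the already-initialized map view from the nib
}

With this in place the painting view never has to hold its own reference to the map view; it simply asks its delegate whenever it needs to call toMapPoint:.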


19 Replies
NimeshJarecha
Esri Regular Contributor
You can create an AGSMutablePolyline or AGSMutablePolygon (based on the geometry you want to draw) and keep adding an AGSPoint converted from each touch (UITouch). Here is the conversion code:

- (AGSPoint*)getMapPointFromTouches:(NSSet *)touches {
    UITouch *mytouch=[[touches allObjects] objectAtIndex:0];
    CGPoint touchPoint = [mytouch locationInView:self];
    AGSPoint *mapPoint = [self.mapView toMapPoint:touchPoint];
    return mapPoint;
}
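Building on that, below is a rough sketch of how the converted points might be accumulated into a polyline and pushed to a graphics layer. The sketchLine property is hypothetical, retain/release housekeeping is omitted, and the class and method names are from the 2.x-era ArcGIS Runtime SDK for iOS as best I recall, so treat them as assumptions.

// Sketch only: collect converted touch points into a polyline and display it.
// Assumes self.mapView and self.graphicsLayer are already initialized.
- (void)startSketch {
    // Create a polyline in the map's spatial reference and open one path for the vertices
    self.sketchLine = [[AGSMutablePolyline alloc] initWithSpatialReference:self.mapView.spatialReference];
    [self.sketchLine addPathToPolyline];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    AGSPoint *mapPoint = [self getMapPointFromTouches:touches]; // conversion shown above
    [self.sketchLine addPointToPath:mapPoint];                  // append the vertex to the current path
}

- (void)finishSketch {
    // Wrap the geometry in a graphic and add it to the graphics layer
    AGSSimpleLineSymbol *symbol = [[AGSSimpleLineSymbol alloc] init];
    symbol.color = [UIColor redColor];
    symbol.width = 2.0;
    AGSGraphic *graphic = [[AGSGraphic alloc] initWithGeometry:self.sketchLine
                                                        symbol:symbol
                                                    attributes:nil
                                          infoTemplateDelegate:nil];
    [self.graphicsLayer addGraphic:graphic];
    [self.graphicsLayer dataChanged]; // ask the layer to redraw
}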

Regards,
Nimesh
HamzaHaroon
Regular Contributor
I added that method, but I'm not getting any coordinates. I think it's because I didn't set the mapView properly. Here is my code for this class and its header file. I'm new to ArcGIS and iOS, so I don't know how to properly set the mapView spatial reference to get the points based on touch.

#import <UIKit/UIKit.h>
#import <OpenGLES/EAGL.h>
#import <OpenGLES/ES1/gl.h>
#import <OpenGLES/ES1/glext.h>
#import <ArcGIS/ArcGIS.h>

//CONSTANTS:

#define kBrushOpacity  (1.0 / 3.0)
#define kBrushPixelStep  1
#define kBrushScale   1
#define kLuminosity   0.75
#define kSaturation   1.0

//CLASS INTERFACES:

@interface PaintingView : UIView
{
    AGSMapView *_mapView;
    AGSGraphicsLayer *_graphicsLayer;

@private
    // The pixel dimensions of the backbuffer
    GLint backingWidth;
    GLint backingHeight;

    EAGLContext *context;

    // OpenGL names for the renderbuffer and framebuffers used to render to this view
    GLuint viewRenderbuffer, viewFramebuffer;

    // OpenGL name for the depth buffer that is attached to viewFramebuffer, if it exists (0 if it does not exist)
    GLuint depthRenderbuffer;

    GLuint brushTexture;
    CGPoint location;
    CGPoint previousLocation;

    AGSPoint *currentPoint;
    AGSPoint *previousPoint;

    Boolean firstTouch;
    Boolean needsErase;
    AGSMutableMultipoint *lines;
}

@property (nonatomic, readwrite) CGPoint location;
@property (nonatomic, readwrite) CGPoint previousLocation;
@property (nonatomic, readwrite) AGSMutableMultipoint *lines;
//@property (nonatomic, retain) AGSMapView *mapView;

- (void)erase;
- (void)setBrushColorWithRed:(CGFloat)red green:(CGFloat)green blue:(CGFloat)blue;
- (void)update;

@end
HamzaHaroon
Regular Contributor
//  PaintingView.m

#import <QuartzCore/QuartzCore.h>
#import <OpenGLES/EAGLDrawable.h>
#import "PaintingView.h"

//CLASS IMPLEMENTATIONS:

// A class extension to declare private methods
@interface PaintingView (private)

- (BOOL)createFramebuffer;
- (void)destroyFramebuffer;

@end

@implementation PaintingView

@synthesize  location;
@synthesize  previousLocation;
@synthesize  lines;

+ (Class) layerClass
{
 return [CAEAGLLayer class];
}

// The GL view is stored in the nib file. When it's unarchived it's sent -initWithCoder:
- (id)initWithCoder:(NSCoder*)coder mapView:(AGSMapView*) mapView graphicsLayer:(AGSGraphicsLayer*)graphicsLayer {
 
    _mapView = mapView;
    _graphicsLayer = graphicsLayer;
    // The multipoint needs the map's spatial reference, not the map view itself
    lines = [[AGSMutableMultipoint alloc] initWithSpatialReference:_mapView.spatialReference];
    
 NSMutableArray* recordedPaths;
 CGImageRef  brushImage;
 CGContextRef brushContext;
 GLubyte   *brushData;
 size_t   width, height;
    
    if ((self = [super initWithCoder:coder])) {
  CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
  
  eaglLayer.opaque = YES;
  // In this application, we want to retain the EAGLDrawable contents after a call to presentRenderbuffer.
 
        eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
          [NSNumber numberWithBool:YES], kEAGLDrawablePropertyRetainedBacking, kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil];
  
  context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
  
  if (!context || ![EAGLContext setCurrentContext:context]) {
   [self release];
   return nil;
  }
  
  // Create a texture from an image
  // First create a UIImage object from the data in a image file, and then extract the Core Graphics image
  brushImage = [UIImage imageNamed:@"Particle1.png"].CGImage;
  
  // Get the width and height of the image
  width = CGImageGetWidth(brushImage);
  height = CGImageGetHeight(brushImage);
  
  // Make sure the image exists
  if(brushImage) {
   // Allocate  memory needed for the bitmap context
   brushData = (GLubyte *) calloc(width * height * 4, sizeof(GLubyte));
   // Use the bitmap creation function provided by the Core Graphics framework.
   brushContext = CGBitmapContextCreate(brushData, width, height, 8, width * 4, CGImageGetColorSpace(brushImage), kCGImageAlphaPremultipliedLast);
   // After you create the context, you can draw the  image to the context.
   CGContextDrawImage(brushContext, CGRectMake(0.0, 0.0, (CGFloat)width, (CGFloat)height), brushImage);
   // You don't need the context at this point, so you need to release it to avoid memory leaks.
   CGContextRelease(brushContext);
   // Use OpenGL ES to generate a name for the texture.
   glGenTextures(1, &brushTexture);
   // Bind the texture name. 
   glBindTexture(GL_TEXTURE_2D, brushTexture);
   // Set the texture parameters to use a minifying filter and a linear filer (weighted average)
   glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
   // Specify a 2D texture image, providing a pointer to the image data in memory
   glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, brushData);
   // Release  the image data; it's no longer needed
            free(brushData);
  }
  
  // Set the view's scale factor
  self.contentScaleFactor = 1.0;
        
  // Setup OpenGL states
  glMatrixMode(GL_PROJECTION);
  CGRect frame = self.bounds;
  CGFloat scale = self.contentScaleFactor;
  // Setup the view port in Pixels
  glOrthof(0, frame.size.width * scale, 0, frame.size.height * scale, -1, 1);
  glViewport(0, 0, frame.size.width * scale, frame.size.height * scale);
  glMatrixMode(GL_MODELVIEW);
  glDisable(GL_DITHER);
  glEnable(GL_TEXTURE_2D);
  glEnableClientState(GL_VERTEX_ARRAY);
     glEnable(GL_BLEND);
  // Set a blending function appropriate for premultiplied alpha pixel data
  glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
  glEnable(GL_POINT_SPRITE_OES);
  glTexEnvf(GL_POINT_SPRITE_OES, GL_COORD_REPLACE_OES, GL_TRUE);
  glPointSize(width / kBrushScale);
  glColor4f(0.5f,0.1f,1.0f,1.0f);  
        // Make sure to start with a cleared buffer
  needsErase = YES;
  
  // Playback recorded path, which is "Shake Me"
  recordedPaths = [NSMutableArray arrayWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"Recording" ofType:@"data"]];
  if([recordedPaths count])
   [self performSelector:@selector(playback:) withObject:recordedPaths afterDelay:0.2];
 }
 [self setBackgroundColor:[UIColor clearColor]];
    
    //self.userInteractionEnabled = NO;
    
    return self;
    
}

-(void)layoutSubviews
{
 [EAGLContext setCurrentContext:context];
 [self destroyFramebuffer];
 [self createFramebuffer];
 
 // Clear the framebuffer the first time it is allocated
 if (needsErase) {
  [self erase];
  needsErase = NO;
 }
}

- (BOOL)createFramebuffer
{
 // Generate IDs for a framebuffer object and a color renderbuffer
 glGenFramebuffersOES(1, &viewFramebuffer);
 glGenRenderbuffersOES(1, &viewRenderbuffer);
 
 glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
 glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
 [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(id<EAGLDrawable>)self.layer];
 glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);
 
 glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
 glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);
 
 glGenRenderbuffersOES(1, &depthRenderbuffer);
 glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
 glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
 glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
 
 if(glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES)
 {
  NSLog(@"failed to make complete framebuffer object %x", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
  return NO;
 }
 return YES;
}

// Clean up any buffers we have allocated.
- (void)destroyFramebuffer
{
 glDeleteFramebuffersOES(1, &viewFramebuffer);
 viewFramebuffer = 0;
 glDeleteRenderbuffersOES(1, &viewRenderbuffer);
 viewRenderbuffer = 0;
 
 if(depthRenderbuffer)
 {
  glDeleteRenderbuffersOES(1, &depthRenderbuffer);
  depthRenderbuffer = 0;
 }
}

// Releases resources when they are no longer needed.
- (void) dealloc
{
 if (brushTexture)
 {
  glDeleteTextures(1, &brushTexture);
  brushTexture = 0;
 }
 
 if([EAGLContext currentContext] == context)
 {
  [EAGLContext setCurrentContext:nil];
 }
 
 [context release];
 [super dealloc];
}


// Erases the screen
- (void) erase
{
 [EAGLContext setCurrentContext:context];
 
 // Clear the buffer
 glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
 glClearColor(0.0, 0.0, 0.0, 0.0);
 glClear(GL_COLOR_BUFFER_BIT);
 
 // Display the buffer
 glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
 [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}


HamzaHaroon
Regular Contributor
// Draws a line onscreen based on where the user touches
- (void) renderLineFromPoint:(AGSPoint*)astart toPoint:(AGSPoint*)aend
{
 static GLfloat*  vertexBuffer = NULL;
 static NSUInteger vertexMax = 64;
 NSUInteger   vertexCount = 0, count, i;
 CGPoint start = [_mapView toScreenPoint:astart];
 CGPoint end = [_mapView toScreenPoint:aend];
   
 [EAGLContext setCurrentContext:context];
 glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
 
 // Convert locations from Points to Pixels
 CGFloat scale = self.contentScaleFactor;
 start.x *= scale;
 start.y *= scale;
 end.x *= scale;
 end.y *= scale;
 
 // Allocate vertex array buffer
 if(vertexBuffer == NULL)
  vertexBuffer = malloc(vertexMax * 2 * sizeof(GLfloat));
 
 // Add points to the buffer so there are drawing points every X pixels
 count = MAX(ceilf(sqrtf((end.x - start.x) * (end.x - start.x) + (end.y - start.y) * (end.y - start.y)) / kBrushPixelStep), 1);
 for(i = 0; i < count; ++i) {
  if(vertexCount == vertexMax) {
   vertexMax = 2 * vertexMax;
   vertexBuffer = realloc(vertexBuffer, vertexMax * 2 * sizeof(GLfloat));
  }
  
  vertexBuffer[2 * vertexCount + 0] = start.x + (end.x - start.x) * ((GLfloat)i / (GLfloat)count);
  vertexBuffer[2 * vertexCount + 1] = start.y + (end.y - start.y) * ((GLfloat)i / (GLfloat)count);
  vertexCount += 1;
 }
 
 // Render the vertex array
 glVertexPointer(2, GL_FLOAT, 0, vertexBuffer);
 glDrawArrays(GL_POINTS, 0, vertexCount);
 
 // Display the buffer
 glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
 [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}

//Returns a map point based on user touch on screen.

-(AGSPoint*)getMapPointFromTouch:(NSSet *)touches
{
    UITouch *myTouch = [[touches allObjects] objectAtIndex:0];
    CGPoint touchPoint = [myTouch locationInView:self];
    AGSPoint *mapPoint = [_mapView toMapPoint:touchPoint];
    return mapPoint;
}

// Reads previously recorded points and draws them onscreen. This is the Shake Me message that appears when the application launches.
- (void) playback:(NSMutableArray*)recordedPaths
{
 // The points now come from the sketch geometry rather than the recorded CGPoint data
 NSUInteger count = [lines numPoints], i;
 if (count < 2) return; // nothing to draw yet

 // Render the current path
 for(i = 0; i < count - 1; ++i)
  [self renderLineFromPoint:[lines pointAtIndex:i] toPoint:[lines pointAtIndex:i+1]];
 
 // Render the next path after a short delay 
 [recordedPaths removeObjectAtIndex:0];
 if([recordedPaths count])
  [self performSelector:@selector(playback:) withObject:recordedPaths afterDelay:0.0];
}

// Handles the start of a touch
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    AGSPoint* coord = [[event touchesForView:_mapView] anyObject];
 CGRect  bounds = [self bounds];
    UITouch* touch = [[event touchesForView:self] anyObject];
 firstTouch = YES;
 
    // get touch point and store are current point
    currentPoint = [self getMapPointFromTouch:touches];
    [lines addPoint: currentPoint];
}

// Handles the continuation of a touch.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (firstTouch) {
        firstTouch = NO;
        previousPoint = [currentPoint copy];
    } else {
        previousPoint = [currentPoint copy];
        currentPoint = [self getMapPointFromTouch:touches];
        [lines addPoint:currentPoint];
    }

    // Render the stroke
    [self renderLineFromPoint:previousPoint toPoint:currentPoint];
}

// Handles the end of a touch event when the touch is a tap.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (firstTouch) {
        firstTouch = NO;
        previousPoint = [currentPoint copy];
        [self renderLineFromPoint:previousPoint toPoint:currentPoint];
    }
}

// Handles the end of a touch event.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
}

- (void)setBrushColorWithRed:(CGFloat)red green:(CGFloat)green blue:(CGFloat)blue
{
    // Set the brush color using premultiplied alpha values in the 0.0-1.0 range
    glColor4f(red * kBrushOpacity, green * kBrushOpacity, blue * kBrushOpacity, kBrushOpacity);
}

@end
NimeshJarecha
Esri Regular Contributor
Yes, the map view must be properly initialized. Are you showing the OpenGL view on top of the map view? If yes, you should create a custom initializer for the OpenGL view and pass the already-initialized mapView to it.

Regards,
Nimesh
HamzaHaroon
Regular Contributor
Yes, my OpenGL view comes up on top of my mapView. I thought I was initializing my mapView in my initWithCoder: method. How should I be initializing it?
NimeshJarecha
Esri Regular Contributor
Are you initializing the OpenGL view with the initWithCoder: method? If yes, you should create a custom init method such as initWithCoder:mapView: and pass the already-initialized mapView object from the map view controller.

Regards,
Nimesh
HamzaHaroon
Regular Contributor
My painting classes are separate from the view controller. I thought that this method would initialize my mapView:
- (id)initWithCoder:(NSCoder*)coder mapView:(AGSMapView*) mapView graphicsLayer:(AGSGraphicsLayer*)graphicsLayer {
NimeshJarecha
Esri Regular Contributor
That will just pass the object. If the object is already initialized it will work; otherwise it won't. You should pass an already-initialized mapView object to that method.
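To make that concrete, here is a minimal sketch of the controller side. The initWithFrame:mapView:graphicsLayer: variant and the layer name are assumptions for illustration; the key point is simply that the AGSMapView is fully set up before it is handed to the painting view.

// MapViewController.m (sketch): create the GL overlay only after the map view is ready
- (void)viewDidLoad {
    [super viewDidLoad];

    // mapView and graphicsLayer are set up first, so they are initialized
    // before the painting view ever receives them
    [self.mapView addMapLayer:self.graphicsLayer withName:@"sketch"];

    // A nib only ever calls plain initWithCoder:, so create the painting view in code
    // through a custom initializer that takes the live objects
    PaintingView *painting = [[PaintingView alloc] initWithFrame:self.mapView.bounds
                                                         mapView:self.mapView
                                                   graphicsLayer:self.graphicsLayer];
    [self.view addSubview:painting];
    [painting release]; // MRC, matching the rest of the thread's code
}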

Regards,
Nimesh