Search Results

Search found 2589 results on 104 pages for 'ef es'.

Page 22/104 | < Previous Page | 18 19 20 21 22 23 24 25 26 27 28 29  | Next Page >

  • Should I use OpenGL for chess with animations?

    - by fhucho
    At the moment I am experimenting with SurfaceView for my chess game with animations. I am getting only about 8 FPS in the emulator. I draw a chess board and 32 chess pieces and rotate everything (to see how smooth it is), and I am using antialiasing. On the Droid I'm getting about 20 FPS, so it's not very smooth. Is it possible to implement a game with very sparse and simple animations without having to use OpenGL? This is what I do every frame: // scale and rotate matrix.setScale(scale, scale); rotation += 3; matrix.postRotate(rotation, 152, 152); canvas = surfaceHolder.lockCanvas(); canvas.setDrawFilter(new PaintFlagsDrawFilter(0, Paint.FILTER_BITMAP_FLAG)); canvas.setMatrix(matrix); canvas.drawARGB(255, 255, 255, 255); // fill the canvas with white for (int i = 0; i < sprites.size(); i++) { sprites.get(i).draw(canvas); // draws chessboard and chess pieces }
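
    One common cause of a low SurfaceView frame rate is per-frame allocation: the loop above constructs a new PaintFlagsDrawFilter on every frame, and the quoted snippet never posts the canvas back. A rough sketch of the same loop with the filter created once and the canvas always posted (field names such as surfaceHolder, sprites, matrix, scale and rotation are taken from the question; the surrounding class is assumed):

        // Sketch only: reuse per-frame objects in the SurfaceView render loop.
        private final PaintFlagsDrawFilter filter =
                new PaintFlagsDrawFilter(0, Paint.FILTER_BITMAP_FLAG); // created once, not per frame

        private void drawFrame() {
            Canvas canvas = surfaceHolder.lockCanvas();
            if (canvas == null) {
                return; // surface not ready yet
            }
            try {
                matrix.setScale(scale, scale);
                rotation += 3;
                matrix.postRotate(rotation, 152, 152);

                canvas.setDrawFilter(filter);             // reused filter object
                canvas.setMatrix(matrix);
                canvas.drawARGB(255, 255, 255, 255);      // fill with white
                for (int i = 0; i < sprites.size(); i++) {
                    sprites.get(i).draw(canvas);          // chessboard and pieces
                }
            } finally {
                surfaceHolder.unlockCanvasAndPost(canvas); // not shown in the question's snippet
            }
        }

    Bitmap filtering plus a full-canvas rotation is still expensive on a software canvas, so 20-30 FPS on a Droid is plausible even with the allocation removed; for continuous full-screen rotation OpenGL ES remains the safer bet.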

    Read the article

  • Setting ModelView matrix using rotate, translate, etc.. vs setting manual matrix

    - by guymic
    When setting the ModelView matrix you normally go through several transformations from the identity matrix, for example: glMatrixMode(GL_MODELVIEW); glLoadIdentity(); glRotatef(270.0f, 0.0f, 0.0f, 1.0f); glTranslatef(-rect.size.height / 2, -rect.size.width / 2, 0.0f); Instead of doing those operations one after the other (assume there are more than two), wouldn't it be more efficient to simply pre-calculate the resulting matrix and set the ModelView matrix to this manual matrix?
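
    The question is about fixed-function GL on iOS, but the idea is platform-neutral: compute the combined matrix once on the CPU and load it with a single call. A minimal sketch using Android's android.opengl.Matrix helpers (any 4x4 matrix library works the same way; rectW/rectH and the GL10 instance 'gl' are stand-ins for the question's values):

        // android.opengl.Matrix does the CPU-side math; 'gl' is the GL10 instance
        // passed to the renderer (assumed).
        float[] modelView = new float[16];
        float rectW = 320f, rectH = 480f;                          // stand-ins for rect.size

        Matrix.setIdentityM(modelView, 0);
        Matrix.rotateM(modelView, 0, 270.0f, 0f, 0f, 1f);          // same effect as glRotatef(270, 0, 0, 1)
        Matrix.translateM(modelView, 0, -rectH / 2f, -rectW / 2f, 0f);

        gl.glMatrixMode(GL10.GL_MODELVIEW);
        gl.glLoadMatrixf(modelView, 0);                            // one call instead of three

    In practice the driver performs the same multiplies either way, so the saving is usually small unless the transform chain is long or rebuilt for many objects per frame; the main benefit is fewer GL calls.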

    Read the article

  • Driving me INSANE: Unable to Retrieve Metadata for

    - by Loren
    I've been spending the past 3 days trying to fix this problem I'm encountering - it's driving me insane... I'm not quite sure what is causing this bug - here are the details: MVC4 + Entity Framework 4.4 + MySql + POCO/Code First I'm setting up the above configuration .. here are my classes: namespace BTD.DataContext { public class BTDContext : DbContext { public BTDContext() : base("name=BTDContext") { } protected override void OnModelCreating(DbModelBuilder modelBuilder) { base.OnModelCreating(modelBuilder); //modelBuilder.Conventions.Remove<System.Data.Entity.Infrastructure.IncludeMetadataConvention>(); } public DbSet<Product> Products { get; set; } public DbSet<ProductImage> ProductImages { get; set; } } } namespace BTD.Data { [Table("Product")] public class Product { [Key] public long ProductId { get; set; } [DisplayName("Manufacturer")] public int? ManufacturerId { get; set; } [Required] [StringLength(150)] public string Name { get; set; } [Required] [DataType(DataType.MultilineText)] public string Description { get; set; } [Required] [StringLength(120)] public string URL { get; set; } [Required] [StringLength(75)] [DisplayName("Meta Title")] public string MetaTitle { get; set; } [DataType(DataType.MultilineText)] [DisplayName("Meta Description")] public string MetaDescription { get; set; } [Required] [StringLength(25)] public string Status { get; set; } [DisplayName("Create Date/Time")] public DateTime CreateDateTime { get; set; } [DisplayName("Edit Date/Time")] public DateTime EditDateTime { get; set; } } [Table("ProductImage")] public class ProductImage { [Key] public long ProductImageId { get; set; } public long ProductId { get; set; } public long? ProductVariantId { get; set; } [Required] public byte[] Image { get; set; } public bool PrimaryImage { get; set; } public DateTime CreateDateTime { get; set; } public DateTime EditDateTime { get; set; } } } Here is my web.config setup... <connectionStrings> <add name="BTDContext" connectionString="Server=localhost;Port=3306;Database=btd;User Id=root;Password=mypassword;" providerName="MySql.Data.MySqlClient" /> </connectionStrings> The database AND tables already exist... I'm still pretty new with mvc but was using this tutorial The application builds fine.. however when I try to add a controller using Product (BTD.Data) as my model class and BTDContext (BTD.DataContext) as my data context class I receive the following error: Unable to retrieve metadata for BTD.Data.Product using the same DbCompiledModel to create context against different types of database servers is not supported. Instead, create a separate DbCompiledModel for each type of server being used. I am at a complete loss - I've scoured google with almost every different variation of that error message above I can think of but to no avail. Here are the things i can verify... MySql is working properly I'm using MySql Connector version 6.5.4 and have created other ASP.net web forms + entity framework applications with ZERO problems I have also tried including/removing this in my web.config: <system.data> <DbProviderFactories> <remove invariant="MySql.Data.MySqlClient"/> <add name="MySQL Data Provider" invariant="MySql.Data.MySqlClient" description=".Net Framework Data Provider for MySQL" type="MySql.Data.MySqlClient.MySqlClientFactory, MySql.Data, Version=6.5.4.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d" /> </DbProviderFactories> I've literally been working on this bug for days - I'm to the point now that I would be willing to pay someone to solve it.. no joke... 
I'd really love to use MVC 4 and Razor - I was so excited to get started on this, but now I'm pretty discouraged - I truly appreciate any help/guidance on this! Also note - I'm using Entity Framework from NuGet... Another note: I was using the default Visual Studio template that creates your MVC project with the account pages and other stuff. I JUST removed all references to the added files because they were trying to use the "DefaultConnection" which didn't exist - so I thought those files may be what was causing the error - however, still no luck after removing them - I just wanted to let everyone know I'm using the Visual Studio MVC project template which pre-creates a bunch of files. I will be trying to recreate this all from a blank MVC project which doesn't have those files - I will update this once I test that. Other references: It appears someone else is having the same issues I am - the only difference is they are using SQL Server - I tried tweaking all my code to follow the suggestions on this Stack Overflow question/answer here but still to no avail

    Read the article

  • Possible to change the alpha value of certain pixels on iPhone?

    - by emi1faber
    Is it possible to change just a portion of a Sprite's alpha in response to user interaction? A good example of what I mean is iFog or iSteam, where the user can wipe "steam" off the iPhone's screen. Swapping images out wouldn't be feasible due to the sheer number of possibilities where the user could touch and move... For example, say you have a simple app that has a brick wall in the background that has graffiti on it, so there'd be two sprites, one of the brick wall, then one of the graffiti that has a higher z value than the brick wall. Then, based upon where the user touches (assuming their touch controls a sandblaster), some of the graffiti should be removed, but not all of it, which could be accomplished by changing the alpha value on a portion of the graffiti sprite. Is there any way to do this in cocos2d-iphone? Or, do I need to drop down into openGL, and if so, where would be a good place to start my search on how to accomplish this? Ideally, I'd like to accomplish this on a cocos2d-iphone Sprite, but if it's not possible, where's the best place to start looking? Thanks in advance, Ben
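
    The usual trick for this is to render the graffiti layer into its own texture (cocos2d-iphone's CCRenderTexture wraps this) and then draw the "sandblaster" brush into that texture with a blend function that knocks alpha out rather than adding colour. The GL ES 1.x calls are the same on both platforms; sketched below through Android's static bindings purely for consistency with the other examples on this page, with drawBrushQuadAt, touchX and touchY as hypothetical names:

        // Assumes the graffiti sprite's contents live in a render texture / FBO
        // and that we are currently rendering into that texture.
        GLES10.glEnable(GLES10.GL_BLEND);
        // dst = src*0 + dst*(1 - srcAlpha): wherever the brush has alpha,
        // the graffiti's colour and alpha are scaled towards zero (erased).
        GLES10.glBlendFunc(GLES10.GL_ZERO, GLES10.GL_ONE_MINUS_SRC_ALPHA);
        drawBrushQuadAt(touchX, touchY);   // hypothetical helper that draws the brush quad at the touch point

    When the render texture is later composited over the brick wall with normal alpha blending, the erased regions show the wall through.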

    Read the article

  • Smoothing touch-based animation in iPhone OpenGL?

    - by quixoto
    I know this is vague, but I'm looking for general tips/help on this, as it's not an area of significant expertise for me. I have some iPhone code that's basically an EAGL view handling a single touch. The app draws (using GL) a circle via triangle fan at the touch point, moves it when the user moves the touch point, and then re-renders the view. When dragging a finger slowly, the circle keeps up and stays consistent with the finger as it moves. If I scribble my finger quickly back and forth across the screen, the rendering doesn't keep up with the touch motion, so you see an optical illusion of "multiple" discrete circles on the screen "at once". (Normal persistence-of-vision illusion). This optical illusion is jarring. How can I make this look more natural? Can I blur the motion of the circle somehow? Is this result the evidence of some bad frame rate issue? I see this artifact even when nothing else is being rendered, so I think this might just be as fast as we can go. Any hints or suggestions? Much appreciated. Thanks.

    Read the article

  • OpenGL paint program based on Apple's 'glPaint' on a white background - how to blend?

    - by Adam
    Trying to write a simple paint program for iPhone, and I'm using Apple's glPaint sample as a guide. The only problem is, painting doesn't work on a white background, since white + colour = white. I've tried different blending functions, but haven't been able to hit on the right combination of settings and/or brushes to make this work. I've seen similar posts about this problem but no answers. Does anyone know how this might work?
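
    One thing worth checking: if the sample's additive blend (glBlendFunc(GL_SRC_ALPHA, GL_ONE)) is still in place, strokes can only brighten the framebuffer, so on a white background they vanish. Switching to standard "over" blending, with the brush colour in RGB and the brush shape in alpha, is the usual fix. The calls are sketched here through Android's GL ES 1.x static bindings only for consistency with the other examples on this page; the identical C calls apply on iOS:

        GLES10.glEnable(GLES10.GL_BLEND);
        // dst = src*srcAlpha + dst*(1 - srcAlpha): strokes interpolate toward the
        // brush colour instead of saturating toward white.
        GLES10.glBlendFunc(GLES10.GL_SRC_ALPHA, GLES10.GL_ONE_MINUS_SRC_ALPHA);

    A brush texture that encodes its shape only in RGB (a white blob on black) will still disappear; the falloff has to live in the alpha channel for this blend mode to work.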

    Read the article

  • Why does my performance increase when touching the screen?

    - by Smills
    For some reason my FPS jumps up considerably when I move my mouse around on the screen (on the emulator) while holding the left mouse button. Normally my game is very laggy, but if I touch the screen (and as long as I am moving the mouse around while touching) it goes perfectly smooth. I have tried sleeping for 20ms in the onTouchEvent, but it doesn't appear to make any difference. Here is the code I use in my onTouchEvent: // events when touching the screen public boolean onTouchEvent(MotionEvent event) { int eventaction = event.getAction(); touchX=event.getX(); touchY=event.getY(); switch (eventaction) { case MotionEvent.ACTION_DOWN: { touch=true; } break; case MotionEvent.ACTION_MOVE: { } break; case MotionEvent.ACTION_UP: { touch=false; } break; } /*try { AscentThread.sleep(20); } catch (InterruptedException e) { // TODO Auto-generated catch block e.printStackTrace(); }*/ return true; } In the logcat log, FPS is the current fps (average of the last 20 frames), touch is whether or not the screen is being touched (from onTouchEvent). What on earth is going on? Has anyone else had this odd behaviour before? Logcat log: 12-21 19:43:26.154: INFO/myActivity(786): FPS: 31.686569159606414 Touch: false 12-21 19:43:27.624: INFO/myActivity(786): FPS: 19.46310293212206 Touch: false 12-21 19:43:29.104: INFO/myActivity(786): FPS: 18.801202175690467 Touch: false 12-21 19:43:30.514: INFO/myActivity(786): FPS: 21.118295877408478 Touch: false 12-21 19:43:31.985: INFO/myActivity(786): FPS: 19.117397812958878 Touch: false 12-21 19:43:33.534: INFO/myActivity(786): FPS: 15.572571858239263 Touch: false 12-21 19:43:34.934: INFO/myActivity(786): FPS: 20.584119901503506 Touch: false 12-21 19:43:36.404: INFO/myActivity(786): FPS: 18.888025905454207 Touch: false 12-21 19:43:37.814: INFO/myActivity(786): FPS: 22.35722329083629 Touch: false 12-21 19:43:39.353: INFO/myActivity(786): FPS: 15.73604859775362 Touch: false 12-21 19:43:40.763: INFO/myActivity(786): FPS: 20.912449882754633 Touch: false 12-21 19:43:42.233: INFO/myActivity(786): FPS: 18.785278388997718 Touch: false 12-21 19:43:43.634: INFO/myActivity(786): FPS: 20.1357397209596 Touch: false 12-21 19:43:45.043: INFO/myActivity(786): FPS: 21.961138432007957 Touch: false 12-21 19:43:46.453: INFO/myActivity(786): FPS: 22.167196852834273 Touch: false 12-21 19:43:47.854: INFO/myActivity(786): FPS: 22.207318228024274 Touch: false 12-21 19:43:49.264: INFO/myActivity(786): FPS: 22.36980559230175 Touch: false 12-21 19:43:50.604: INFO/myActivity(786): FPS: 23.587638823252547 Touch: false 12-21 19:43:52.073: INFO/myActivity(786): FPS: 19.233902040593076 Touch: false 12-21 19:43:53.624: INFO/myActivity(786): FPS: 15.542190150440987 Touch: false 12-21 19:43:55.034: INFO/myActivity(786): FPS: 20.82290063974675 Touch: false 12-21 19:43:56.436: INFO/myActivity(786): FPS: 21.975282007207717 Touch: false 12-21 19:43:57.914: INFO/myActivity(786): FPS: 18.786927284103687 Touch: false 12-21 19:43:59.393: INFO/myActivity(786): FPS: 18.96879004217992 Touch: false 12-21 19:44:00.625: INFO/myActivity(786): FPS: 28.367566618064878 Touch: false 12-21 19:44:02.113: INFO/myActivity(786): FPS: 19.04441528684418 Touch: false 12-21 19:44:03.585: INFO/myActivity(786): FPS: 18.807837511809065 Touch: false 12-21 19:44:04.993: INFO/myActivity(786): FPS: 21.134330284993418 Touch: false 12-21 19:44:06.275: INFO/myActivity(786): FPS: 27.209688764079907 Touch: false 12-21 19:44:07.753: INFO/myActivity(786): FPS: 19.055894653261653 Touch: false 12-21 19:44:09.163: INFO/myActivity(786): FPS: 
22.05422794901088 Touch: false 12-21 19:44:10.644: INFO/myActivity(786): FPS: 18.6956805300596 Touch: false 12-21 19:44:12.124: INFO/myActivity(786): FPS: 17.434180581311054 Touch: false 12-21 19:44:13.594: INFO/myActivity(786): FPS: 18.71932038510891 Touch: false 12-21 19:44:14.504: INFO/myActivity(786): FPS: 40.94571503868066 Touch: true 12-21 19:44:14.924: INFO/myActivity(786): FPS: 57.061200121138576 Touch: true 12-21 19:44:15.364: INFO/myActivity(786): FPS: 62.54377946377936 Touch: true 12-21 19:44:15.764: INFO/myActivity(786): FPS: 64.05005071818726 Touch: true 12-21 19:44:16.384: INFO/myActivity(786): FPS: 50.912951172948155 Touch: true 12-21 19:44:16.874: INFO/myActivity(786): FPS: 55.31242053078078 Touch: true 12-21 19:44:17.364: INFO/myActivity(786): FPS: 59.31625410615102 Touch: true 12-21 19:44:18.413: INFO/myActivity(786): FPS: 36.63504170925923 Touch: false 12-21 19:44:19.885: INFO/myActivity(786): FPS: 18.099130467755923 Touch: false 12-21 19:44:21.363: INFO/myActivity(786): FPS: 18.458978222946566 Touch: false 12-21 19:44:22.683: INFO/myActivity(786): FPS: 25.582179409330823 Touch: true 12-21 19:44:23.044: INFO/myActivity(786): FPS: 60.99865521942455 Touch: true 12-21 19:44:23.403: INFO/myActivity(786): FPS: 74.17873975470984 Touch: true 12-21 19:44:23.763: INFO/myActivity(786): FPS: 64.25663040460714 Touch: true 12-21 19:44:24.113: INFO/myActivity(786): FPS: 62.47483457826921 Touch: true 12-21 19:44:24.473: INFO/myActivity(786): FPS: 65.27969529547072 Touch: true 12-21 19:44:24.825: INFO/myActivity(786): FPS: 67.84743115273311 Touch: true 12-21 19:44:25.173: INFO/myActivity(786): FPS: 73.50854551357706 Touch: true 12-21 19:44:25.523: INFO/myActivity(786): FPS: 70.46432534585368 Touch: true 12-21 19:44:25.873: INFO/myActivity(786): FPS: 69.04076953445896 Touch: true
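
    Hard to say what the emulator is doing without profiling, but one robust fix is to stop the frame rate from depending on the input stream at all: drive drawing from a dedicated thread with its own frame pacing instead of from whatever is waking the UI/event loop. A rough sketch (class and method names are assumptions, not from the question):

        class RenderThread extends Thread {
            private static final long FRAME_MS = 16;        // ~60 FPS target
            private final SurfaceHolder holder;
            private volatile boolean running = true;

            RenderThread(SurfaceHolder holder) { this.holder = holder; }

            @Override
            public void run() {
                while (running) {
                    long start = System.currentTimeMillis();
                    Canvas canvas = holder.lockCanvas();
                    if (canvas != null) {
                        try {
                            updateAndDraw(canvas);           // game update + drawing, assumed
                        } finally {
                            holder.unlockCanvasAndPost(canvas);
                        }
                    }
                    long elapsed = System.currentTimeMillis() - start;
                    if (elapsed < FRAME_MS) {
                        try { Thread.sleep(FRAME_MS - elapsed); } catch (InterruptedException ignored) { }
                    }
                }
            }

            void requestStop() { running = false; }

            private void updateAndDraw(Canvas canvas) { /* game-specific, assumed */ }
        }

    With a loop like this, onTouchEvent only needs to record touchX/touchY/touch (which the code above already does), and sleeping inside the touch handler becomes unnecessary.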

    Read the article

  • iPhone OpenGL ES models

    - by Gedeon
    Hi everybody. What is the best software for creating models, textures, etc. for iPhone development, from the simplest to more complex programs? The first thing that comes to my mind is Blender, but I'm curious what everybody else is using and their opinions.

    Read the article

  • Getting the MODELVIEW matrix...

    - by james.ingham
    Hi, I've been pulling my hair out trying to get some matrix calculations working properly and started to wonder. If I have the following: glPushMatrix(); float m[16]; glGetFloatv(GL_MODELVIEW_MATRIX, m); glPopMatrix(); What should I expect the values of m to equal? Currently I'm getting these values and I'm confused as to where they're coming from: -1, 0, 0, 0, 0, -0.6139, 0.7893522, 0, 0, 0.789352238, 0.61394, 0, 0, 0.0955992, -1.344529, 1, I'm assuming there is something which affects this, but I'm not sure what. Could anyone help? I've tried changing pretty much everything, but every time I push the matrix stack I get this matrix straight away! I don't think this makes a difference, but I'm using OpenGL ES. Thanks
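
    glGetFloatv returns whatever is currently on top of the model-view stack, and glPushMatrix only duplicates that top, so any camera or view transform applied earlier in the frame (the values above look like a rotation plus a small translation) is still in there. If the goal is to know the matrix rather than read it back, one option is to build it on the CPU and load it explicitly, so its contents are never in doubt. A sketch with Android's android.opengl.Matrix; the same bookkeeping works with any 4x4 maths on iOS ('gl' is the GL10 instance, assumed):

        float[] modelView = new float[16];

        void setCamera(GL10 gl, float angleDeg, float tx, float ty, float tz) {
            Matrix.setIdentityM(modelView, 0);
            Matrix.rotateM(modelView, 0, angleDeg, 0f, 0f, 1f);
            Matrix.translateM(modelView, 0, tx, ty, tz);

            gl.glMatrixMode(GL10.GL_MODELVIEW);
            gl.glLoadMatrixf(modelView, 0);   // GL now holds exactly the values in modelView
        }

    To read back the identity you would have to call glLoadIdentity first; push/pop alone never resets the stack.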

    Read the article

  • iPhone OpenGL ES performance tuning

    - by genesys
    Hey there! I've been trying for quite a while now to optimize the frame rate of my game without really making progress. I'm running on the newest iPhone SDK and have an iPhone 3G 3.1.2 device. I invoke around 150 draw calls, rendering about 1900 triangles in total (all objects are textured using two texture layers and multitexturing; most textures come from the same texture atlas stored as a PVRTC 2bpp compressed texture). This renders on my phone at around 30 fps, which seems way too low for only 1900 triangles. I have tried many things to optimize performance, including batching the objects together, transforming the vertices on the CPU and rendering them in a single draw call. This yields 8 draw calls (as opposed to 150), but performance is about the same (fps drops to around 26). I'm using 32-byte vertices stored in an interleaved array (12 bytes position, 12 bytes normals, 8 bytes uv). I'm rendering triangle lists and the vertices are ordered in tri-strip order. I did some profiling but I don't really know how to interpret it. Instruments with the Sampling instrument gives this result: http://neo.cycovery.com/instruments_sampling.gif telling me that a lot of time is spent in "mach_msg_trap". I googled it and it seems this function is called in order to wait for something else. But wait for what?? Instruments with the OpenGL ES module gives this result: http://neo.cycovery.com/intstruments_openglES_debug.gif but here I really have no idea what those numbers are telling me. Profiling with Shark didn't tell me much either: http://neo.cycovery.com/shark_profile_release.gif - the largest number is 10%, spent in DrawTriangles, and the rest is spent in functions with very small percentages. Can anyone tell me what else I could do to figure out the bottleneck, or help me interpret this profiling information? Thanks a lot!

    Read the article

  • How to stop OpenGL from applying blending to certain content? (cocos2d/iPhone/OpenGL)

    - by RexOnRoids
    Supporting Info: I use cocos2d to draw a sprite (graph background) on the screen (z:-1). I then use cocos2d to draw lines/points (z:0) on top of the background -- and make some calls to OpenGL blending functions before the drawing to SMOOTH out the lines. Problem: The problem is that: aside from producing smooth lines/points, calling these OpenGL blending functions seems to degrade the underlying sprite (graph background). So there is a tradeoff: I can either have (Case 1) a nice background and choppy lines/points, or I can have (Case 2) nice smooth lines/points and a degraded background. But obviously I need both. The Code: I have included code of the draw() method of the CCLayer for both cases explained above. As you can see, the code producing the difference between Case 1 and Case 2 seems to be 1 or 2 lines involving OpenGL Blending. Case 1 -- MainScene.h (CCLayer): -(void)draw{ int lastPointX = 0; int lastPointY = 0; GLfloat colorMAX = 255.0f; GLfloat valR; GLfloat valG; GLfloat valB; if([self.myGraphManager ready]){ valR = (255.0f/colorMAX)*1.0f; valG = (255.0f/colorMAX)*1.0f; valB = (255.0f/colorMAX)*1.0f; NSEnumerator *enumerator = [[self.myGraphManager.currentCanvas graphPoints] objectEnumerator]; GraphPoint* object; while ((object = [enumerator nextObject])) { if(object.filled){ /*Commenting out the following two lines induces a problem of making it impossible to have smooth lines/points, but has merit in that it does not degrade the background sprite.*/ //glEnable (GL_BLEND); //glBlendFunc (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); glHint (GL_LINE_SMOOTH_HINT, GL_DONT_CARE); glEnable (GL_LINE_SMOOTH); glLineWidth(1.5f); glColor4f(valR, valG, valB, 1.0); ccDrawLine(ccp(lastPointX, lastPointY), ccp(object.position.x, object.position.y)); lastPointX = object.position.x; lastPointY = object.position.y; glPointSize(3.0f); glEnable(GL_POINT_SMOOTH); glHint(GL_POINT_SMOOTH_HINT, GL_NICEST); ccDrawPoint(ccp(lastPointX, lastPointY)); } } } } Case 2 -- MainScene.h (CCLayer): -(void)draw{ int lastPointX = 0; int lastPointY = 0; GLfloat colorMAX = 255.0f; GLfloat valR; GLfloat valG; GLfloat valB; if([self.myGraphManager ready]){ valR = (255.0f/colorMAX)*1.0f; valG = (255.0f/colorMAX)*1.0f; valB = (255.0f/colorMAX)*1.0f; NSEnumerator *enumerator = [[self.myGraphManager.currentCanvas graphPoints] objectEnumerator]; GraphPoint* object; while ((object = [enumerator nextObject])) { if(object.filled){ /*Enabling the following two lines gives nice smooth lines/points, but has a problem in that it degrades the background sprite.*/ glEnable (GL_BLEND); glBlendFunc (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); glHint (GL_LINE_SMOOTH_HINT, GL_DONT_CARE); glEnable (GL_LINE_SMOOTH); glLineWidth(1.5f); glColor4f(valR, valG, valB, 1.0); ccDrawLine(ccp(lastPointX, lastPointY), ccp(object.position.x, object.position.y)); lastPointX = object.position.x; lastPointY = object.position.y; glPointSize(3.0f); glEnable(GL_POINT_SMOOTH); glHint(GL_POINT_SMOOTH_HINT, GL_NICEST); ccDrawPoint(ccp(lastPointX, lastPointY)); } } } }
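
    A plausible culprit is GL state leaking out of the custom draw: cocos2d renders its sprites with premultiplied-alpha textures and its own blend function, so leaving GL_BLEND set to GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA (and a non-white glColor4f) at the end of draw changes how the background sprite is composited afterwards. Restoring the state the sprite pass expects, right after the smoothed lines and points are drawn, is usually enough. Sketched below with Android's static GL ES 1.x bindings only for consistency with the other examples here; the identical calls exist in cocos2d-iphone, whose default sprite blend for premultiplied textures is GL_ONE / GL_ONE_MINUS_SRC_ALPHA:

        // custom smoothed drawing (the ccDrawLine / ccDrawPoint section above)
        GLES10.glEnable(GLES10.GL_BLEND);
        GLES10.glBlendFunc(GLES10.GL_SRC_ALPHA, GLES10.GL_ONE_MINUS_SRC_ALPHA);
        drawSmoothedLinesAndPoints();                      // stand-in for the loop in draw()

        // restore what cocos2d's sprite rendering expects
        GLES10.glBlendFunc(GLES10.GL_ONE, GLES10.GL_ONE_MINUS_SRC_ALPHA);
        GLES10.glColor4f(1f, 1f, 1f, 1f);                  // glColor4f also sticks between draws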

    Read the article

  • Will OpenGL give me any FPS improvement over CoreAnimation for scrolling a large image?

    - by Ben Roberts
    Hi, I'm considering re-writing the menu system of my iPhone app to use Open GL just to improve the smoothness of scrolling a big image (480x1900px) across the screen. I'm looking at doing this as a way to improve on using the method/solution as described here (http://stackoverflow.com/questions/1443140/smoother-uiview). This solution was a big improvement over the previous implementation but it's still not perfect and as this is the first thing the user will see I'd like it to be as flawless as possible. Will switching to OpenGL give me the sort of smooth scrolling I'm looking for? I've stayed clear of OpenGL until now as this is my first app and core animation has handled everything else I've thrown at it well enough, would be good to know if this alternative implementation is likely to work! thanks

    Read the article

  • Android: 2D. OpenGL or android.graphics?

    - by DroidIn.net
    I'm working with my friend on our first Android game. The basic idea is that every frame the whole surface is redrawn (1 large bitmap) and then sprinkled all over with a large number of particles to produce an effect of soapy bubbles; there's a pool of about 20 bitmaps that get picked at random to create the illusion that all the bubbles (between 200 and 300) are different. The math engine is in C (JNI) and currently all drawing is done using the android.graphics package, very similar (since that was the example I was using) to Lunar Lander. It works, but the animation is somewhat jerky and I can tell by the temperature of my phone that it is very busy. Will we benefit from switching to OpenGL? And as a bonus question: what would be a good way to optimize the drawing mechanism (Lunar Lander-like) we have now?
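
    Whether OpenGL helps depends mostly on fill rate: a full-screen background plus 200-300 blended bubbles is a lot of overdraw on either API. Before switching, it is worth squeezing the Canvas path: pre-scale the bubble bitmaps so drawBitmap does no resampling, reuse a single Paint, and allocate nothing inside the frame loop. A rough sketch (Bubble, bubbles and background are assumed names for the question's data):

        private final Paint paint = new Paint(Paint.FILTER_BITMAP_FLAG);  // one Paint, reused

        private void drawFrame(Canvas canvas) {
            canvas.drawBitmap(background, 0, 0, null);        // the 1 large bitmap, unscaled
            for (int i = 0; i < bubbles.size(); i++) {        // indexed loop: no iterator garbage
                Bubble b = bubbles.get(i);
                canvas.drawBitmap(b.bitmap, b.x, b.y, paint); // bitmap already at its final size
            }
        }

    If that still runs hot, drawing the bubbles as textured quads from a GLSurfaceView generally scales much better, since the blending and scaling move to the GPU.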

    Read the article

  • Difference between GL10 and GLES10 on Android

    - by kayahr
    The GLSurfaceView.Renderer interface of the Android SDK gives me a GL interface as a parameter which has the type GL10. This interface is implemented by some private internal JNI wrapper class. But there is also the class GLES10, where all the GL methods are available as static methods. Is there an important difference between them? So what if I ignore the gl parameter of onDrawFrame and instead use the static methods of GLES10 everywhere? Here is an example. Instead of doing this: void onDrawFrame(GL10 gl) { drawSomething(gl); } void drawSomething(GL10 gl) { gl.glLoadIdentity(); ... } I could do this: void onDrawFrame(GL10 gl) { drawSomething(); } void drawSomething() { GLES10.glLoadIdentity(); ... } The advantage is that I don't have to pass the GL context to all called methods. But even if it works (and it does work, I tried it) I wonder if there are any disadvantages and reasons NOT to do it like that.

    Read the article

  • AndEngine VS Android's Canvas VS OpenGLES - For rendering a 2D indoor vector map

    - by Orchestrator
    This is a big issue that I've been trying to figure out for a long time. I'm working on an application that should include a 2D vector indoor map. The map will be drawn from an .svg file that specifies all the data for the lines, curved lines (paths) and rectangles that should be drawn. My main requirements for the map are: support touch events to detect where exactly the finger is touching; great image quality, especially for the drawing of curved and diagonal lines (anti-aliasing); and, optional but very nice to have, a built-in ability to zoom, pan and rotate. So far I have tried AndEngine and Android's Canvas. With AndEngine I had trouble implementing anti-aliasing for rendering smooth diagonal lines or drawing curved lines, and as far as I understand, this is not an easy thing to implement in AndEngine. Though I have to mention that AndEngine's ability to zoom and pan with the camera, instead of modifying the objects on the screen, was really nice to have. I also have a little experience with the built-in Android Canvas, mainly for viewing simple bitmaps, but I'm not sure if it supports all of these things, and especially whether it would provide smooth results. Last but not least, there's the option of plain OpenGL ES 1 or 2, which, as far as I understand, with enough work should be able to support all the features I require. However, it seems like something that would be hard to implement, and I've never programmed in OpenGL or anything like it, though I'm very willing to learn. To sum it all up, I need a platform that gives me the ability to do the 3 things I mentioned before, but also, very importantly, lets me implement this feature as fast as possible. Any kind of answer or suggestion would be very much welcomed, as I'm very eager to solve this problem! Thanks!
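
    If the Canvas output quality is acceptable, all three requirements can be covered by android.graphics alone: Path gives anti-aliased lines and Bézier curves, a Matrix gives zoom/pan/rotate, and inverting that Matrix maps touches back into map coordinates for hit testing. A rough sketch of that route (turning the .svg data into Path objects is assumed to happen elsewhere; the class name is made up):

        import android.content.Context;
        import android.graphics.Canvas;
        import android.graphics.Matrix;
        import android.graphics.Paint;
        import android.graphics.Path;
        import android.view.MotionEvent;
        import android.view.View;
        import java.util.ArrayList;
        import java.util.List;

        public class IndoorMapView extends View {
            private final List<Path> shapes = new ArrayList<>();     // built from the .svg
            private final Paint outline = new Paint(Paint.ANTI_ALIAS_FLAG);
            private final Matrix viewMatrix = new Matrix();          // zoom/pan/rotate

            public IndoorMapView(Context context) {
                super(context);
                outline.setStyle(Paint.Style.STROKE);
                outline.setStrokeWidth(2f);
            }

            @Override
            protected void onDraw(Canvas canvas) {
                canvas.save();
                canvas.concat(viewMatrix);                // apply zoom/pan/rotate in one place
                for (Path p : shapes) {
                    canvas.drawPath(p, outline);          // anti-aliased curves and diagonals
                }
                canvas.restore();
            }

            @Override
            public boolean onTouchEvent(MotionEvent e) {
                Matrix inverse = new Matrix();
                if (viewMatrix.invert(inverse)) {
                    float[] pt = { e.getX(), e.getY() };
                    inverse.mapPoints(pt);                // pt is now in map coordinates
                    // hit-test pt[0], pt[1] against the shapes here
                }
                return true;
            }
        }

    Plain OpenGL ES buys more headroom, but for a few hundred mostly static paths the Canvas route is typically the fastest to get working.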

    Read the article

  • code first CTP5 error message

    - by user482833
    I get the following error message with a new project I have set up using Code First CTP5. I can't find anything on the web about it. Has anyone encountered this error message? "The context cannot be used while the model is being created." This occurs the first time my database context is called (code below): using (StaffData context = new StaffData()) { return context.Employees.Count(e => e.EmployeeReference) == 1; } At this point the database has not been created. I have a database initialiser, DropCreateDatabaseIfModelChanges, which I set in app_start.

    Read the article

  • Draw a line over UIViewController

    - by ghiboz
    Hi all, I have my iPhone app with a UIViewController with some stuff inside: image, textbox, etc. Is there a way to draw a line or something like that using OpenGL directly inside the UIViewController? Thanks in advance

    Read the article

  • How to overlay GLSurfaceView over a MapView in Android?

    - by Rajapandian
    Hi, I want to create a simple map-based application in Android where I can display my current position. Instead of overlaying a simple image on the MapView to represent the position, I want to overlay a GLSurfaceView on the MapView, but I don't know how to achieve this. Is there any way to do that? If anybody knows the solution, please help me.
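
    The usual approach is to stack the two views in a FrameLayout and make the GLSurfaceView translucent so the MapView shows through. A rough sketch (mapView and MarkerRenderer are stand-in names; inside the renderer, clear with an alpha of 0 so only your GL content is opaque):

        // inside the MapActivity's onCreate (mapView created as usual)
        GLSurfaceView glView = new GLSurfaceView(this);
        glView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);         // RGBA8888 config with a depth buffer
        glView.getHolder().setFormat(PixelFormat.TRANSLUCENT); // translucent surface
        glView.setZOrderOnTop(true);                           // composite the GL surface above the map
        glView.setRenderer(new MarkerRenderer());              // your Renderer; use glClearColor(0, 0, 0, 0)

        FrameLayout root = new FrameLayout(this);
        root.addView(mapView);                                 // the map underneath
        root.addView(glView);                                  // GL overlay on top
        setContentView(root);

    Note that setZOrderOnTop places the GL surface above the whole window, so ordinary View widgets cannot be drawn on top of it.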

    Read the article

  • iPad + OpenGL ES2. Why the Puzzling Virtual Memory Spike During Device Reorientation?

    - by dugla
    I've been spending the afternoon staring at the Xcode Instruments memory monitor trying to decipher the following memory issue. I have a fullscreen OpenGL ES2 app running on iPad. I am fanatical about memory issues, so my retains/releases are all nicely balanced. I closely monitor memory leaks. My app is basically squeaky clean. Except occasionally when I reorient the device. Portrait to landscape, back and forth I rock the device, stress testing my discarding and rebuilding of the OpenGL framebuffer. The ambient memory footprint of my app is about 70 MB real memory and 180 MB virtual memory. Real memory hardly varies at all during device rotations. However, the virtual memory reading sometimes briefly spikes up to 250 MB and then recedes back to 180 MB. No real pattern, but it is clearly related to discarding/rebuilding the framebuffer. I see random memory warnings in my NSLog output but the app just hums along, no worries. 1) Since iPhone OS devices don't have VM, could someone explain to me what the VM reading actually means? 2) My app is totally leak-free and generally bulletproof despite the VM spikes. Never crashes. Rock solid. Should I be concerned about this? 3) There is clearly something happening in OpenGL framebuffer land that is causing this, but I am using the API in the proper way, paraphrasing: Discarding the framebuffer: glDeleteRenderbuffers(1, &m_colorbuffer); glDeleteFramebuffers(1, &m_framebuffer); Rebuilding the framebuffer: glGenFramebuffers(1, &m_framebuffer); glGenRenderbuffers(1, &m_colorbuffer); Is there some other memory flushing trick I have missed? Thanks for any insight. Cheers, Doug

    Read the article

  • How to use onSensorChanged sensor data in combination with OpenGL

    - by Sponge
    I have written a TestSuite to find out how to calculate the rotation angles from the data you get in SensorEventListener.onSensorChanged(). I really hope you can complete my solution to help people who will have the same problems like me. Here is the code, i think you will understand it after reading it. Feel free to change it, the main idea was to implement several methods to send the orientation angles to the opengl view or any other target which would need it. method 1 to 4 are working, they are directly sending the rotationMatrix to the OpenGl view. all other methods are not working or buggy and i hope someone knows to get them working. i think the best method would be method 5 if it would work, because it would be the easiest to understand but i'm not sure how efficient it is. the complete code isn't optimized so i recommend to not use it as it is in your project. here it is: import java.nio.ByteBuffer; import java.nio.ByteOrder; import java.nio.FloatBuffer; import javax.microedition.khronos.egl.EGL10; import javax.microedition.khronos.egl.EGLConfig; import javax.microedition.khronos.opengles.GL10; import static javax.microedition.khronos.opengles.GL10.*; import android.app.Activity; import android.content.Context; import android.content.pm.ActivityInfo; import android.hardware.Sensor; import android.hardware.SensorEvent; import android.hardware.SensorEventListener; import android.hardware.SensorManager; import android.opengl.GLSurfaceView; import android.opengl.GLSurfaceView.Renderer; import android.os.Bundle; import android.util.Log; import android.view.WindowManager; /** * This class provides a basic demonstration of how to use the * {@link android.hardware.SensorManager SensorManager} API to draw a 3D * compass. */ public class SensorToOpenGlTests extends Activity implements Renderer, SensorEventListener { private static final boolean TRY_TRANSPOSED_VERSION = false; /* * MODUS overview: * * 1 - unbufferd data directly transfaired from the rotation matrix to the * modelview matrix * * 2 - buffered version of 1 where both acceleration and magnetometer are * buffered * * 3 - buffered version of 1 where only magnetometer is buffered * * 4 - buffered version of 1 where only acceleration is buffered * * 5 - uses the orientation sensor and sets the angles how to rotate the * camera with glrotate() * * 6 - uses the rotation matrix to calculate the angles * * 7 to 12 - every possibility how the rotationMatrix could be constructed * in SensorManager.getRotationMatrix (see * http://www.songho.ca/opengl/gl_anglestoaxes.html#anglestoaxes for all * possibilities) */ private static int MODUS = 2; private GLSurfaceView openglView; private FloatBuffer vertexBuffer; private ByteBuffer indexBuffer; private FloatBuffer colorBuffer; private SensorManager mSensorManager; private float[] rotationMatrix = new float[16]; private float[] accelGData = new float[3]; private float[] bufferedAccelGData = new float[3]; private float[] magnetData = new float[3]; private float[] bufferedMagnetData = new float[3]; private float[] orientationData = new float[3]; // private float[] mI = new float[16]; private float[] resultingAngles = new float[3]; private int mCount; final static float rad2deg = (float) (180.0f / Math.PI); private boolean mirrorOnBlueAxis = false; private boolean landscape; public SensorToOpenGlTests() { } /** Called with the activity is first created. 
*/ @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE); openglView = new GLSurfaceView(this); openglView.setRenderer(this); setContentView(openglView); } @Override protected void onResume() { // Ideally a game should implement onResume() and onPause() // to take appropriate action when the activity looses focus super.onResume(); openglView.onResume(); if (((WindowManager) getSystemService(WINDOW_SERVICE)) .getDefaultDisplay().getOrientation() == 1) { landscape = true; } else { landscape = false; } mSensorManager.registerListener(this, mSensorManager .getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_GAME); mSensorManager.registerListener(this, mSensorManager .getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD), SensorManager.SENSOR_DELAY_GAME); mSensorManager.registerListener(this, mSensorManager .getDefaultSensor(Sensor.TYPE_ORIENTATION), SensorManager.SENSOR_DELAY_GAME); } @Override protected void onPause() { // Ideally a game should implement onResume() and onPause() // to take appropriate action when the activity looses focus super.onPause(); openglView.onPause(); mSensorManager.unregisterListener(this); } public int[] getConfigSpec() { // We want a depth buffer, don't care about the // details of the color buffer. int[] configSpec = { EGL10.EGL_DEPTH_SIZE, 16, EGL10.EGL_NONE }; return configSpec; } public void onDrawFrame(GL10 gl) { // clear screen and color buffer: gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT); // set target matrix to modelview matrix: gl.glMatrixMode(GL10.GL_MODELVIEW); // init modelview matrix: gl.glLoadIdentity(); // move camera away a little bit: if ((MODUS == 1) || (MODUS == 2) || (MODUS == 3) || (MODUS == 4)) { if (landscape) { // in landscape mode first remap the rotationMatrix before using // it with glMultMatrixf: float[] result = new float[16]; SensorManager.remapCoordinateSystem(rotationMatrix, SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, result); gl.glMultMatrixf(result, 0); } else { gl.glMultMatrixf(rotationMatrix, 0); } } else { //in all other modes do the rotation by hand: gl.glRotatef(resultingAngles[1], 1, 0, 0); gl.glRotatef(resultingAngles[2], 0, 1, 0); gl.glRotatef(resultingAngles[0], 0, 0, 1); if (mirrorOnBlueAxis) { //this is needed for mode 6 to work gl.glScalef(1, 1, -1); } } //move the axis to simulate augmented behaviour: gl.glTranslatef(0, 2, 0); // draw the 3 axis on the screen: gl.glVertexPointer(3, GL_FLOAT, 0, vertexBuffer); gl.glColorPointer(4, GL_FLOAT, 0, colorBuffer); gl.glDrawElements(GL_LINES, 6, GL_UNSIGNED_BYTE, indexBuffer); } public void onSurfaceChanged(GL10 gl, int width, int height) { gl.glViewport(0, 0, width, height); float r = (float) width / height; gl.glMatrixMode(GL10.GL_PROJECTION); gl.glLoadIdentity(); gl.glFrustumf(-r, r, -1, 1, 1, 10); } public void onSurfaceCreated(GL10 gl, EGLConfig config) { gl.glDisable(GL10.GL_DITHER); gl.glClearColor(1, 1, 1, 1); gl.glEnable(GL10.GL_CULL_FACE); gl.glShadeModel(GL10.GL_SMOOTH); gl.glEnable(GL10.GL_DEPTH_TEST); gl.glEnableClientState(GL10.GL_VERTEX_ARRAY); gl.glEnableClientState(GL10.GL_COLOR_ARRAY); // load the 3 axis and there colors: float vertices[] = { 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1 }; float colors[] = { 0, 0, 0, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 1, 1 }; byte indices[] = { 0, 1, 0, 2, 0, 3 }; ByteBuffer vbb; vbb = ByteBuffer.allocateDirect(vertices.length * 4); vbb.order(ByteOrder.nativeOrder()); vertexBuffer = 
vbb.asFloatBuffer(); vertexBuffer.put(vertices); vertexBuffer.position(0); vbb = ByteBuffer.allocateDirect(colors.length * 4); vbb.order(ByteOrder.nativeOrder()); colorBuffer = vbb.asFloatBuffer(); colorBuffer.put(colors); colorBuffer.position(0); indexBuffer = ByteBuffer.allocateDirect(indices.length); indexBuffer.put(indices); indexBuffer.position(0); } public void onAccuracyChanged(Sensor sensor, int accuracy) { } public void onSensorChanged(SensorEvent event) { // load the new values: loadNewSensorData(event); if (MODUS == 1) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); } if (MODUS == 2) { rootMeanSquareBuffer(bufferedAccelGData, accelGData); rootMeanSquareBuffer(bufferedMagnetData, magnetData); SensorManager.getRotationMatrix(rotationMatrix, null, bufferedAccelGData, bufferedMagnetData); } if (MODUS == 3) { rootMeanSquareBuffer(bufferedMagnetData, magnetData); SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, bufferedMagnetData); } if (MODUS == 4) { rootMeanSquareBuffer(bufferedAccelGData, accelGData); SensorManager.getRotationMatrix(rotationMatrix, null, bufferedAccelGData, magnetData); } if (MODUS == 5) { // this mode uses the sensor data recieved from the orientation // sensor resultingAngles = orientationData.clone(); if ((-90 > resultingAngles[1]) || (resultingAngles[1] > 90)) { resultingAngles[1] = orientationData[0]; resultingAngles[2] = orientationData[1]; resultingAngles[0] = orientationData[2]; } } if (MODUS == 6) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); final float[] anglesInRadians = new float[3]; SensorManager.getOrientation(rotationMatrix, anglesInRadians); if ((-90 < anglesInRadians[2] * rad2deg) && (anglesInRadians[2] * rad2deg < 90)) { // device camera is looking on the floor // this hemisphere is working fine mirrorOnBlueAxis = false; resultingAngles[0] = anglesInRadians[0] * rad2deg; resultingAngles[1] = anglesInRadians[1] * rad2deg; resultingAngles[2] = anglesInRadians[2] * -rad2deg; } else { mirrorOnBlueAxis = true; // device camera is looking in the sky // this hemisphere is mirrored at the blue axis resultingAngles[0] = (anglesInRadians[0] * rad2deg); resultingAngles[1] = (anglesInRadians[1] * rad2deg); resultingAngles[2] = (anglesInRadians[2] * rad2deg); } } if (MODUS == 7) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); rotationMatrix = transpose(rotationMatrix); /* * this assumes that the rotation matrices are multiplied in x y z * order Rx*Ry*Rz */ resultingAngles[2] = (float) (Math.asin(rotationMatrix[2])); final float cosB = (float) Math.cos(resultingAngles[2]); resultingAngles[2] = resultingAngles[2] * rad2deg; resultingAngles[0] = -(float) (Math.acos(rotationMatrix[0] / cosB)) * rad2deg; resultingAngles[1] = (float) (Math.acos(rotationMatrix[10] / cosB)) * rad2deg; } if (MODUS == 8) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); rotationMatrix = transpose(rotationMatrix); /* * this assumes that the rotation matrices are multiplied in z y x */ resultingAngles[2] = (float) (Math.asin(-rotationMatrix[8])); final float cosB = (float) Math.cos(resultingAngles[2]); resultingAngles[2] = resultingAngles[2] * rad2deg; resultingAngles[1] = (float) (Math.acos(rotationMatrix[9] / cosB)) * rad2deg; resultingAngles[0] = (float) (Math.asin(rotationMatrix[4] / cosB)) * rad2deg; } if (MODUS == 9) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); rotationMatrix = 
transpose(rotationMatrix); /* * this assumes that the rotation matrices are multiplied in z x y * * note z axis looks good at this one */ resultingAngles[1] = (float) (Math.asin(rotationMatrix[9])); final float minusCosA = -(float) Math.cos(resultingAngles[1]); resultingAngles[1] = resultingAngles[1] * rad2deg; resultingAngles[2] = (float) (Math.asin(rotationMatrix[8] / minusCosA)) * rad2deg; resultingAngles[0] = (float) (Math.asin(rotationMatrix[1] / minusCosA)) * rad2deg; } if (MODUS == 10) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); rotationMatrix = transpose(rotationMatrix); /* * this assumes that the rotation matrices are multiplied in y x z */ resultingAngles[1] = (float) (Math.asin(-rotationMatrix[6])); final float cosA = (float) Math.cos(resultingAngles[1]); resultingAngles[1] = resultingAngles[1] * rad2deg; resultingAngles[2] = (float) (Math.asin(rotationMatrix[2] / cosA)) * rad2deg; resultingAngles[0] = (float) (Math.acos(rotationMatrix[5] / cosA)) * rad2deg; } if (MODUS == 11) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); rotationMatrix = transpose(rotationMatrix); /* * this assumes that the rotation matrices are multiplied in y z x */ resultingAngles[0] = (float) (Math.asin(rotationMatrix[4])); final float cosC = (float) Math.cos(resultingAngles[0]); resultingAngles[0] = resultingAngles[0] * rad2deg; resultingAngles[2] = (float) (Math.acos(rotationMatrix[0] / cosC)) * rad2deg; resultingAngles[1] = (float) (Math.acos(rotationMatrix[5] / cosC)) * rad2deg; } if (MODUS == 12) { SensorManager.getRotationMatrix(rotationMatrix, null, accelGData, magnetData); rotationMatrix = transpose(rotationMatrix); /* * this assumes that the rotation matrices are multiplied in x z y */ resultingAngles[0] = (float) (Math.asin(-rotationMatrix[1])); final float cosC = (float) Math.cos(resultingAngles[0]); resultingAngles[0] = resultingAngles[0] * rad2deg; resultingAngles[2] = (float) (Math.acos(rotationMatrix[0] / cosC)) * rad2deg; resultingAngles[1] = (float) (Math.acos(rotationMatrix[5] / cosC)) * rad2deg; } logOutput(); } /** * transposes the matrix because it was transposted (inverted, but here its * the same, because its a rotation matrix) to be used for opengl * * @param source * @return */ private float[] transpose(float[] source) { final float[] result = source.clone(); if (TRY_TRANSPOSED_VERSION) { result[1] = source[4]; result[2] = source[8]; result[4] = source[1]; result[6] = source[9]; result[8] = source[2]; result[9] = source[6]; } // the other values in the matrix are not relevant for rotations return result; } private void rootMeanSquareBuffer(float[] target, float[] values) { final float amplification = 200.0f; float buffer = 20.0f; target[0] += amplification; target[1] += amplification; target[2] += amplification; values[0] += amplification; values[1] += amplification; values[2] += amplification; target[0] = (float) (Math .sqrt((target[0] * target[0] * buffer + values[0] * values[0]) / (1 + buffer))); target[1] = (float) (Math .sqrt((target[1] * target[1] * buffer + values[1] * values[1]) / (1 + buffer))); target[2] = (float) (Math .sqrt((target[2] * target[2] * buffer + values[2] * values[2]) / (1 + buffer))); target[0] -= amplification; target[1] -= amplification; target[2] -= amplification; values[0] -= amplification; values[1] -= amplification; values[2] -= amplification; } private void loadNewSensorData(SensorEvent event) { final int type = event.sensor.getType(); if (type == Sensor.TYPE_ACCELEROMETER) { 
accelGData = event.values.clone(); } if (type == Sensor.TYPE_MAGNETIC_FIELD) { magnetData = event.values.clone(); } if (type == Sensor.TYPE_ORIENTATION) { orientationData = event.values.clone(); } } private void logOutput() { if (mCount++ > 30) { mCount = 0; Log.d("Compass", "yaw0: " + (int) (resultingAngles[0]) + " pitch1: " + (int) (resultingAngles[1]) + " roll2: " + (int) (resultingAngles[2])); } } }

    Read the article

  • Texture2D problem

    - by Anders Karlsson
    I have a problem that is driving me crazy: I want to write a number of text labels on the screen using Texture2D, but I only seem to be able to write the first one. If I write one of the labels individually it works, but if I write all of them, only the first label is displayed. Let me show some code: -(void)drawText:(NSString*)theString AtX:(float)X Y:(float)Y withFont:(UIFont*)aFont { // set color glColor4f(1, 0, 0, 1.0); // Enable modes needed for drawing glEnableClientState(GL_TEXTURE_COORD_ARRAY); Texture2D* textTexture = [[Texture2D alloc] initWithString:theString dimensions:viewSize // 320x480 alignment:UITextAlignmentLeft font:aFont]; glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); [textTexture drawInRect:CGRectMake(X,Y,1,1)]; glDisableClientState(GL_TEXTURE_COORD_ARRAY); [textTexture release]; } When I call this drawText once it seems to display the text properly, but if I call it a second time nothing seems to be displayed. Does anybody have an idea what it could be? States like GL_BLEND and GL_TEXTURE_2D have been enabled in the view setup function. In the Texture2D the dimensions are 512x512, as I pass the whole screen to the function. If I don't pass that, the text gets enlarged and fuzzy. I am a bit uncertain about that parameter. TIA for any help.

    Read the article

  • Why does calling glMatrixMode(GL_PROJECTION) give me EXC_BAD_ACCESS in an iPhone app?

    - by MrDatabase
    I have an iPhone app where I call these three functions in appDidFinishLaunching: glMatrixMode(GL_PROJECTION); glOrthof(0, rect.size.width, 0, rect.size.height, -1, 1); glMatrixMode(GL_MODELVIEW); When stepping through with the debugger I get EXC_BAD_ACCESS when I execute the first line. Any ideas why this is happening? By the way, I have another application where I do the same thing and it works fine, so I've tried to duplicate everything from that app (#imports, adding the OpenGLES framework, etc.) but now I'm just stuck.

    Read the article

  • iPad GLSL. From within a fragment shader how do I get the surface - not vertex - normal

    - by dugla
    Is it possible to access the surface normal - the normal associated with the plane of a fragment - from within a fragment shader? Or perhaps this can be done in the vertex shader? Is all knowledge of the associated geometry lost when we go down the shader pipeline, or is there some clever way of recovering that information in either the vertex or fragment shader? Thanks in advance. Cheers, Doug twitter: @dugla

    Read the article

< Previous Page | 18 19 20 21 22 23 24 25 26 27 28 29  | Next Page >