Search Results

Search found 13860 results on 555 pages for 'core graphics'.

Page 71/555

  • Creating a WARP device in managed DirectX

    - by arex
    I have a very old graphics card that only supports shader model 2, but I need shader model 3 or higher for the app I am developing. I tried to use a reference device, but it runs very slowly; then I found some C++ samples that let me switch to a WARP device, and the performance is good. I am using C# and I don't know how to create that type of device. So the question is: how do I create a WARP device in C#? Thanks in advance.
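
    For reference, the native pattern those C++ samples follow is passing the WARP driver type to a device-creation call such as D3D11CreateDevice; managed wrappers such as SlimDX and SharpDX expose an equivalent driver-type enum. A minimal C++ sketch of that pattern (the managed call mirrors these arguments):

        #include <d3d11.h>

        // Create a Direct3D 11 device on the WARP software rasterizer,
        // which runs shader model 3+ far faster than the reference device.
        bool createWarpDevice(ID3D11Device** device, ID3D11DeviceContext** context)
        {
            HRESULT hr = D3D11CreateDevice(
                nullptr,               // default adapter
                D3D_DRIVER_TYPE_WARP,  // WARP instead of D3D_DRIVER_TYPE_HARDWARE
                nullptr, 0,            // no software DLL, no creation flags
                nullptr, 0,            // default feature levels
                D3D11_SDK_VERSION,
                device, nullptr, context);
            return SUCCEEDED(hr);
        }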

  • How can I efficiently create/store/implement animations as I add to my game?

    - by nickbadal
    My game's characters are made up of different parts (head/body/legs/etc.) plus whatever items they have equipped. As I create the animation system for my game, I want to anticipate a large number of combinations of pieces for each character. Originally I had planned on a frame-by-frame animation for each piece in each animation, and then layering the pieces to combine them into a character, but this seems like it would be a lot of work for my artist, and the memory/disk size would add up as well, since we would need a sprite for every frame of every customization of every piece, in every animation, for every character. What efficient ways are there to create and implement these animations as we add more and more configurations to our game?
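
    For what it's worth, the layered approach keeps the asset count additive (one animation strip per piece) rather than multiplicative (one strip per combination), and the runtime side of it is small; a rough sketch, with every name here hypothetical rather than taken from the post:

        #include <vector>

        // One animation strip per part; a character is just an ordered stack
        // of parts drawn back-to-front on a shared frame clock.
        struct PartAnimation {
            int spriteSheetId;  // which sheet holds this part's frames
            int frameCount;
        };

        struct Character {
            std::vector<PartAnimation> layers;  // legs, body, head, equipment...

            void draw(float timeSec, float fps, int x, int y) {
                for (const PartAnimation& p : layers) {
                    int frame = (int)(timeSec * fps) % p.frameCount;
                    drawSprite(p.spriteSheetId, frame, x, y);
                }
            }

            // Stand-in for whatever blitting call the engine actually has.
            static void drawSprite(int sheet, int frame, int x, int y);
        };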

  • Splashing Liquid Using Cocos2d

    - by Maverick
    I am new to game development on iOS. My problem is that I want to show a water-splash effect on the screen, as if someone had just thrown water from a corner of the screen. I would be grateful to know of any tutorials or libraries that will help me achieve this effect. Thanks in advance. Edit: Sample Liquid Simulation. I wanted to simulate behavior like this IMAGE, like a liquid being poured into a glass. I hope that clears up what I asked. Thanks for your precious time.
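
    Splash effects like this are usually faked with particles rather than a true fluid solver: spawn droplets with an initial velocity, integrate gravity, and fade them out (poured-liquid looks often add a blur/threshold "metaball" render pass on top). A minimal language-agnostic sketch in C++, with all constants assumed:

        #include <algorithm>
        #include <cstdlib>
        #include <vector>

        struct Droplet {
            float x, y;    // position in pixels
            float vx, vy;  // velocity in pixels/second
            float life;    // seconds remaining
        };

        std::vector<Droplet> drops;

        // Spawn a burst of droplets at (x, y) fanning around (dirX, dirY).
        void splash(float x, float y, float dirX, float dirY, int count) {
            for (int i = 0; i < count; ++i) {
                float jx = (std::rand() / (float)RAND_MAX - 0.5f) * 200.0f;
                float jy = (std::rand() / (float)RAND_MAX - 0.5f) * 200.0f;
                drops.push_back({x, y, dirX + jx, dirY + jy, 1.0f});
            }
        }

        // Per frame: apply gravity, move, and retire expired droplets.
        void update(float dt) {
            for (Droplet& d : drops) {
                d.vy += 980.0f * dt;  // gravity (screen y grows downward here)
                d.x  += d.vx * dt;
                d.y  += d.vy * dt;
                d.life -= dt;
            }
            drops.erase(std::remove_if(drops.begin(), drops.end(),
                        [](const Droplet& d) { return d.life <= 0.0f; }),
                        drops.end());
        }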

  • How do you blend multiple colors in HSV (polar) color-space?

    - by Toxikman
    In RGB color space, you can do a weighted multiple-color blend by just starting with R = G = B = 0 and then, at each index i, using a set of colors C and a set of normalized weights w, doing:

        R += w[i] * C[i].r
        G += w[i] * C[i].g
        B += w[i] * C[i].b

    But I'd like to interpolate the colors in the HSV color space instead, so that saturation and brightness are uniform across the interpolation. I know I can blend saturation and brightness the same way as above, but the hue component is an angle around a continuous circle, since HSV is essentially a polar coordinate system. Blending only two HSV colors makes sense to me: you just find the shortest arc around the circle and interpolate between the two hues. But when you attempt to blend more than two colors, it becomes a bit of a puzzle. You have to handle anomalous cases, like four equally weighted colors with hues at 0, 90, 180, and 270 degrees; they basically cancel each other out, so any hue will do. Any ideas would be greatly appreciated.
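
    The usual answer to the multi-hue case is the circular (vector) mean: treat each hue as a unit vector on the color wheel, take the weighted vector sum, and convert back with atan2. A near-zero sum is exactly the degenerate cancel-out case described above. A minimal sketch, assuming normalized weights and hues in degrees:

        #include <cmath>
        #include <vector>

        // Weighted circular mean of hues (degrees), weights assumed normalized.
        double blendHues(const std::vector<double>& hues,
                         const std::vector<double>& weights)
        {
            const double PI = 3.14159265358979323846;
            double x = 0.0, y = 0.0;
            for (std::size_t i = 0; i < hues.size(); ++i) {
                double rad = hues[i] * PI / 180.0;
                x += weights[i] * std::cos(rad);
                y += weights[i] * std::sin(rad);
            }
            // Near-zero sum: the hues cancel (e.g. equal weights at
            // 0/90/180/270 degrees), so any hue is as good as another.
            if (x * x + y * y < 1e-12)
                return 0.0;
            double deg = std::atan2(y, x) * 180.0 / PI;
            return deg < 0.0 ? deg + 360.0 : deg;
        }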

  • How to disable VGA

    - by Bitmap
    If I run lspci | grep VGA I get the output below, which tells me these VGA cards are present in my computer:

        01:00.0 VGA compatible controller: nVidia Corporation GT216 [GeForce GT 220] (rev a2)
        08:02.0 VGA compatible controller: ATI Technologies Inc ES1000 (rev 02)

    The ES1000 is an onboard card that came with my machine. Does anyone know how to disable this VGA adapter on my machine? The reason for this request is that when I run xrandr I get the output shown below:

        xrandr: Failed to get size of gamma for output default
        Screen 0: minimum 320 x 240, current 1024 x 768, maximum 1024 x 768
        default connected 1024x768+0+0 0mm x 0mm
           1024x768       50.0*
           800x600        51.0   52.0   53.0
           680x384        54.0   55.0
           640x480        56.0
           512x384        57.0
           400x300        58.0
           320x240        59.0

    This means I am not able to configure the nVidia card to accept a smaller resolution. Thank you.

  • Running Crysis 2 on Ubuntu gives me an error message

    - by ShajD
    I recently installed Ubuntu alongside Windows 7 and installed Crysis 2 with Wine. Crysis 2 works fine when I run Windows; however, when I run it using Wine in Ubuntu, CryEngine gives me a message saying, "Unsupported video card detected! Continuing to run might lead to unexpected results or crashes....." I've got two video cards: one is an Intel and the other is an nVidia. I typed lspci into the terminal and my nVidia card was listed under video controller as well.

  • W520 External monitor setup with Ubuntu 12.10

    - by user108372
    I just installed a fresh Ubuntu 12.10 64-bit desktop on my Lenovo W520. It looks like there are a lot of challenges around making it work with the out-of-the-box Nouveau drivers, the proprietary Nvidia drivers, or the Intel GPU. I looked at a couple of notes on how to make it work with Bumblebee and Optimus Nvidia; none of them seems to work for 12.10. Does anybody have a solid answer on this? It seems like a lot of people are suffering from it. Here is my xrandr output; let me know if you need any additional information.

        Screen 0: minimum 320 x 200, current 1920 x 1080, maximum 8192 x 8192
        LVDS1 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 344mm x 193mm
           1920x1080      60.0*+   59.9   50.0
           1680x1050      60.0     59.9
           1600x1024      60.2
           1400x1050      60.0
           1280x1024      60.0
           1440x900       59.9
           1280x960       60.0
           1360x768       59.8     60.0
           1152x864       60.0
           1024x768       60.0
           800x600        60.3     56.2
           640x480        59.9
        VGA1 disconnected (normal left inverted right x axis y axis)

    Thanks, Sef

  • Intel graphics card problem

    - by user10406
    Hello, I am facing a problem installing the right driver for my computer, an HP dv4-2154ca. The problem is that I have tried to install it a couple of times, but the xorg.conf file is still empty no matter what. So my question is: why is this file empty, and how can I generate it correctly for my device? The problem I think this is causing is that when I maximize Flash to full screen it lags: the video goes slower while the sound stays smooth. Thanks in advance.

  • Provide A Scrolling "Camera" View Over A 2D Game Map

    - by BitCrash
    I'm in the process of attempting to create a 2D MMO-type game with Kryonet and some basic sprites, mostly for my own learning. I have the back end set up great (by my standards) and I'm moving on to actually getting some things drawn onto the map. I cannot for the life of me figure out a solid way to have a "camera" follow a player around a large area. The view pane for the game is 640x480 pixels, and each tile is 32x32 pixels (that's 20 tiles wide and 15 high for the view pane). I have tried a couple of things, but they did not work out well. I had a JScrollPane with 9 view-pane-sized canvases in it, and tried to have the JScrollPane move in accordance with the player. The issue came when I reached the end of the JScrollPane: I tried to "flip" canvases, sending the canvas currently drawing the player to the middle of the 9 and loading the corresponding maps onto the other ones. It was slow and worked poorly. I'm looking for any advice or previous experience with this; any ideas? Thank you! Edit and clarification: I did not mean to dwell on Kryonet; I was merely providing peripheral information in case there was something relevant that I could not foresee. Instead of having an array of 9 canvases, why not just have one large canvas loading a large map every once in a while? I'm willing to have "load times", whereas with the canvas array I would (in theory) have none, to give the user a smooth experience. I could just change the size and location of the map with a modified setBounds() call on the canvas in a layered pane (layered because I have hidden Swing items, like inventories and such). I'll try it out and post here how it goes for people asking the same question.
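
    A common alternative to scrolling panes is a single camera offset that centers on the player, is clamped to the map bounds, and drives which tiles get drawn each frame; a minimal sketch, with the player and map dimensions assumed (none of these names come from the post):

        // Camera that centers on the player but never shows past the map
        // edge. Assumes the map is at least as large as the view.
        const int VIEW_W = 640, VIEW_H = 480;  // view pane size in pixels
        const int TILE   = 32;                 // tile size in pixels

        struct Camera {
            int x = 0, y = 0;  // top-left corner of the view, in world pixels

            void follow(int playerX, int playerY, int mapW, int mapH) {
                x = playerX - VIEW_W / 2;
                y = playerY - VIEW_H / 2;
                // Clamp so the view never leaves the map.
                if (x < 0) x = 0;
                if (y < 0) y = 0;
                if (x > mapW * TILE - VIEW_W) x = mapW * TILE - VIEW_W;
                if (y > mapH * TILE - VIEW_H) y = mapH * TILE - VIEW_H;
            }
        };

        // When drawing, iterate only over the tiles the camera can see:
        //   firstCol = cam.x / TILE;  lastCol = (cam.x + VIEW_W) / TILE;
        // and draw each tile at (col * TILE - cam.x, row * TILE - cam.y).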

  • Fog with Blend in OpenGL

    - by MhdAljobory
    I want to add fog to my scene, which contains transparent textures drawn with blending. When I enable the fog, the transparent textures appear white from a distance, but when I disable it the textures look fine. What is the solution to the whiteness problem? Fog code:

        GLfloat fogColor[4] = {0.5f, 0.5f, 0.5f, 1.0f};

        glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
        glFogi(GL_FOG_MODE, GL_LINEAR);
        glFogfv(GL_FOG_COLOR, fogColor);
        glFogf(GL_FOG_DENSITY, 0.35f);   // note: density is ignored in GL_LINEAR mode
        glHint(GL_FOG_HINT, GL_DONT_CARE);
        glFogf(GL_FOG_START, 1.0f);
        glFogf(GL_FOG_END, 1000.0f);
        glEnable(GL_FOG);

    Screenshot

  • How do I add a start menu page to my Java game?

    - by user2149407
    I have a rather cool Space Invaders game that my friend and I have been working on for a while, and we have decided it needs an opening page, with "Start" options, "Quit" options, and so forth. I have looked at several methods online, but can't seem to get any of them to work! Does anybody have any ideas? P.S. I'm using JFrame to draw the main frame. I'm just looking to do this within Java, so just a panel that appears at a state change (GAME, MENU). I'd like it to contain a few buttons to start the game and quit. Later I will add achievements, but I'm after something really basic for now. Thanks for the suggestions!
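
    Underneath any toolkit this is a small state machine: the loop draws and dispatches input according to the current state, and the Start button just flips the state. A hypothetical sketch of that structure in C++ (in Swing the same idea is usually two JPanels swapped with a CardLayout):

        enum class GameState { MENU, GAME };

        GameState state = GameState::MENU;

        void drawMenu();  // title text plus Start/Quit buttons
        void drawGame();  // the invaders themselves

        void renderFrame() {
            switch (state) {
                case GameState::MENU: drawMenu(); break;
                case GameState::GAME: drawGame(); break;
            }
        }

        // Button handlers only flip the state; the render loop does the rest.
        void onStartClicked() { state = GameState::GAME; }
        void onQuitClicked()  { /* close the window / leave the loop */ }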

  • How do I install the NVIDIA driver for a GeForce 6200?

    - by 2d4skt
    I have recently been trying to find a solution for this on the web, but did not find anything useful or accurate for Ubuntu 11.10. I also consulted the NVIDIA help, but things there did not work for me. I installed the additional drivers from System Settings, but they are not fully compatible with my GeForce 6200. First I tried to find out how to stop the X server. I succeeded, but another problem was the nouveau kernel module. This is really frustrating. Can anybody tell me an accurate and reliable way to install the NVIDIA drivers?

  • Temperature on Ubuntu 11.10 is too hot

    - by stacheldraht
    I wonder why my laptop's temperature increases when I am using Ubuntu. It's around 62 degrees or less. I am already using Jupiter and have set it to power-saving mode, but if I use maximum performance the temperature can reach 70 degrees or more. That's too hot compared with Windows, where the temperature is quite normal at 56 degrees. How can I solve this? Sorry for my bad English.

  • What's wrong with this answer? [migrated]

    - by MikeLJ
    I wrote an answer to this question, but I can't post it even though it's not opinion-based: which tile size to choice for 16-bits. What's wrong with my answer? The answer: I'll use these classic 16-bit consoles as reference: http://en.wikipedia.org/wiki/History_of_video_game_consoles_(fourth_generation)

        Super Nintendo:
          Max resolution: 512x478
          Sprites on screen: 128
          Max sprite size: 8×8

        TurboGrafx-16:
          Max resolution: 565x242
          "Normal" resolution: 256×239
          Sprites on screen: 64
          Max sprite size: 32×64

        Neo Geo:
          Display resolution: 320×224
          "Normal" resolution: 304x224
          Sprites on screen: 380
          Max sprite size: 16x512

  • Rendering trillions of "atoms" instead of polygons?

    - by Baring
    I just saw a video about what the publishers call "the next major step after the invention of 3D". According to the person speaking in it, they use a huge number of atoms grouped into clouds instead of polygons, to reach a level of unlimited detail. They tried their best to make the video understandable to people with no knowledge of rendering techniques, and therefore (or for other reasons) left out all details of how their engine works. The level of detail in their video does look quite impressive to me. How is it possible to render scenes using custom atoms instead of polygons on current hardware, speed- and memory-wise? If this is real, why has nobody else even thought of it so far? As an OpenGL developer, I'm really baffled by this and would like to hear what experts have to say. I also don't want this to look like a cheap advert, so I will include the link to the video only if requested, in the comments section.

  • Is this too much to ask for a game programming and development enthusiast? Am I doing this wrong?

    - by I_Question_Things_Deeply
    I have been a computer fanatic for almost a decade now. I've always loved and wondered how computers work, from the purest, lowest hardware level to the very smallest pixel on the screen, and all the software around that. That seems to be my problem, though: as I try to write code (I'm pretty fluent in C++), I always sit for enormous amounts of time in front of a text editor wondering how every line, statement, datum, function, etc. will correspond to every Assembly and machine instruction performed to do absolutely everything necessary for the kernel to allocate memory to run my compiled program, along with all the other hardware being used.

    For example, I would write

        cout << "Before memory changed" << endl;

    and run the debugger to get the Assembly for this, then try to map the Assembly back to machine code based on my ISA, and then research every .dll, library file, linked library, the linking process, the linker's handling of my program, the makefile, the steps my kernel takes to process this compilation, and the hardware beyond the processor (e.g. video card, sound card, chipset, cache latency, byte-sized registers, calling conventions, DDR3 RAM and the disk drive, how the filesystem works, and so many other things).

    Am I going about programming wrong? I feel I should know everything that goes on underneath the English syntax of a computer program. But the problem is that the more I research every little thing, the less I actually accomplish. I can never finish anything because of this mentality, yet I feel compelled to know everything. What should I do?

  • Boot screen appears to be asking a question, but it is garbled

    - by mark kaylor
    I'm running 12.04 Precise Pangolin, kernel 3.2.0-32, with GNOME 3.4.2. I perused the prior questions and answers and did not find exactly the same problem. I am concerned that AUTOFSCK, GRUB, or some other critical event needs some attention. Any idea how I can get my video clean during boot? Once I get past the boot screen, the video driver/card is performing beautifully! Here is a photo of the boot screen. nVidia GeForce card information (lspci -vvv):

        01:00.0 VGA compatible controller: NVIDIA Corporation G72 [GeForce 7300 LE] (rev a1) (prog-if 00 [VGA controller])
            Subsystem: Gateway 2000 Device 3a07
            Control: I/O+ Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx-
            Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast TAbort- SERR- [disabled]
            Capabilities:
            Kernel driver in use: nvidia
            Kernel modules: nvidia_173, nouveau, nvidiafb

    Thanks for your help/advice.

  • Which optional features would you recommend for a raytracer? [closed]

    - by locks
    I'm developing a basic triangle-mesh raytracer on a short deadline. This means I can't implement every feature I come across, so I'm looking for feedback about which features you think are most important, taking into consideration each feature's performance cost and how much punch it packs. I'm especially looking for optimization techniques that allow for faster rendering, and simple techniques that make a big impact on the final scene quality. Is there any chance of making it fast enough to run in real time? Here are some examples of features I've read about:

        Anti-aliasing
        Bounding box
        Sky box
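
    Of the features listed, bounding volumes usually pay off the most for triangle meshes, since they let a ray skip whole groups of triangles. A minimal sketch of the standard slab test for an axis-aligned bounding box, assuming the inverse ray direction is precomputed once per ray (all names here are illustrative):

        #include <algorithm>

        struct AABB { float min[3], max[3]; };

        // Slab test: true if the ray origin + t*dir (t >= 0) hits the box.
        // invDir holds 1/dir per axis, precomputed once per ray.
        bool hitAABB(const AABB& b, const float orig[3], const float invDir[3])
        {
            float tmin = 0.0f, tmax = 1e30f;
            for (int a = 0; a < 3; ++a) {
                float t0 = (b.min[a] - orig[a]) * invDir[a];
                float t1 = (b.max[a] - orig[a]) * invDir[a];
                if (t0 > t1) std::swap(t0, t1);
                tmin = std::max(tmin, t0);
                tmax = std::min(tmax, t1);
                if (tmax < tmin) return false;  // slabs don't overlap: miss
            }
            return true;
        }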

  • Garbled screen after sleep/suspend with nVidia 8800M GTS Ubuntu 11.10

    - by user34062
    I just did a clean install of 11.10 and have been using it (and enjoying it!) for about a day now. But it seems that every time I resume from sleep, or from a blank screen after idling for a while, my desktop, as well as any programs on the screen, gets "garbled". I had a few windows open and all displayed the same GUI glitching except for Chromium (I am guessing because it was minimized to the Unity launcher, and not on the screen when I woke the PC up). Does anyone know what might be causing this, and how I might be able to fix it? I am using a Gateway P-6860FX with a C2D 1.8GHz and an nVidia 8800M GTS 512MB. Screenshot: http://dl.dropbox.com/u/28188839/Screenshot%20at%202011-11-15%2018%3A53%3A25.png NOTE: Yes, I know I misspelled Lothlorien.... .<

  • For photography use, is Unity overheating my laptop? Should I try openSUSE instead?

    - by SoT
    I am a perfect noob here in the Linux world; previously I was using Windows 7. Mine is an HP laptop: Intel Core 2 Duo T5470 @ 1.60GHz × 2, 965GM graphics, with 2GB RAM. I installed Ubuntu 12.04 LTS and quite like its display; I recognized it was 3D even before knowing it was the Unity 3D interface. My uses are image editing, home use, downloads, browsing, etc. No video editing or gaming at all. Being a photography enthusiast, I use image-editing programs fairly heavily. But I now feel my laptop is getting a bit overheated, in the processor and the hard disk. I tried lm-sensors and could not make much of it. I installed Xsensors 0.7, which gives the same output lm-sensors gave me: temperatures for four things (temp1, temp2, temp3, and temp4), for "acpitz". Please guide me on this. However, I also wanted to ask something more: which one is better for working with images (photography, I mean), openSUSE 12.1 or Ubuntu with Unity 3D? Can I get the same display quality with the openSUSE distribution? I heard openSUSE uses power more efficiently on laptops; is there any truth to that? Please suggest whether I should try openSUSE or not, and if so, with which GUI: KDE or GNOME? Thanks in advance. Regards, SoT

  • Defcon totally screwed up my system!

    - by Draeton
    I'm using Ubuntu 12.04. I downloaded and ran a game called Defcon, by Introversion Software, and then my display problems began! At first, my smallest monitor stopped receiving input. Then, when I exited the game, there was still no input; I looked at the 'Display' System Settings menu and the monitor was off. I tried switching it back on, and my system totally crashed. I restarted my system and tried twice more to switch it back on, before I went back to basics and switched mirroring on before switching that monitor on. Now I've switched mirroring off, but if I set the resolution for my largest monitor above that of my other monitor, my system crashes! What the heck! Does anyone have a solution to my problems?

  • How do I generate a random curve for landscape (like Worms)? [closed]

    - by Stas
    Possible Duplicate: How do I generate terrain like that of Scorched Earth? How can I generate Worms-style terrain? I must build a random curved line for a 2D game on a bitmap (like in Worms, viewed from the side). My teacher said that I should do it using terrain generation through recursion (I work in Delphi 7). I understand the main principle, but I don't know how to express it as code. All measurements are according to the screen resolution.
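
    The recursive technique the teacher is most likely referring to is 1D midpoint displacement: raise or lower the midpoint of each segment by a random amount that shrinks with depth. A minimal sketch in C++ (the Delphi version is a direct translation; the roughness constant is an assumption):

        #include <cstdlib>
        #include <vector>

        // Recursively displace the midpoint of [left, right]; heights spans
        // the screen width, roughness in (0, 1) controls how fast the
        // displacement shrinks (smaller = smoother terrain).
        void midpoint(std::vector<float>& h, int left, int right,
                      float displace, float roughness)
        {
            if (right - left < 2) return;
            int mid = (left + right) / 2;
            float jitter = ((std::rand() / (float)RAND_MAX) * 2.0f - 1.0f) * displace;
            h[mid] = (h[left] + h[right]) / 2.0f + jitter;
            midpoint(h, left, mid,  displace * roughness, roughness);
            midpoint(h, mid,  right, displace * roughness, roughness);
        }

        // Usage: allocate a width-sized array, seed h[0] and h[width-1],
        // call midpoint(h, 0, width-1, someDisplacement, 0.6f), then fill
        // each bitmap column from h[x] down to the bottom for the ground.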

  • How can I get the camera to follow a moving object from behind in C++ and OpenGL? [closed]

    - by user1324894
    I am trying to get the camera to follow an object that moves around my environment, using the gluLookAt function. This is my code for moving the object in the direction that it faces:

        Xtri += -Vtri * cos((90 + heading) * (PI / 180.0f));
        Ztri +=  Vtri * sin((90 + heading) * (PI / 180.0f));

    I then render the object:

        glPushMatrix();
        glTranslatef(Xtri, 0, Ztri);
        glRotatef(heading, 0, 1, 0);
        drawTriangle();
        glPopMatrix();

    heading is just a spin variable, so that if I press left or right the object spins in that direction. When I press up on the arrows it moves forward, and when I press down it moves backwards, in the direction that it is facing. To try to get the camera to follow, I am using the gluLookAt function like this:

        gluLookAt(Xtri, 0, (Ztri + 20),
                  Xtri, 0, Ztri,
                  0, 1, 0);

    so that it follows the object from a distance. However, the object doesn't appear to move at all now: it can still rotate, but it doesn't move forwards or backwards, and when it spins the camera doesn't follow the spin; it just watches it turn from a fixed position. Where am I going wrong? UPDATE: I have updated the gluLookAt function so now it is:

        gluLookAt((Xtri + Vtri), 0, (Ztri + 20),
                  (Xtri + Vtri), 0, Ztri,
                  0, 1, 0);

    This seems to move the object around. I have a stationary terrain, so I can see that the object is now moving, and in the direction that it is facing. However, I want the camera to follow the object when it spins as well, so it is always viewing the object from behind.
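
    To stay behind the object, the eye position has to rotate with the heading rather than sit at a fixed +Z offset. A minimal sketch that reuses the post's variables, with the follow distance and eye height as assumed constants:

        // Forward motion above uses (-cos a, +sin a) with a = (90 + heading)
        // degrees, so "behind" is the opposite direction: (+cos a, -sin a).
        const float dist   = 20.0f;  // assumed follow distance
        const float height = 5.0f;   // assumed eye height

        float a = (90.0f + heading) * (PI / 180.0f);
        float eyeX = Xtri + dist * cos(a);
        float eyeZ = Ztri - dist * sin(a);

        gluLookAt(eyeX, height, eyeZ,   // eye: behind and slightly above
                  Xtri, 0.0f, Ztri,     // target: the object itself
                  0.0f, 1.0f, 0.0f);    // world up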

  • Ubuntu 12.04 ATI 6450

    - by user210717
    Right now my video card isn't responding after the Ubuntu logo is shown (I see a black screen and that's it). I have installed Ubuntu 12.04 AMD64, and if I remove the video card and use the VGA output from the motherboard, then I can use the machine with no problems. Other data: 4 GB of RAM at 1333 MHz, a 500 GB WD drive, and an AMD APU A6-3500 at 2.1 GHz. I forgot a couple of details: everything was working great until last night, when there was a power cut (I'm from Argentina and English isn't my first language), and when the power came back I used my PC until I went to bed (after upgrading). This morning when I woke up I had this problem for breakfast. I've been reading a little, and I had a similar problem before and fixed it, but that was a system problem, a missing package or something, I don't remember; here the only issue is that the video card doesn't give me an image after the Ubuntu logo.

  • Vertex Array Object (OpenGL)

    - by user5140
    I've just started out with OpenGL, and I still haven't really understood what Vertex Array Objects are and how they can be employed. If Vertex Buffer Objects are used to store vertex data (such as positions and texture coordinates), and VAOs only contain status flags, where can they be used? What's their purpose? As far as I understood from the (very incomplete and unclear) GL wiki, VAOs are used to set the flags/status for every vertex, following the order described in the Element Array Buffer, but the wiki was really ambiguous about it, and I'm not really sure what VAOs really do and how I could employ them.
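
    For context, a VAO stores attribute configuration rather than per-vertex flags: while it is bound, it records which buffer each attribute reads from and with what layout, so a later draw call needs only a single bind. A minimal sketch, assuming a shader with a position attribute at location 0 and this placeholder triangle data:

        #include <GL/glew.h>  // or any loader providing the GL 3.x entry points

        // Assumed vertex data: one triangle, three floats per vertex.
        const float vertices[] = {
            -0.5f, -0.5f, 0.0f,
             0.5f, -0.5f, 0.0f,
             0.0f,  0.5f, 0.0f,
        };

        GLuint vao = 0, vbo = 0;

        // Done once at load time: the VAO records the attribute setup below.
        void setupVAO() {
            glGenVertexArrays(1, &vao);
            glBindVertexArray(vao);

            glGenBuffers(1, &vbo);
            glBindBuffer(GL_ARRAY_BUFFER, vbo);
            glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

            // Recorded in the VAO: attribute 0 reads 3 floats per vertex from vbo.
            glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), (void*)0);
            glEnableVertexAttribArray(0);

            glBindVertexArray(0);
        }

        // Done every frame: one bind restores all of that state.
        void drawTriangle() {
            glBindVertexArray(vao);
            glDrawArrays(GL_TRIANGLES, 0, 3);
        }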
