Search Results

Search found 2967 results on 119 pages for '3d meshes'.

Page 49/119

  • Ubuntu and bumblebee and intel problem

    - by LnxSlck
    I have a brain teaser that I can't solve. When 12.04 (64-bit) came out, I did a fresh install and then installed Bumblebee with:
        sudo add-apt-repository ppa:bumblebee/stable
        sudo apt-get install bumblebee bumblebee-nvidia
    Ubuntu recognized (System Settings - Details) the Intel card as the primary card, and I could launch applications with optirun. I have:
        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: NVIDIA Corporation GF119 [GeForce GT 520MX] (rev ff)
    Now, a few days back, I did the exact same thing: installed everything from the same DVD, everything the same, and installed Bumblebee the same way. NVIDIA with optirun works fine, but Ubuntu doesn't have 3D effects:
        root@deathstar:~# optirun /usr/lib/nux/unity_support_test -p
        OpenGL vendor string:     NVIDIA Corporation
        OpenGL renderer string:   GeForce GT 520MX/PCIe/SSE2
        OpenGL version string:    4.2.0 NVIDIA 295.40
        Not software rendered:    yes
        Not blacklisted:          yes
        GLX fbconfig:             yes
        GLX texture from pixmap:  no
        GL npot or rect textures: yes
        GL vertex program:        yes
        GL fragment program:      yes
        GL vertex buffer object:  yes
        GL framebuffer object:    yes
        GL version is 1.4+:       yes
        Unity 3D supported:       no
    I never did anything to install Intel drivers (except installing mesa-utils); before, Bumblebee got everything working, but now I can't get Ubuntu to run Unity with 3D. Can someone please help me get Unity 3D working? Your help will be much appreciated.

    Read the article

  • Suitable SDK to develop quick game?

    - by gRnt
    I'm currently undertaking a personal project at home that I need to turn around within the next few months (which, while working full time and still learning programming, makes it a tad difficult). I'm looking for suggestions on SDKs or tools (preferably free, or ones that come with games, similar to Steam tools) that I can use to develop a "game". I'm OK with coding but have no 3D development skills at all. I have very little experience with mod tools or SDKs, but I'm hoping someone can point me in the direction of one that offers the following:
    - A decent library of prefab 3D models to build scenes.
    - The ability to add scripting to the scene.
    I've used Unity before and would prefer to continue to do so; however, I really have the worst 3D skills imaginable and can't waste time learning them. I'd be looking for prefab items that cover both industrial and possibly more lush environments (trees, etc.). If it makes any difference (due to licensing and whatnot), I will not be selling this game or marketing it in any way, and I am a university student if any places do educational licences. Another alternative would be to source free 3D models elsewhere, but again, while I'm still learning, I have no idea where to look; if someone could point me in the right direction, I'll do the rest of the digging. Thanks.

    Read the article

  • Can I set my Optimus Nvidia card to run Unity3D with bumblebee?

    - by manuhalo
    I'd like to know whether I can run Compiz on my Nvidia card to speed things up. It's a Dell XPS15 laptop, but I'm mostly using it as a desktop, so battery life is not important. Apparently my Intel integrated card is able to run Unity 3D, but my Nvidia GT 420M is not. Here's the output of unity_support_test, both with optirun and without it:
        manuhalo@Ubuntu-XPS-L501X:~$ optirun /usr/lib/nux/unity_support_test -p
        OpenGL vendor string:     NVIDIA Corporation
        OpenGL renderer string:   GeForce GT 420M/PCI/SSE2
        OpenGL version string:    4.1.0 NVIDIA 280.13
        Not software rendered:    yes
        Not blacklisted:          yes
        GLX fbconfig:             yes
        GLX texture from pixmap:  no
        GL npot or rect textures: yes
        GL vertex program:        yes
        GL fragment program:      yes
        GL vertex buffer object:  yes
        GL framebuffer object:    yes
        GL version is 1.4+:       yes
        Unity 3D supported:       no

        manuhalo@Ubuntu-XPS-L501X:~$ /usr/lib/nux/unity_support_test -p
        OpenGL vendor string:     Tungsten Graphics, Inc
        OpenGL renderer string:   Mesa DRI Intel(R) Ironlake Mobile
        OpenGL version string:    2.1 Mesa 7.11
        Not software rendered:    yes
        Not blacklisted:          yes
        GLX fbconfig:             yes
        GLX texture from pixmap:  yes
        GL npot or rect textures: yes
        GL vertex program:        yes
        GL fragment program:      yes
        GL vertex buffer object:  yes
        GL framebuffer object:    yes
        GL version is 1.4+:       yes
        Unity 3D supported:       yes
    Any ideas why this is happening? Thanks in advance to anyone able to shed some light on this. What I have tried:
    - Installed the v290 drivers from the x-stable PPA.
    - Tried forcing Unity 3D to work by telling Unity to ignore the unity_support_test results, i.e. gksudo gedit /etc/environment and add UNITY_FORCE_START=1 to the end of the file.

    Read the article

  • Low Graphics and laggy screen after update to 13.10 from 13.04

    - by Wh0RU
    After updating from 13.04 to 13.10, much of the Unity 3D stuff has stopped working. I am using a Samsung Series 3 (NP350V5X) laptop which has switchable Intel and AMD Radeon HD 7670M graphics. xserver-xorg-video-ati does work, but with NO 3D support and very low graphics [I am currently using this]. fglrx and fglrx-updates show a blank screen after login. The Intel graphics installer doesn't work either (dependency error). Output of sudo lshw -c video:
        *-display
             description: VGA compatible controller
             product: Thames [Radeon HD 7500M/7600M Series]
             vendor: Advanced Micro Devices, Inc. [AMD/ATI]
             physical id: 0
             bus info: pci@0000:01:00.0
             version: 00
             width: 64 bits
             clock: 33MHz
             capabilities: pm pciexpress msi vga_controller bus_master cap_list rom
             configuration: driver=radeon latency=0
             resources: irq:45 memory:e0000000-efffffff memory:c0120000-c013ffff ioport:3000(size=256) memory:c0100000-c011ffff
        *-display
             description: VGA compatible controller
             product: 3rd Gen Core processor Graphics Controller
             vendor: Intel Corporation
             physical id: 2
             bus info: pci@0000:00:02.0
             version: 09
             width: 64 bits
             clock: 33MHz
             capabilities: msi pm vga_controller bus_master cap_list rom
             configuration: driver=i915 latency=0
             resources: irq:46 memory:bfc00000-bfffffff memory:d0000000-dfffffff ioport:4000(size=64)
    Similarly, /usr/lib/nux/unity_support_test -p:
        OpenGL vendor string:     VMware, Inc.
        OpenGL renderer string:   Gallium 0.4 on llvmpipe (LLVM 3.3, 256 bits)
        OpenGL version string:    2.1 Mesa 9.2.1
        Not software rendered:    no
        Not blacklisted:          yes
        GLX fbconfig:             yes
        GLX texture from pixmap:  yes
        GL npot or rect textures: yes
        GL vertex program:        yes
        GL fragment program:      yes
        GL vertex buffer object:  yes
        GL framebuffer object:    yes
        GL version is 1.4+:       yes
        Unity 3D supported:       no
    shows no 3D support. Can anyone please guide me on how to make things work again?

    Read the article

  • Blender mesh mirroring screws up normals when importing in Unity

    - by Shivan Dragon
    My issue is as follows: I've modeled a robot in Blender 2.6. It's a mech-like biped or, if you prefer, it kind of looks like a chicken. Since it's symmetrical on the XZ plane, I've decided to mirror some of its parts instead of re-modeling them. The problem is, those mirrored meshes look fine in Blender (faces all show up properly and light falls on them as it should), but in Unity the faces and lighting on those very same mirrored meshes are wrong. What also stumps me is that even if I flip normals in Blender, I still get bad results in Unity for those meshes (though now I get different bad results than before). Here are the details. In the Blender screenshots of the robot (I took two pictures and slightly rotated the camera around so the geometry in question can be clearly seen), the selected cog-wheel-like piece is the mirrored mesh obtained from mirroring the other cog-wheel on the far side of the robot's torso. Back-face culling is turned off here, so it's actually showing the faces as dictated by their normals. As you can see it looks OK: faces are oriented correctly and light falls on it correctly (as it does on the original cog-wheel from which it was mirrored). Now if I export this as FBX and then import it into Unity, it looks all screwy: it looks like the normals are in the wrong direction. This is already very strange because, while in Blender the original cog-wheel and its mirrored counterpart both had normals facing one way, after importing into Unity the original cog-wheel still looks OK (like in Blender) but the mirrored one now has inverted normals. The first thing I tried was to say "OK, so I'll flip the normals in Blender for the mirrored cog-wheel and then it'll display correctly in Unity and that's that". So I went back to Blender, flipped the normals on that mesh (so now it looks bad in Blender), then re-exported as FBX with the same settings as before and re-imported into Unity. Sure enough, the cog-wheel now looks OK in Unity, in the sense that the faces show up properly, but if you look closely you'll notice that light and shadows are now wrong: in Unity, even though the light comes from the back of the robot, the cog-wheel in question acts as if the light were coming from somewhere else; its faces which should be in shadow are lit up, and those that should be lit up are dark. Here are some things I've tried which didn't do anything:
    - In Blender I tried mirroring the mesh in two ways: first by using the scale-to-minus-one trick, then by using the mirroring tool (select mesh, hit Ctrl-M, select mirror axis); both ways yield the exact same result in Unity.
    - I've tried playing around with the prefab import settings like "Normals: Import/Calculate" and "Tangents: Import/Calculate".
    - I've also tried not exporting as FBX manually from Blender, but just dropping the .blend file into the assets folder inside the Unity project.
    So, my question is: is there a way to actually mirror a mesh in Blender and then have it imported into Unity so that it displays properly (as it does in Blender)? If yes, how? Thank you, and please excuse the TL;DR style.
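    One generic way to confirm that the problem is triangle winding (which a negative-scale mirror flips) rather than the normal vectors themselves is to reverse the winding on the Unity side and let Unity recompute the normals. The component below is only an illustration; the class name and the attach-a-script approach are mine, not from the question, and the cleaner long-term fix is usually to apply the mirror and recalculate normals in Blender before exporting:
        using UnityEngine;

        // Hypothetical helper, not part of the asker's project: flips the triangle
        // winding (and normals) of a mesh whose source was mirrored with a negative
        // scale, so that Unity's back-face culling and lighting agree again.
        public class MirroredMeshFixer : MonoBehaviour
        {
            void Start()
            {
                MeshFilter filter = GetComponent<MeshFilter>();
                if (filter == null) return;

                Mesh mesh = filter.mesh; // instance copy, leaves the imported asset untouched

                // Reverse each triangle's vertex order to flip the winding.
                int[] tris = mesh.triangles;
                for (int i = 0; i < tris.Length; i += 3)
                {
                    int tmp = tris[i];
                    tris[i] = tris[i + 2];
                    tris[i + 2] = tmp;
                }
                mesh.triangles = tris;

                // Recompute normals from the corrected winding.
                mesh.RecalculateNormals();
            }
        }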

    Read the article

  • Smooth terrain rendering

    - by __dominic
    I'm trying to render a smooth terrain with Direct3D. I've got a 50x50 grid with all y values = 0, and a set of 3D points that indicate the location on the grid and the depth or height of the "valley" or "hill". I need to make the y values of the grid vertices higher or lower depending on how close they are to each 3D point, so that in the end I have a smooth terrain renderer. I'm not at all sure how to do this. I've tried changing the height of the vertices based on the distance to each point, just using the basic formula dist = a² + b² + c², where a, b, and c are the x, y, and z distances from a vertex to a 3D point. The result I get with this is not smooth at all, so I'm thinking there is probably a better way. Here is a screenshot of what I've got for the moment: https://dl.dropbox.com/u/2562049/terrain.jpg
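    A common way to get a smooth result is to drop the vertical term and weight each control point's height by a smooth falloff of the horizontal distance, for example a Gaussian. The sketch below is illustrative only: the grid size, the point format, and the radius parameter are assumptions, and it is plain C# rather than anything Direct3D-specific.
        using System;

        // Illustrative only: the grid size, point list, and falloff radius are made up.
        // The idea is to weight each control point's height by a smooth kernel of the
        // *horizontal* (x/z) distance, so neighbouring vertices change gradually.
        static class TerrainHeights
        {
            public static float[,] Build(int width, int depth,
                                         (float x, float z, float height)[] points,
                                         float radius) // radius > 0, in grid units
            {
                var heights = new float[width, depth];
                for (int x = 0; x < width; x++)
                {
                    for (int z = 0; z < depth; z++)
                    {
                        float y = 0f;
                        foreach (var p in points)
                        {
                            float dx = x - p.x;
                            float dz = z - p.z;
                            float distSq = dx * dx + dz * dz;                        // horizontal distance only
                            float w = (float)Math.Exp(-distSq / (radius * radius));  // Gaussian falloff
                            y += p.height * w;                                       // blend each contribution smoothly
                        }
                        heights[x, z] = y;
                    }
                }
                return heights;
            }
        }
    Feeding the resulting heights into the vertex buffer whenever the control points change keeps the surface smooth; smoothstep or cosine kernels work just as well as the Gaussian.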

    Read the article

  • What are good sites that provide free media resources for hobby game development?

    - by m_oLogin
    (Note: this question was transferred from Stack Overflow to gamedev.stackexchange, where it belongs.) I really suck at graphics / music / 3D modeling / animation, and they are a must-have when you have a hundred hobby game development projects you're working on. I'm looking for quality sources on the web that provide free resources. [EDIT] Some resources given by the answers, grouped by section:
    MODELS: TurboSquid, archive3d, 3Dvia, Google Sketchup, ShareCG
    SPRITES: OpenGameArt (also contains textures and 3D models), LostGarden (a couple of isometric tilesets), The Protagonist Domain (2 sprite sets), Reiner's Tilesets (also contains a couple of 3D models), Flying Yogi (3 sprite sets), Has Graphics (a couple of links to tile sets)
    ANIMATED SPRITES: The Spriters Resource (mostly sprites from released games)
    CLIPART / VECTOR BASED: Clker, OpenClipArt
    PHOTOS: EveryStockPhoto
    TEXTURES: CG Textures, OpenFrag
    MUSIC: Jamendo, OpSound
    SOUND EFFECTS: FreeSound, StoneWashed, MySoundFx
    OTHER PRECOMPILED LISTS: FreeGameDev.net

    Read the article

  • How to Force Graphics Options in PC Games with NVIDIA, AMD, or Intel Graphics

    - by Chris Hoffman
    PC games usually have built-in graphics options you can change. But you're not limited to the options built into games: the graphics control panels bundled with graphics drivers allow you to tweak options from outside PC games. For example, these tools allow you to force-enable antialiasing to make old games look better, even if they don't normally support it. You can also reduce graphics quality to get more performance on slow hardware.
    If You Don't See These Options
    If you don't have the NVIDIA Control Panel, AMD Catalyst Control Center, or Intel Graphics and Media Control Panel installed, you may need to install the appropriate graphics driver package for your hardware from the hardware manufacturer's website. The drivers provided via Windows Update don't include additional software like the NVIDIA Control Panel or AMD Catalyst Control Center, and they also tend to be more out of date. If you're playing PC games, you'll want to have the latest graphics drivers installed on your system.
    NVIDIA Control Panel
    The NVIDIA Control Panel allows you to change these options if your computer has NVIDIA graphics hardware. To launch it, right-click your desktop background and select NVIDIA Control Panel. You can also find this tool by performing a Start menu (or Start screen) search for NVIDIA Control Panel, or by right-clicking the NVIDIA icon in your system tray and selecting Open NVIDIA Control Panel. To quickly set a system-wide preference, you could use the Adjust image settings with preview option. For example, if you have old hardware that struggles to play the games you want to play, you may want to select "Use my preference emphasizing" and move the slider all the way to "Performance." This trades graphics quality for an increased frame rate. By default, the "Use the advanced 3D image settings" option is selected. You can select Manage 3D settings and change advanced settings for all programs on your computer or just for specific games. NVIDIA keeps a database of the optimal settings for various games, but you're free to tweak individual settings here. Just mouse over an option for an explanation of what it does. If you have a laptop with NVIDIA Optimus technology (that is, both NVIDIA and Intel graphics), this is also where you can choose which applications will use the NVIDIA hardware and which will use the Intel hardware.
    AMD Catalyst Control Center
    AMD's Catalyst Control Center allows you to change these options on AMD graphics hardware. To open it, right-click your desktop background and select Catalyst Control Center. You can also right-click the Catalyst icon in your system tray and select Catalyst Control Center, or perform a Start menu (or Start screen) search for Catalyst Control Center. Click the Gaming category at the left side of the Catalyst Control Center window and select 3D Application Settings to access the graphics settings you can change. The System Settings tab allows you to configure these options globally, for all games. Mouse over any option to see an explanation of what it does. You can also set per-application 3D settings and tweak your settings on a per-game basis. Click the Add option and browse to a game's .exe file to change its options.
    Intel Graphics and Media Control Panel
    Intel integrated graphics is nowhere near as powerful as dedicated graphics hardware from NVIDIA and AMD, but it's improving and comes included with most computers. Intel doesn't provide anywhere near as many options in its graphics control panel, but you can still tweak some common settings. To open the Intel graphics control panel, locate the Intel graphics icon in your system tray, right-click it, and select Graphics Properties. You can also right-click the desktop and select Graphics Properties. Select either Basic Mode or Advanced Mode. When the Intel Graphics and Media Control Panel appears, select the 3D option. You'll be able to set your Performance or Quality setting by moving the slider around, or click the Custom Settings check box and customize your Anisotropic Filtering and Vertical Sync preferences. Different Intel graphics hardware may have different options here. We also wouldn't be surprised to see more advanced options appear in the future if Intel is serious about competing in the PC graphics market, as they say they are. These options are primarily useful to PC gamers, so don't worry about them (or bother downloading updated graphics drivers) if you're not a PC gamer and don't use any intensive 3D applications on your computer.
    Image Credit: Dave Dugdale on Flickr

    Read the article

  • Blender to Collada to Assimp - Rigid (Non-skinned) Animation

    - by gareththegeek
    I am trying to get simple animations to work, exporting from Blender and importing into my application. My first attempt was as follows:
    1. Open Blender at factory settings.
    2. Select the default cube and insert a location keyframe.
    3. Select another frame and move the cube.
    4. Insert a second location keyframe.
    5. Export to Collada.
    When I open the Collada file using Assimp, it contains zero animations, even though in Blender the cube animates correctly. On my next attempt, I inserted a bone armature with a single bone, made it the parent of the cube, and animated the bone instead. Again, the animation worked correctly in Blender. Assimp now lists one animation, but both keyframes have the position [0, 0, 0]. Does anyone have any suggestions as to how I can get animated (non-skinned) meshes from Blender into Assimp? My ultimate goal here is to export animated meshes from Blender, process them offline into my own model format, and load them into my SharpDX-based graphics engine.

    Read the article

  • Alternative ways to make a battle system in a mobile indie game more fun and engaging

    - by Matt Beckman
    I'm developing an indie game for mobile platforms, and part of the game involves a PvP battle system (where the target player is passive). My vision is simple: the active player can select a weapon/item, then attack/use, and display the calculated outcome. I have a concept for battle modifiers that affect stats to make it more interesting, but I'm not convinced this by itself will add enough of a fun factor. I've received some inspiration from the game engine that powers Modern War/Kingdom Age/Crime City, but I want more control to make it more fun. In those games, you don't have the option to select weapons or use items, and the "battling" screen is simply 3D eye candy. Since this will be an indie game, I won't be spending $$$ on a team of professional 3D artists/animators, so my edge needs to be different. What are some alternatives to expensive eye candy that you or others have used to make a non-3D PvP game more fun and engaging? Did the alternative concepts survive the release?
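    Without 3D eye candy, most of the perceived depth has to come from the resolution step itself: visible, stackable modifiers and a readable breakdown of where the final number came from. The sketch below is purely illustrative; the names and numbers are invented, not taken from the game described above.
        using System;
        using System.Collections.Generic;

        // Illustrative sketch only: stacking small, visible modifiers (weapon, item,
        // terrain, streak bonuses) gives a passive-defender battle some texture
        // without any 3D presentation layer.
        class BattleResolver
        {
            public int ResolveAttack(int baseAttack, int defense, IEnumerable<double> modifiers)
            {
                double damage = baseAttack;
                foreach (double m in modifiers)
                {
                    damage *= m;                  // e.g. 1.15 for a weapon bonus, 0.9 for bad terrain
                }
                damage -= defense * 0.5;          // flat mitigation from the defender's stats
                return Math.Max(1, (int)Math.Round(damage)); // always deal at least 1 damage
            }
        }
    Surfacing each modifier in the battle report ("weapon +15%, terrain -10%") is what makes the system feel engaging rather than opaque.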

    Read the article

  • android: How to apply pinch zoom and pan to 2D GLSurfaceView

    - by mak_just4anything
    I want to apply a pinch-zoom and panning effect to a GLSurfaceView. It is an image editor, so it would not be a 3D object. I tried to implement it using the following links:
    - https://groups.google.com/forum/#!topic/android-developers/EVNRDNInVRU
    - Want to apply pinch and zoom to GLSurfaceView (3D object)
    - http://www.learnopengles.com/android-lesson-one-getting-started/
    These are all links for 3D object rendering. I cannot use an ImageView, as I need to work with OpenGL, so I had to implement it on a GLSurfaceView. Please suggest an approach, or any reference links for such an implementation. I need it for 2D only.
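    The gesture plumbing is platform-specific (on Android a ScaleGestureDetector plus a touch listener usually feeds the GL thread), but for 2D the zoom and pan themselves are easiest to apply to the orthographic projection rather than to the image geometry. The camera math below is a language-agnostic sketch written in C#; all names are invented for illustration.
        using System;

        // Hypothetical 2D camera for an image editor: zoom and pan are applied to the
        // orthographic view bounds, not to the image geometry itself.
        class Ortho2DCamera
        {
            public float CenterX, CenterY;   // pan offset in world units
            public float Zoom = 1f;          // 1 = fit, 2 = twice as close

            // Pinch: multiply zoom by the gesture's scale factor, clamped to sane limits.
            public void ApplyPinch(float scaleFactor)
            {
                Zoom = Math.Clamp(Zoom * scaleFactor, 0.25f, 8f);
            }

            // Pan: drag distance in screen pixels converted to world units at the current zoom.
            public void ApplyPan(float dxPixels, float dyPixels, float pixelsPerWorldUnit)
            {
                CenterX -= dxPixels / (pixelsPerWorldUnit * Zoom);
                CenterY += dyPixels / (pixelsPerWorldUnit * Zoom);
            }

            // Bounds to feed into an orthographic projection (glOrthof-style call).
            public (float left, float right, float bottom, float top) ViewBounds(float viewWidth, float viewHeight)
            {
                float halfW = viewWidth / (2f * Zoom);
                float halfH = viewHeight / (2f * Zoom);
                return (CenterX - halfW, CenterX + halfW, CenterY - halfH, CenterY + halfH);
            }
        }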

    Read the article

  • How do I simplify terrain with tunnels or overhangs?

    - by KKlouzal
    I'm attempting to store vertex data in a quadtree with C++, such that far-away vertices can be combined to simplify the object and speed up rendering. This works well with a reasonably flat mesh, but what about terrain with overhangs or tunnels? How should I represent such a mesh in a quadtree? After the initial generation, each mesh is roughly 130,000 polygons, and about 300 of these meshes are lined up to create the surface of a planetary body. A fully generated planet is upwards of 10,000,000 polygons before applying any culling to the individual meshes, so this second optimization is vital for the project. The rest of my confusion stems from my inexperience with vertex data:
    - How do I properly loop through the vertex data to group vertices into specific quads?
    - How do I conclude from the vertex data what a quad's maximum size should be?
    - How many quads should the quadtree include?
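    The question is about C++, but the grouping step itself is small enough to sketch. The version below (C#, with invented names and a fixed-depth leaf grid) simply buckets vertices by their horizontal position; it also shows why overhangs and tunnels are awkward, since vertices far apart vertically land in the same 2D cell, which is why such terrain is usually split into chunks with a tree per chunk, or indexed with an octree instead.
        using System;
        using System.Collections.Generic;
        using System.Numerics;

        // Sketch of the grouping step only: vertices are bucketed by their horizontal
        // (x, z) position into the leaves of a quadtree of a chosen depth.
        static class QuadtreeBucketing
        {
            public static Dictionary<(int, int), List<int>> GroupByLeaf(
                Vector3[] vertices, float minX, float minZ, float size, int depth)
            {
                int cellsPerSide = 1 << depth;               // quadtree leaves form a 2^depth grid
                float cellSize = size / cellsPerSide;
                var buckets = new Dictionary<(int, int), List<int>>();

                for (int i = 0; i < vertices.Length; i++)
                {
                    int cx = (int)((vertices[i].X - minX) / cellSize);
                    int cz = (int)((vertices[i].Z - minZ) / cellSize);
                    cx = Math.Clamp(cx, 0, cellsPerSide - 1); // guard vertices on the far edge
                    cz = Math.Clamp(cz, 0, cellsPerSide - 1);

                    if (!buckets.TryGetValue((cx, cz), out var list))
                        buckets[(cx, cz)] = list = new List<int>();
                    list.Add(i);                              // store the vertex index, not a copy
                }
                return buckets;
            }
        }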

    Read the article

  • nVidia GeForce Go 7600: can it ever run Unity?

    - by Khaled Musleh
    My laptop, a Toshiba Qosmio G30, has an nVidia GeForce Go 7600 card, and it is supposed to support 3D. I run Unity 2D now. I run 12.04 and the graphics driver is "VESA: G73 Board - toshg73m" according to Ubuntu. When I run /usr/lib/nux/unity_support_test -p, I get this list:
        Not software rendered:    no
        Not blacklisted:          yes
        GLX fbconfig:             yes
        GLX texture from pixmap:  yes
        GL npot or rect textures: yes
        GL vertex program:        yes
        GL fragment program:      yes
        GL vertex buffer object:  yes
        GL framebuffer object:    yes
        GL version is 1.4+:       yes
        Unity 3D supported:       no
    The card is not blacklisted, but a similar one with GT is! Do you think there is a chance the laptop can run Unity 3D? And maybe I could change the screen resolution to a higher one too! I tried all the nvidia drivers provided, but none works (except 96 in Ubuntu 12.04); I get a black screen or a terminal screen. Best wishes to all.

    Read the article

  • Microsoft presents its research on Natural User Interfaces, one of which should be adapted to Kinect

    Microsoft presents its research on Natural User Interfaces, one of which should be adapted to Kinect. Microsoft loves Natural User Interfaces (NUI). Its research labs have just posted a demonstration of a technology dubbed "3D Photo-Real Talking Heads", along with others concerning smart displays, all built using cameras to create new kinds of interactive experiences. The 3D Photo-Real Talking Head produces a 3D face from photos and can track head and lip movements. It should soon be adapted to the Kinect motion sensor for the X...

    Read the article

  • I Need Help With A Game (Well The API)! [closed]

    - by user1758938
    I'm not sure which API (or language) I should use for a little 3D FPS game I'm going to make, although I don't have any helpers. I'm OK with Java, C#, and C++, but I need a good, easy-to-use setup with the tools I need to make the game. I tried things like XNA, but I want to check other options first because I don't like how it produces an installer and so on; it's really annoying. I need an API that can do these things: 3D rendering, input, and sound. If it's not too much to ask, also some cool shaders, dynamic lighting, and a 3D sound system. I'm OK with having to use multiple APIs to do this, but please help me!

    Read the article

  • Dynamic navigation mesh changes

    - by Nairou
    I'm currently trying to convert from grids to navigation meshes for pathfinding, since grids are either too coarse for accurate navigation, or too fine to be useful for object tracking. While my map is fairly static, and the navigation mesh could be created in advance, this is somewhat of a tower defense game, where objects can be placed to block paths, so I need a way to recalculate portions of the navigation mesh to allow pathing around them. Is there any existing documentation on good ways to do this? I'm still very new to navigation meshes, so the prospect of modifying them to cut or fill holes sounds daunting.
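    One widely used approach (Recast/Detour's tile cache works roughly this way) is to build the navmesh as a grid of tiles and, when a tower is placed or removed, rebuild only the tiles its footprint overlaps. The sketch below shows just that bookkeeping in C#; the tile size, the types, and the rebuildTile callback are placeholders, not an actual re-triangulation.
        using System;
        using System.Collections.Generic;

        // Bookkeeping sketch only (types and names are mine): the world is covered by
        // fixed-size navmesh tiles; placing or removing a tower marks the tiles its
        // footprint overlaps as dirty, and only those tiles get re-triangulated.
        class TiledNavMesh
        {
            readonly float tileSize;
            readonly HashSet<(int, int)> dirtyTiles = new HashSet<(int, int)>();

            public TiledNavMesh(float tileSize) { this.tileSize = tileSize; }

            // Call this when an obstacle is added or removed.
            public void MarkObstacle(float minX, float minY, float maxX, float maxY)
            {
                int tx0 = (int)Math.Floor(minX / tileSize), tx1 = (int)Math.Floor(maxX / tileSize);
                int ty0 = (int)Math.Floor(minY / tileSize), ty1 = (int)Math.Floor(maxY / tileSize);
                for (int tx = tx0; tx <= tx1; tx++)
                    for (int ty = ty0; ty <= ty1; ty++)
                        dirtyTiles.Add((tx, ty));
            }

            // Call once per frame (or spread over several frames) to keep paths valid.
            public void RebuildDirtyTiles(Action<int, int> rebuildTile)
            {
                foreach (var (tx, ty) in dirtyTiles)
                    rebuildTile(tx, ty);          // re-run local rasterization/triangulation here
                dirtyTiles.Clear();
            }
        }
    Keeping the tiles small relative to the obstacles means each placement only touches a handful of tiles, so the rebuild cost stays bounded even late in a match.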

    Read the article

  • How can I make my game more popular without paying money?

    - by Marlon Drescher
    I am a game designer, software developer, composer, and graphic artist, and I made the 3D hack 'n' slash MMO Forgotten Elements on my own. It's playable in open beta and will be released at the end of the year. I used plain old Java, the jPCT 3D engine, the Tomcat web server, and Blender 3D / GIMP to handle all the tasks, and I developed the whole game from scratch. For me, the hardest task in this challenge is probably everything around marketing and advertising. Because it is an independent project and I am the only person working on it, there is no money I could invest in advertising. But anyhow: how could I make this game more popular? What would you suggest?

    Read the article

  • Models from 3ds max lose their transformations when input into XNA

    - by jacobian
    I am making models in 3ds Max. However, when I export them to .fbx format and then load them into XNA, they lose their scaling. It is most likely something to do with not using the transforms from the model correctly. Is the following code correct (using XNA 3.0)?
        Matrix[] transforms = new Matrix[playerModel.Meshes.Count];
        playerModel.CopyAbsoluteBoneTransformsTo(transforms);

        // Draw the model.
        int count = 0;
        foreach (ModelMesh mesh in playerModel.Meshes)
        {
            foreach (BasicEffect effect in mesh.Effects)
            {
                effect.World = transforms[count]
                    * Matrix.CreateScale(scale)
                    * Matrix.CreateRotationX((float)MathHelper.ToRadians(rx))
                    * Matrix.CreateRotationY((float)MathHelper.ToRadians(ry))
                    * Matrix.CreateRotationZ((float)MathHelper.ToRadians(rz))
                    * Matrix.CreateTranslation(position);
                effect.View = view;
                effect.Projection = projection;
                effect.EnableDefaultLighting();
            }
            count++;
            mesh.Draw();
        }
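    For comparison, the pattern usually shown for XNA sizes the transform array by the model's bone count and indexes it by each mesh's parent bone; the FBX export from 3ds Max tends to bake scene scaling into those bone transforms, so skipping them is a common reason models come in at the wrong size. A sketch of that stock pattern, reusing the question's variable names (scale, rx, ry, rz, position, view, and projection are assumed to exist):
        Matrix[] transforms = new Matrix[playerModel.Bones.Count];
        playerModel.CopyAbsoluteBoneTransformsTo(transforms);

        Matrix world = Matrix.CreateScale(scale)
                     * Matrix.CreateRotationX((float)MathHelper.ToRadians(rx))
                     * Matrix.CreateRotationY((float)MathHelper.ToRadians(ry))
                     * Matrix.CreateRotationZ((float)MathHelper.ToRadians(rz))
                     * Matrix.CreateTranslation(position);

        foreach (ModelMesh mesh in playerModel.Meshes)
        {
            foreach (BasicEffect effect in mesh.Effects)
            {
                effect.World = transforms[mesh.ParentBone.Index] * world; // bone transform first
                effect.View = view;
                effect.Projection = projection;
                effect.EnableDefaultLighting();
            }
            mesh.Draw();
        }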

    Read the article

  • Assigning a colour to imported .obj files that use the default material

    - by Salino
    I am having a problem with assigning a colour to the different meshes that I have on one object. The technique I used is the first approach from the question "Is it possible to export a simulation (animation) from Blender to Unity?". What I would like to do is the following: I have about 107 meshes that are different frames from the shape-key animation of my Blender model. I would like the first mesh to be bright green, with the colour turning white/greyish by about the 40th mesh. The best would be if I could assign every mesh a colour by hand; however, they all use the default material, and if I assign a colour to the object, the whole "animation" ends up in that colour.
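    If each frame of the shape-key "animation" ends up as its own child object with its own Renderer in Unity, a small script can assign the colour ramp instead of editing 107 materials by hand. The sketch below assumes that layout (it is not from the question) and relies on the material's shader exposing a colour property, which the default diffuse shaders do; assigning to renderer.material creates a per-renderer copy, so the shared default material is not modified.
        using UnityEngine;

        // Illustrative sketch: tints each child renderer along a green-to-white ramp,
        // indexed by its position in the frame sequence.
        public class FrameColourRamp : MonoBehaviour
        {
            public Color startColour = Color.green;
            public Color endColour = Color.white;
            public int rampFrames = 40;   // frame index at which the ramp reaches white

            void Start()
            {
                Renderer[] frames = GetComponentsInChildren<Renderer>();
                for (int i = 0; i < frames.Length; i++)
                {
                    float t = Mathf.Clamp01(i / (float)rampFrames);
                    frames[i].material.color = Color.Lerp(startColour, endColour, t);
                }
            }
        }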

    Read the article

  • How many BasicEffects do you have in a Game? What is the best way to render multiple objects/shapes at once?

    - by Deukalion
    I'm trying to understand 3D rendering, and it seems that every time you render a new object (a 3D cube or something) you need a new BasicEffect for each box you render, unless you want the exact same texture? So if I have over a hundred boxes, each with a different texture, do I need at least as many BasicEffects? Won't that be "too much" for the CPU/GPU in the end, or result in lag? Is there any good way to render multiple objects (cubes or other shapes) at the same time? I've tried changing BasicEffect.Texture before each cube is drawn, but that resulted in changing the first cube's texture too. Any suggestions would be really appreciated; I'm really new to 3D in XNA, so I'm trying to wrap my head around the best methods to, for example, render a map of objects (or shapes).
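    A single BasicEffect can be shared across all the boxes; the catch is that a parameter change only affects geometry drawn after the pass is applied again, which is why changing Texture once and then drawing everything appears to repaint the first cube as well. Below is a hedged XNA 4.0-style sketch; the Cube type with its VertexBuffer, Texture, and World fields is invented for illustration, as are view and projection.
        // Sketch only: a single BasicEffect is reused; the parameters just have to be
        // set and the pass applied *before each draw call*.
        BasicEffect effect = new BasicEffect(GraphicsDevice)
        {
            TextureEnabled = true,
            View = view,
            Projection = projection
        };
        effect.EnableDefaultLighting();

        foreach (Cube cube in cubes)
        {
            effect.World = cube.World;        // per-cube transform
            effect.Texture = cube.Texture;    // per-cube texture

            GraphicsDevice.SetVertexBuffer(cube.VertexBuffer);
            foreach (EffectPass pass in effect.CurrentTechnique.Passes)
            {
                pass.Apply();                 // commits the current parameter values
                GraphicsDevice.DrawPrimitives(PrimitiveType.TriangleList, 0, 12); // 12 triangles per cube
            }
        }
    Grouping the boxes by texture before the loop cuts the number of state changes further, and very large numbers of identical boxes can be merged into one vertex buffer or drawn with hardware instancing.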

    Read the article

  • Common light map practices

    - by M. Utku ALTINKAYA
    My scene consists of individual meshes. At the moment each mesh has its own associated light map texture, and I was able to implement the light mapping using these many small textures.
    1) Of course, I want to create an atlas, but how do you split atlases into pages? I mean, do you group the light maps of objects that are close to each other, and load light maps on the fly if the scene is expected to be big?
    2) The 3D authoring software provides automatic UV coordinates for each mesh in the scene, but there are empty areas in the texel space, so if I scale the texture polygons, the texel density of each face will not match the other meshes. If I create an atlas like that, there will be varying light map resolution. How do you solve this: just leave it as it is, or ignore resolution?
    Actually, these questions also apply to other non-tiled maps.
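    For the varying-resolution part of question 2, a common fix is to budget each chart's atlas area from its world-space surface area and a target texel density, rather than trusting the authoring tool's automatic UV scaling. The sketch below only computes that budget (the names and the square-chart simplification are mine); the charts are then re-packed into pages, and grouping spatially close objects into the same page is what keeps loading light maps on the fly practical.
        using System;
        using System.Numerics;

        // Sketch of the texel-density idea only: give every chart a size in the atlas
        // proportional to its world-space surface area, so one light-map texel covers
        // roughly the same amount of geometry everywhere.
        static class LightmapBudget
        {
            // Triangle area from three world-space corners.
            static float TriangleArea(Vector3 a, Vector3 b, Vector3 c)
                => Vector3.Cross(b - a, c - a).Length() * 0.5f;

            // Returns the side length (in texels) this mesh's chart should get in the atlas.
            public static int ChartSizeInTexels(Vector3[] positions, int[] indices, float texelsPerMeter)
            {
                float worldArea = 0f;
                for (int i = 0; i < indices.Length; i += 3)
                    worldArea += TriangleArea(positions[indices[i]],
                                              positions[indices[i + 1]],
                                              positions[indices[i + 2]]);

                // Square chart whose texel count matches the requested density.
                float texels = worldArea * texelsPerMeter * texelsPerMeter;
                int side = (int)Math.Ceiling(Math.Sqrt(texels));

                // Round up to a power of two so the charts pack cleanly into pages.
                int pow2 = 1;
                while (pow2 < side) pow2 <<= 1;
                return pow2;
            }
        }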

    Read the article

  • Options for 2D Web/iPhone/Android, with Test Framework

    - by ashes999
    I would like to make a 2D game with one codebase that runs on iPhone, Android, and the web (any flavour of web: HTML5, Flash, Silverlight, etc.). What are my options? I should be able to write my code once and run it anywhere. I also absolutely need the ability to write unit tests (or to write a unit-testing framework); I cannot make sizable games without testing. Unity is good, but Unity is 3D; even with hacks, the graphical assets will still be 3D. I'm after 2D, not 3D. (If you need a Mac or separate licensing for the Mac part, that's okay.)

    Read the article

  • How to use NVIDIA GeForce M310 on Ubuntu 12.10 running as guest in Virtualbox?

    - by huub
    For the last couple of weeks I have played around with Ubuntu 12.10, running as a guest on VirtualBox hosted on Windows 7. There have been some challenges with the Unity 3D stuff, particularly due to X11 release 1.13 not being supported until very recently. As of today we are able to download VirtualBox version 4.2.2, which, through the guest additions, also supports X11 release 1.13. So far, great work everybody. Since Unity now only runs in 3D mode, it would be nice to access the graphics card directly from VirtualBox. lshw -c display shows: VGA compatible controller; product: VirtualBox Graphics Adapter. QUESTION: how do I get 3D and other graphics directly supported by the hardware, i.e. the Nvidia GeForce M310?

    Read the article

  • How to Use Text in Unity3d

    - by ZiG-ZaG
    How can I create text in Unity3D? I use "3D Text", but it's always on top of everything! Can you suggest anything? I'm creating a 2D game, so it doesn't necessarily have to be 3D text. Edit: because I'm building a 2D game, my scene is full of planes in front of the camera, and I want my text to be over one of the planes; when the plane is moving, my text should appear behind it. But when I use "3D Text", it's always in front of everything. Sorry for my bad English.

    Read the article

  • C# style properties in python

    - by 3D-Grabber
    I am looking for a way to define properties in Python similar to C#, with nested get/set definitions. This is how far I got:
        #### definition ####
        def Prop(fcn):
            f = fcn()
            return property(f['get'], f['set'])

        #### test ####
        class Example(object):
            @Prop
            def myattr():
                def get(self):
                    return self._value
                def set(self, value):
                    self._value = value
                return locals()  # <- how to get rid of this?

        e = Example()
        e.myattr = 'somevalue'
        print e.myattr
    The problem with this is that it still needs the definition to 'return locals()'. Is there a way to get rid of it? Maybe with a nested decorator?

    Read the article
