Search Results

Search found 6535 results on 262 pages for '3d secure'.


  • Are single-core CPUs in smartphones and tablets powerful enough? NVIDIA wants to put more cores in these devices

    Are the single-core CPUs in smartphones and tablets powerful enough? No, answers Nvidia, which wants to put more cores into these devices. For Nvidia, "dual-core processors will be the standard in 2011, and quad-cores will not be far behind". The company believes that mobile devices, currently equipped with single-core chips, are not powerful enough. Yet ever more is being asked of them, especially nowadays with the arrival of 3D games, which are rather demanding. In response, Nvidia proposes adding extra cores to the chips in these machines. The benefits? Longer battery life, and the...

    Read the article

  • What animation technique is used in 'Don't Starve'?

    - by Bugster
    While playing a few games in my personal time off from development, I stumbled across a 2D/3D survival game, Don't Starve (apparently made with SDL and GLUT), and what really amazed me was its animation. The animations are extremely smooth and fluent. There is no distortion while animating; what usually happens in hand-made animations is that pixels get lost, animations are jaggy, and they simply aren't as smooth. That got me thinking about how they managed to achieve such animation quality. Were they really hand-made (if so, it must have taken a very talented artist), is it bone animation, or are they using another technique? (One common technique is sketched below.)

    Read the article
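
    One technique that commonly produces this kind of smoothness is skeletal (bone) animation: each body part is a separate sprite attached to a bone, and bone transforms are interpolated between keyframes every render tick instead of swapping hand-drawn frames. Below is a minimal C++ sketch of that idea; all names are hypothetical and this makes no claim about what Klei actually did.

        #include <cmath>

        // One keyframe for a single bone: rotation plus offset from its parent.
        struct BoneKey {
            float time;               // seconds
            float angle;              // radians
            float offsetX, offsetY;
        };

        static float lerp(float a, float b, float t) { return a + (b - a) * t; }

        // Sample the bone pose at an arbitrary time between two keyframes.
        // Because the pose is recomputed every rendered frame, the motion stays
        // smooth at any frame rate, with no jagged in-between drawings.
        BoneKey samplePose(const BoneKey& k0, const BoneKey& k1, float time) {
            float span = k1.time - k0.time;
            float t = span > 0.0f ? (time - k0.time) / span : 0.0f;
            if (t < 0.0f) t = 0.0f;
            if (t > 1.0f) t = 1.0f;
            BoneKey out;
            out.time    = time;
            out.angle   = lerp(k0.angle,   k1.angle,   t);
            out.offsetX = lerp(k0.offsetX, k1.offsetX, t);
            out.offsetY = lerp(k0.offsetY, k1.offsetY, t);
            return out;
        }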

  • Move Camera Freely Around Object While Looking at It

    - by Alex_Hyzer_Kenoyer
    I've got a 3D model loaded (a planet) and a camera that I want to let the user move freely around it. I have no problem getting the camera to orbit the planet around either the x or y axis. My problem is that when I try to move the camera on a different axis, I have no idea how to go about it. I am using OpenGL on Android with the libGDX library. I want the camera to orbit the planet in the direction the user swipes their finger on the screen. (One generic approach is sketched below.)

    Read the article
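
    A generic sketch of one approach, written as plain C++ vector math rather than libGDX calls (libGDX's Camera class also provides a rotateAround helper that can be used similarly, if memory serves): turn the swipe into a rotation axis that passes through the planet's centre and is perpendicular to the swipe direction, then rotate the camera position about that axis and re-aim the camera at the planet.

        #include <cmath>

        struct Vec3 { float x, y, z; };

        static Vec3  sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
        static Vec3  add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
        static Vec3  scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
        static float dot(Vec3 a, Vec3 b)    { return a.x * b.x + a.y * b.y + a.z * b.z; }
        static Vec3  cross(Vec3 a, Vec3 b) {
            return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
        }
        static Vec3  normalize(Vec3 a)      { return scale(a, 1.0f / std::sqrt(dot(a, a))); }

        // Rotate the camera position around an arbitrary axis through the target
        // (Rodrigues' rotation formula), keeping the orbit distance constant.
        void orbit(Vec3& camPos, const Vec3& target, Vec3 axis, float angle) {
            axis = normalize(axis);
            Vec3 v = sub(camPos, target);                       // target -> camera
            Vec3 r = add(add(scale(v, std::cos(angle)),
                             scale(cross(axis, v), std::sin(angle))),
                         scale(axis, dot(axis, v) * (1.0f - std::cos(angle))));
            camPos = add(target, r);
            // Afterwards, re-aim the camera at `target` (look-at) so it keeps facing the planet.
        }

        // Mapping a swipe: a horizontal swipe orbits about the camera's up vector,
        // a vertical swipe about its right vector; combine both for free swiping
        // (signs depend on your screen and world conventions):
        //   Vec3 axis = add(scale(cameraUp, -swipeDeltaX), scale(cameraRight, swipeDeltaY));
        //   orbit(camPos, planetCenter, axis, swipeLength * sensitivity);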

  • unit testing on ARM

    - by NomadAlien
    We are developing application-level code that runs on an ARM processor. The BSP (low-level code) is being delivered by a 3rd party, so our code sits just on top of this abstraction layer (the code is written in C++). To do unit testing, I assume we will have to mock/stub out the BSP library (essentially abstracting out the HW), but what I'm not sure about is: if I write and run the unit tests on my PC, do I compile them with, for example, GCC? Normally we use the RealView compiler to compile our code for the ARM. Can I assume that if I compile and run the code with an x86 compiler and the unit tests pass, they will also pass when compiled with the RealView compiler? I'm not sure how much difference the compiler makes, and whether you can trust that if the x86-compiled code passes the unit tests, the RealView-compiled code is also OK. (A sketch of the mocking approach follows below.)

    Read the article
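
    One common setup is to hide the BSP behind an abstract interface, link the application code against a stub implementation in a host build compiled with GCC (or any x86 compiler), and keep the RealView build for the target. Below is a minimal sketch with all names hypothetical; host-side tests exercise your logic, but they do not replace on-target testing, since compilers and targets differ in things like type sizes, alignment and undefined-behaviour handling.

        #include <cassert>
        #include <cstdint>

        // Abstraction over the 3rd-party BSP: application code only sees this interface.
        class IBsp {
        public:
            virtual ~IBsp() {}
            virtual std::uint32_t readAdc(int channel) = 0;
        };

        // Application-level logic under test: converts a raw ADC reading to millivolts.
        class VoltageSensor {
        public:
            explicit VoltageSensor(IBsp& bsp) : bsp_(bsp) {}
            std::uint32_t readMillivolts(int channel) {
                // Hypothetical 12-bit ADC with a 3300 mV reference.
                return bsp_.readAdc(channel) * 3300u / 4095u;
            }
        private:
            IBsp& bsp_;
        };

        // Host-side mock used only in the unit-test build.
        class MockBsp : public IBsp {
        public:
            std::uint32_t nextValue = 0;
            std::uint32_t readAdc(int) override { return nextValue; }
        };

        int main() {
            MockBsp bsp;
            VoltageSensor sensor(bsp);

            bsp.nextValue = 4095;                      // full scale
            assert(sensor.readMillivolts(0) == 3300);

            bsp.nextValue = 0;
            assert(sensor.readMillivolts(0) == 0);
            return 0;
        }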

  • How do you start modding a game without an editor?

    - by Protector one
    I often come across very impressive mods for PC games that don't have an official editor, other development tools, or their source code publicly available. (Take this amazing Multiplayer mod for Just Cause 2, for example.) How do you go about creating mods for such games? I'm not talking about replacing the odd texture or 3D model (that sort of thing seems fairly easy given tools to pry them out of game files and put them back in), but more along the lines of adding game behavior. (Tweaking settings files also doesn't count.) Note that I'm not asking "how to create a mod"; I just want to know where to start or where to go to learn.

    Read the article

  • Is it a bad practice to include all the enums in one file and use it in multiple classes?

    - by Bugster
    I'm an aspiring game developer; I work on occasional indie games, and for a while I've been doing something that seemed like a bad practice at first, but I really want to get an answer from some experienced programmers here. Let's say I have a file called enumList.h where I declare all the enums I want to use in my game:

        // enumList.h
        enum materials_t { WOOD, STONE, ETC };
        enum entity_t { PLAYER, MONSTER };
        enum map_t { 2D, 3D };
        // and so on.

        // Tile.h
        #include "enumList.h"
        #include <vector>

        class tile {
            // stuff
        };

    The main idea is that I declare all enums in the game in one file and then include that file when I need to use a certain enum, rather than declaring it in the file where I need it. I do this because it keeps things clean: I can access every enum in one place rather than having files open solely to access one enum. Is this a bad practice, and can it affect performance in any way? (An alternative layout is sketched below.)

    Read the article
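
    For what it's worth, C++ enumerators cannot begin with a digit, so map_t above would need names like MAP_2D/MAP_3D to compile. Runtime performance is unaffected either way; the practical cost of a single shared header is build coupling, since every file that includes enumList.h recompiles whenever any enum changes. A common alternative, sketched below with hypothetical names, is one small header per domain, using C++11 enum class so the enumerators don't leak into the enclosing scope; each .cpp then includes only what it uses.

        // materials.h
        #pragma once
        enum class Material { Wood, Stone, Metal };

        // entities.h
        #pragma once
        enum class EntityType { Player, Monster };

        // tile.h -- pulls in only the enums it actually needs.
        #pragma once
        #include "materials.h"
        #include <vector>

        class Tile {
        public:
            explicit Tile(Material m) : material_(m) {}
            Material material() const { return material_; }
        private:
            Material material_;
        };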

  • Premultiplying matrices with Perspective destroys them

    - by Shadows In Rain
    If I apply world_to_camera, perspective and camera_to_screen to my mesh one after another, everything is okay. But if I premultiply the matrices (i.e. transform = world_to_camera * perspective * camera_to_screen) before applying, then it seems like only the perspective has an effect. If it is important: my 3D framework was written from scratch (a test project for a job interview), but it works flawlessly, or at least I think so. So, the question: is this expected behaviour, or is my implementation wrong? (One likely cause is sketched below.)

    Read the article
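
    A likely culprit, assuming nothing about the asker's framework, is the order in which the combined matrix is built relative to the multiplication convention. With column vectors, applying world_to_camera first and the projection last means the combined matrix must be built in the reverse of the application order (projection on the left); with row vectors it is the other way around. The small C++ check below shows that combining in the matching order reproduces the step-by-step result, while the other order does not.

        #include <cassert>

        struct Mat4 { float m[4][4]; };   // row-major storage
        struct Vec4 { float v[4]; };

        // Column-vector convention: r = M * x, r[i] = sum_j M[i][j] * x[j].
        Vec4 mul(const Mat4& a, const Vec4& x) {
            Vec4 r{};
            for (int i = 0; i < 4; ++i)
                for (int j = 0; j < 4; ++j)
                    r.v[i] += a.m[i][j] * x.v[j];
            return r;
        }

        Mat4 mul(const Mat4& a, const Mat4& b) {          // r = a * b
            Mat4 r{};
            for (int i = 0; i < 4; ++i)
                for (int j = 0; j < 4; ++j)
                    for (int k = 0; k < 4; ++k)
                        r.m[i][j] += a.m[i][k] * b.m[k][j];
            return r;
        }

        Mat4 identity() {
            Mat4 r{};
            for (int i = 0; i < 4; ++i) r.m[i][i] = 1.0f;
            return r;
        }

        int main() {
            Mat4 scaleM = identity();                     // uniform scale by 2
            scaleM.m[0][0] = scaleM.m[1][1] = scaleM.m[2][2] = 2.0f;

            Mat4 translateM = identity();                 // translate by (1, 2, 3)
            translateM.m[0][3] = 1.0f; translateM.m[1][3] = 2.0f; translateM.m[2][3] = 3.0f;

            Vec4 p{{1.0f, 1.0f, 1.0f, 1.0f}};

            // Apply step by step: scale first, then translate -> (3, 4, 5, 1).
            Vec4 stepwise = mul(translateM, mul(scaleM, p));

            // Premultiplied in the REVERSE of application order: matches the stepwise result.
            Vec4 combinedGood = mul(mul(translateM, scaleM), p);

            // Premultiplied in application order: gives (4, 6, 8, 1) under this convention.
            Vec4 combinedBad = mul(mul(scaleM, translateM), p);

            assert(combinedGood.v[0] == stepwise.v[0]);
            assert(combinedBad.v[0] != stepwise.v[0]);
            return 0;
        }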

  • How would I go about implementing a globe-like "ballish" map?

    - by rFactor
    I am new to 3D development and I have this idea of making the game world like our globe: a ball. So there would be no corners on the map, and the game is a top-down RTS. I would like the camera to go on and on and never stop, even though you keep moving it in the same direction all the time. I am not sure if this is even possible, but I would really like to build a globe-like map without borders. Can this be done, and how exactly? I am using XNA, C#, and DirectX. Any tutorials, links, or help are greatly appreciated! (One way to model this is sketched below.)

    Read the article
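
    One way to get the "no edges" feel, sketched here without assuming anything about the asker's engine: keep the camera's logical position as longitude/latitude on a sphere and simply wrap the angles, so scrolling in one direction forever just keeps circling the globe; the world-space camera position is derived from those angles each frame.

        #include <cmath>

        const float PI = 3.14159265358979f;

        struct GlobeCamera {
            float longitude = 0.0f;   // radians, wraps around [0, 2*pi)
            float latitude  = 0.0f;   // radians, in [-pi/2, pi/2], flips over the poles
            float radius    = 10.0f;  // distance from the globe centre
        };

        // Scroll the camera; because the angles wrap, you can pan forever without
        // ever hitting a border.
        void scroll(GlobeCamera& cam, float dLon, float dLat) {
            cam.longitude = std::fmod(cam.longitude + dLon, 2.0f * PI);
            if (cam.longitude < 0.0f) cam.longitude += 2.0f * PI;

            cam.latitude += dLat;
            // Crossing a pole continues onto the opposite meridian instead of clamping.
            if (cam.latitude >  0.5f * PI) { cam.latitude =  PI - cam.latitude; cam.longitude += PI; }
            if (cam.latitude < -0.5f * PI) { cam.latitude = -PI - cam.latitude; cam.longitude += PI; }
        }

        // Convert the spherical coordinates to a world-space position above the globe.
        void worldPosition(const GlobeCamera& cam, float& x, float& y, float& z) {
            x = cam.radius * std::cos(cam.latitude) * std::cos(cam.longitude);
            y = cam.radius * std::sin(cam.latitude);
            z = cam.radius * std::cos(cam.latitude) * std::sin(cam.longitude);
            // Aim the view at the globe centre (0, 0, 0) and pick an "up" vector
            // tangent to the sphere to complete the look-at matrix.
        }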

  • Most supported / easiest gamedev language to get started with?

    - by user1009013
    Which language has the most libraries/frameworks (like LWJGL for Java or XNA for C#)? Which language is the easiest for starting to make a game (very easy to get a 3D environment rendered)? Which language has the friendliest learning curve? Say I want to make a game, I don't know any programming languages, and I want to develop for any platform (so don't answer "the one you know best" or "the platform you are working on"): what is the best language to start with? I get this question a lot, "I have this and that idea for a game and want to make it, what language should I use?" (mostly asked by beginning programmers), but I don't know how to answer it. The answer "use the one you are most familiar with" doesn't always work, because sometimes they don't even know a language yet. I am not asking for someone's personal opinion, but for an objective list of which languages are the easiest, the most supported, and have the most or best libraries and frameworks for getting started with game development.

    Read the article

  • Do I need the 'w' component in my Vector class?

    - by bobobobo
    Assume you're writing matrix code that handles rotation, translation etc. for 3D space. The transformation matrices have to be 4x4 to fit the translation component in. However, you don't actually need to store a w component in the vector, do you? Even for perspective division, you can simply compute and store w outside of the vector, and perspective-divide before returning from the method. For example:

        // post multiply vec2=matrix*vector
        Vector operator*( const Matrix & a, const Vector& v )
        {
            Vector r ;
            // do matrix mult
            r.x = a._11*v.x + a._12*v.y ...
            real w = a._41*v.x + a._42*v.y ...
            // perspective divide
            r /= w ;
            return r ;
        }

    Is there a point in storing w in the Vector class? (One argument for keeping it is sketched below.)

    Read the article
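
    One reason to keep w around rather than always dividing inside operator* (a sketch, not the only valid design): clipping is normally done in homogeneous clip space before the perspective divide, and dividing immediately also misbehaves for points at or behind the eye where w is zero or negative. Keeping the divide as a separate, explicit step leaves both options open.

        struct Mat4 { float m[4][4]; };   // m[row][column]

        struct Vec4 { float x, y, z, w; };

        // Full homogeneous transform: w is kept so the caller can clip in clip
        // space (e.g. -w <= x <= w) before deciding whether to divide at all.
        Vec4 transform(const Mat4& a, const Vec4& v) {
            Vec4 r;
            r.x = a.m[0][0]*v.x + a.m[0][1]*v.y + a.m[0][2]*v.z + a.m[0][3]*v.w;
            r.y = a.m[1][0]*v.x + a.m[1][1]*v.y + a.m[1][2]*v.z + a.m[1][3]*v.w;
            r.z = a.m[2][0]*v.x + a.m[2][1]*v.y + a.m[2][2]*v.z + a.m[2][3]*v.w;
            r.w = a.m[3][0]*v.x + a.m[3][1]*v.y + a.m[3][2]*v.z + a.m[3][3]*v.w;
            return r;
        }

        // Explicit, separate perspective divide (only meaningful for w > 0).
        Vec4 perspectiveDivide(Vec4 v) {
            v.x /= v.w; v.y /= v.w; v.z /= v.w; v.w = 1.0f;
            return v;
        }

        // If you only ever transform points with w == 1 and never clip or
        // interpolate in clip space, computing w locally as in the question is fine.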

  • Wall avoidance steering

    - by Vodemki
    I'm making a small steering simulator using the Reynolds boid algorithm. Now I want to add a wall-avoidance feature. My walls are in 3D and defined using two points, like this:

        ---------. P2
        |       |
        P1 .---------

    My agents have a velocity, a position, etc. Could you tell me how to implement avoidance for my agents?

        Vector2D ReynoldsSteeringModel::repulsionFromWalls()
        {
            Vector2D force;
            vector<Wall *> wallsList = walls();
            Point2D pos = self()->position();
            Vector2D velocity = self()->velocity();
            for (unsigned i=0; i<wallsList.size(); i++)
            {
                //TODO
            }
            return force;
        }

    I then take all the forces returned by my boid functions and apply them to my agent. I just need to know how to do that with my walls. Thanks for your help. (One way to fill in the loop is sketched below.)

    Read the article
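
    A common way to fill in that loop, sketched under the assumption that Wall exposes its two endpoints and that Vector2D/Point2D support the usual operations (the accessors p1()/p2(), length(), lengthSquared() and the free dot() function are hypothetical here): find the closest point on each wall segment to the agent and, if it lies within some look-ahead distance, add a force that pushes away from the wall and grows as the agent gets closer. A more Reynolds-like variant tests a point projected ahead along the agent's velocity instead of the current position.

        // Closest point on segment [a, b] to point p: project and clamp to the segment.
        Point2D closestPointOnSegment(const Point2D& a, const Point2D& b, const Point2D& p)
        {
            Vector2D ab = b - a;
            float t = dot(p - a, ab) / ab.lengthSquared();
            if (t < 0.0f) t = 0.0f;
            if (t > 1.0f) t = 1.0f;
            return a + ab * t;
        }

        Vector2D ReynoldsSteeringModel::repulsionFromWalls()
        {
            Vector2D force;
            vector<Wall *> wallsList = walls();
            Point2D pos = self()->position();
            const float safeDistance = 50.0f;   // how far out walls start to push; tune to taste

            for (unsigned i = 0; i < wallsList.size(); i++)
            {
                Point2D  closest = closestPointOnSegment(wallsList[i]->p1(), wallsList[i]->p2(), pos);
                Vector2D away    = pos - closest;          // from the wall towards the agent
                float    dist    = away.length();

                if (dist > 0.0f && dist < safeDistance)
                {
                    // The closer the agent is, the stronger the push:
                    // normalise `away` and weight it by the remaining safe distance.
                    force = force + away * ((safeDistance - dist) / dist);
                }
            }
            return force;
        }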

  • System hangs at glReadPixels call with GL_TEXTURE_2D_ARRAY for texturing

    - by Roshan
    I am calling glReadPixels after a glDrawArrays call. I am rendering geometry with a 3D texture on it, using GL_TEXTURE_2D_ARRAY as the target. My system hangs at the glReadPixels call. When I use GL_TEXTURE_3D as the target, the issue does not occur and the framebuffer contents are read correctly.

        glReadPixels(0, 0, GetViewportWidth(), GetViewportHeight(), GL_RGBA, GL_UNSIGNED_BYTE, (GLvoid *)rendered_pixels);

    I am using SNORM textures with GL_BYTE data in the glTexImage3D call, and I am not calling glPixelStorei; could it be because of this? What should the parameters of the pixel-store call be? (A sketch of the relevant calls follows below.)

    Read the article
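
    On the pixel-store question itself (which may or may not be related to the hang): glTexImage3D uses the GL_UNPACK_* state and glReadPixels uses the GL_PACK_* state, and both row alignments default to 4 bytes. With GL_BYTE data whose rows are not a multiple of 4 bytes, a mismatched alignment makes GL read or write past the buffer the application allocated, which can crash or appear to hang. Setting both alignments to 1 is the safe, tightly packed choice. In the sketch below the internal format is only an example, and width, height, layers and texture_data stand in for the asker's own variables.

        // Tightly packed upload: rows of the GL_BYTE texture data carry no padding.
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
        glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, GL_RGB8_SNORM,
                     width, height, layers, 0,
                     GL_RGB, GL_BYTE, texture_data);

        // Tightly packed readback. (For GL_RGBA / GL_UNSIGNED_BYTE the rows are already
        // multiples of 4 bytes, so this is belt and braces; it matters for 1- or
        // 3-component readbacks.)
        glPixelStorei(GL_PACK_ALIGNMENT, 1);
        glReadPixels(0, 0, GetViewportWidth(), GetViewportHeight(),
                     GL_RGBA, GL_UNSIGNED_BYTE, (GLvoid *)rendered_pixels);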

  • How much time will it take to learn 3ds Max?

    - by Mirror51
    I am not a 3D developer, but I want to learn 3ds Max just for simple house building with 2-3 rooms. Actually, I don't want to model from scratch. What I really want to do is take existing models of homes, rooms and hotels from the internet and add my name or my photo to them, just for fun. So I want to know how much time you think it will take me to do that sort of thing. It's not my career, just a hobby. If it's going to take a long time, I don't want to waste it, but if I can get going in a week or so, that would be fine. I want to hear from experienced developers. Thanks

    Read the article

  • Acer Extensa 5620 - Graphics unknown!

    - by Nycxzon
    I installed Ubuntu 12.04 Beta 2 and found out that my graphics driver is "Unknown" and the experience is "Standard". Please help me figure out how to install my graphics driver. As per my laptop specs: Mobile Intel GM965 Express Chipset with integrated 3D graphics, featuring Intel Graphics Media Accelerator (GMA) X3100 with up to 358/3845 MB of Intel Dynamic Video Memory Technology 4.0 (8 MB of dedicated system memory, up to 350/3765 MB of shared system memory), supporting Microsoft DirectX 9. Hope you can help me! Thanks in advance! Noob @ Ubuntu :) Update: I found a solution! I found a site where I asked how to check my system's graphics. Running the terminal command "glxinfo" prompted me to install mesa-utils, and then I updated. After that, my graphics driver is listed! Thanks for your continued support! :)

    Read the article

  • Audio libraries for PC indie games [closed]

    - by bluescrn
    Possible Duplicate: Cross-Platform Audio API Suggestions

    What options are out there these days for audio playback/mixing in C++? Primarily for Windows, but portability (particularly to Mac and iOS) would be desirable. This is for a small indie game, potentially commercial, so I'm looking for something free or low-cost. My requirements are fairly basic - I don't need 3D sound or many channels; simple stereo is fine. I just need to be able to mix sound effects and a music stream, maybe decoding one or more compressed audio formats (.ogg/.mp3 etc), with all the basic controls over looping, pitch, volume, etc. Is OpenAL more or less the standard choice, or are there other good options out there? (A minimal OpenAL sketch follows below.)

    Read the article
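
    OpenAL (typically via the OpenAL Soft implementation) remains a common free choice for exactly this; SDL_mixer is another frequently used free option. Below is a minimal, hedged sketch of playing an already-decoded stereo PCM clip with OpenAL; decoding .ogg would be handled by a separate library such as stb_vorbis, music streaming would queue buffers with alSourceQueueBuffers, and error checking is omitted for brevity.

        #include <AL/al.h>
        #include <AL/alc.h>
        #include <cstdint>
        #include <vector>

        int main() {
            // Open the default device and make a context current.
            ALCdevice*  device  = alcOpenDevice(nullptr);
            ALCcontext* context = alcCreateContext(device, nullptr);
            alcMakeContextCurrent(context);

            // Pretend this came from a decoder: one second of 16-bit stereo silence at 44.1 kHz.
            std::vector<std::int16_t> pcm(44100 * 2, 0);

            ALuint buffer = 0, source = 0;
            alGenBuffers(1, &buffer);
            alBufferData(buffer, AL_FORMAT_STEREO16, pcm.data(),
                         static_cast<ALsizei>(pcm.size() * sizeof(std::int16_t)), 44100);

            alGenSources(1, &source);
            alSourcei(source, AL_BUFFER, buffer);     // attach the clip
            alSourcef(source, AL_GAIN, 1.0f);         // volume
            alSourcef(source, AL_PITCH, 1.0f);        // pitch
            alSourcei(source, AL_LOOPING, AL_FALSE);  // play once
            alSourcePlay(source);

            // ... run the game loop while the source plays ...

            alDeleteSources(1, &source);
            alDeleteBuffers(1, &buffer);
            alcMakeContextCurrent(nullptr);
            alcDestroyContext(context);
            alcCloseDevice(device);
            return 0;
        }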

  • System locks up at login after latest update on 12.04

    - by RCD
    For starters, I am a noob when it comes to Ubuntu and Linux... but I am hopeful that I have seen the "light" and the error of my past MS ways :) ..... After a recent update last week from 12.04 to 12.04.1, my system now locks up at the login screen. Before the update the system would run, though I must admit it ran EXTREMELY slowly. I have installed 12.04 on a Dell P4 2.4 GHz, 2 GB DDR RAM, 80 GB IDE HD and an integrated Intel 3D Extreme Graphics. Any ideas on why the system is now locking up at the login screen? I appreciate any ideas and/or tips. Thanks

    Read the article

  • Game Asset Storage: Archive vs Individual files

    - by David Colson
    I am in the process of creating a 3D C++ game, and I was wondering what would be more beneficial for storing game assets. I have seen some games use a single compressed asset file with everything in it, and others use lots of little compressed files. With lots of individual files I would not need to load one large file at once and use up memory, but the code would have to seek around when the level loads to find all the correct files. No file seeking is needed when dealing with one large file, but then what about all the assets that aren't currently needed and would get loaded along with it? I could also have an asset file for each level, but then how do I deal with shared assets? This has been bothering me for a while, so tell me what other advantages and disadvantages there are to either way of doing things. (A common middle ground is sketched below.)

    Read the article
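
    A middle ground worth knowing about, sketched here without assuming any particular engine: one archive per level plus a shared archive, each starting with a table of contents, so individual assets can be seeked to and loaded on demand without pulling the whole file into memory; shared assets are referenced from both tables. The on-disk layout below (count, then name/offset/size records, then raw data) is hypothetical.

        #include <cstdint>
        #include <fstream>
        #include <string>
        #include <unordered_map>
        #include <vector>

        // One table-of-contents entry: where an asset lives inside the archive.
        struct TocEntry {
            std::uint64_t offset = 0;   // byte offset of the asset's data
            std::uint64_t size   = 0;   // size in bytes
        };

        class AssetArchive {
        public:
            bool open(const std::string& path) {
                file_.open(path, std::ios::binary);
                if (!file_) return false;

                std::uint32_t count = 0;
                file_.read(reinterpret_cast<char*>(&count), sizeof(count));
                for (std::uint32_t i = 0; i < count; ++i) {
                    std::uint16_t nameLen = 0;
                    file_.read(reinterpret_cast<char*>(&nameLen), sizeof(nameLen));
                    std::string name(nameLen, '\0');
                    file_.read(&name[0], nameLen);
                    TocEntry e;
                    file_.read(reinterpret_cast<char*>(&e.offset), sizeof(e.offset));
                    file_.read(reinterpret_cast<char*>(&e.size), sizeof(e.size));
                    toc_[name] = e;
                }
                return static_cast<bool>(file_);
            }

            // Only the requested asset is read into memory; everything else stays on disk.
            bool load(const std::string& name, std::vector<char>& out) {
                auto it = toc_.find(name);
                if (it == toc_.end()) return false;
                out.resize(static_cast<std::size_t>(it->second.size));
                file_.seekg(static_cast<std::streamoff>(it->second.offset));
                file_.read(out.data(), static_cast<std::streamsize>(out.size()));
                return static_cast<bool>(file_);
            }

        private:
            std::ifstream file_;
            std::unordered_map<std::string, TocEntry> toc_;
        };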

  • How to get the Dash and HUD to appear (and stop Unity spewing error messages)

    - by Ubuntiac
    I just installed Ubuntu 12.04 on my wife's Dell Inspiron 1501, which uses an R300 ATI graphics chip. Neither the Dash nor the HUD appears when pushing the appropriate key. When I try unity --reset & in the terminal, I see that over and over it's spitting out: r300: CS space validation failed. (not enough memory?) Skipping rendering. This is just after starting Ubuntu with no apps open, so I find it hard to believe that just rendering the Dash/HUD is completely blowing out the VRAM. Any suggestions on getting this working? /usr/lib/nux/unity_support_test -p shows:

        OpenGL vendor string: X.Org R300 Project
        OpenGL renderer string: Gallium 0.4 on ATI RS480
        OpenGL version string: 2.1 Mesa 8.0.2
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: yes

    All sections say "YES".

    Read the article

  • Detailed modern OpenGL tutorial?

    - by Kogesho
    I am asking for a specific kind of modern OpenGL tutorial: one that does not skip explaining any line of code. It should also include several independent objects moving/rotating (most tutorials use only one object), as well as imported 3D objects and collision detection for them. It should also avoid material that won't be used. Arcsynthesis, for example, introduces a new concept and, after teaching it, explains in the next tutorial how bad it is for performance and introduces another method. Do you know any?

    Read the article

  • Given two sets of DNA, what does it take to computationally "grow" that person from a fertilised egg and see what they become? [closed]

    - by Nicholas Hill
    My question is essentially entirely in the title, but let me add some points to prevent some "why on earth would you want to do that" sort of answers: This is more of a mind experiment than an attempt to implement real software. For fun. Don't worry about computational speed or the number of available memory bytes. Computers get faster and better all the time. Imagine we have two data files: Mother.dna and Father.dna. What else would be required? (Bonus point for anyone who tells me approximately how many GB each file would be, and whether the files would be exactly the same number of bytes for everyone alive on Earth! A rough estimate is sketched below.) There would ideally need to be a way to see what the egg becomes as it grows into a human adult. If you fancy, feel free to outline the design. I am initially thinking that there'd need to be some sort of volumetric, voxel-based 3D environment for simulation purposes.

    Read the article
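
    On the bonus point, a rough back-of-the-envelope estimate (assuming roughly 3.1 billion base pairs per haploid human genome and 2 bits per base): a single uncompressed copy comes to a bit under 0.8 GB, and a diploid Mother.dna roughly double that. The files would not be byte-identical for everyone, since real genomes differ by insertions, deletions and larger structural variation, not just single-letter swaps.

        #include <cstdio>

        int main() {
            // Assumptions: ~3.1e9 base pairs per haploid genome, 2 bits per base (A/C/G/T).
            const double basePairs   = 3.1e9;
            const double bitsPerBase = 2.0;
            const double bytes       = basePairs * bitsPerBase / 8.0;

            std::printf("haploid: %.2f GB, diploid: %.2f GB\n",
                        bytes / 1e9, 2.0 * bytes / 1e9);   // ~0.78 GB and ~1.55 GB
            return 0;
        }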

  • Broadcom bcm4313 Ubuntu 13.10 connection time out

    - by Wahtever
    After upgrading to Ubuntu 13.10 I keep getting connection timeouts every few seconds, at which point I have to disconnect and reconnect to the WiFi network. The WiFi card worked fine on 13.04 with bcmwl-kernel-source installed, but it is giving problems on 13.10:

        *-network
          description: Wireless interface
          product: BCM4313 802.11bgn Wireless Network Adapter
          vendor: Broadcom Corporation
          physical id: 0
          bus info: pci@0000:02:00.0
          logical name: eth1
          version: 01
          serial: c0:14:3d:cc:c9:c7
          width: 64 bits
          clock: 33MHz
          capabilities: pm msi pciexpress bus_master cap_list ethernet physical wireless
          configuration: broadcast=yes driver=wl0 driverversion=6.30.223.141 (r415941) ip=192.168.1.4 latency=0 multicast=yes wireless=IEEE 802.11abg
          resources: irq:17 memory:f0500000-f0503fff

    How can I fix this? Thanks

    Read the article

  • Alternatives to voxel-based terrain

    - by Neomex
    Are there any alternatives to voxel-based terrain? Such terrain should be fully destructible, allow for arches and overhangs, preserve sharp features where needed, and keep consistent topology. "Maybe you can explain the problem that makes you ask this question? Voxel-based terrain is basically just using a 3D grid of data to store data. There are lots of ways to render that data, but it doesn't get much simpler for storing it." – Byte56 Current isosurface extraction methods aren't the most effective or bug-free. Cubical Marching Squares seems to solve most of the issues, but it is a relatively new method and there aren't many resources about it (I've found a single university paper). Even if we stick to CMS, when we want to add multi-material support we can either divide the surface into multiple meshes or pass a texture array or texture atlas to the shaders; then we are limited to a set number of textures and additionally increase memory usage a lot.

    Read the article

  • IBM presents five IT innovations that could change life in the next five years

    IBM presents five IT innovations that could change life in the next five years. IBM has just unveiled, at the fifth edition of its "Next Five in Five" event, five technological innovations that could change the way we work, live and play over the next five years. IBM looked at current trends and newly developed technologies, and took stock of the research under way in its laboratories around the world. Based on these observations, IBM believes that within five years it will be able to offer us the possibility of interacting with our friends in 3D, powering our computer and laptop batteries and more with the air we breathe...

    Read the article

  • Is it only possible to display 64k vertices on the monitor with 16-bit indices?

    - by Aufziehvogel
    I did the first 3D tutorial over at riemers.net and stumbled upon the fact that my graphics card only supports Shader Model 2.0 (the Reach profile in XNA), which means I can only use Int16 to store the indices (triangle to vertex). This means that I can only address 2^16 = 65536 vertices. Also, I read on the internet that you should prefer 16-bit over 32-bit because not all hardware (like mine) supports 32-bit. Yet I am wondering: do all game scenes really get by with so few vertices? I thought faces of people alone used a lot of polygons (which are made up of vertices?). It's not relevant for me yet, but I am interested: do game scenes use only 65536 vertices? Do you use some trade-off to display more (e.g. 64k in the GPU buffer, the rest in RAM)? Is there some method to get more into the GPU buffer? I already read in some other posts that there seems to be a limit of 64k per mesh too, so maybe you can split stuff into meshes? (A sketch of that splitting approach follows below.)

    Read the article
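
    Game scenes routinely use far more than 65536 vertices in total; the 16-bit limit applies per index buffer (per draw call), so the usual workaround under the Reach profile is exactly the split the asker guesses at: break large meshes into chunks whose unique-vertex count stays at or below 65536 and draw each chunk separately (HiDef-class hardware simply allows 32-bit indices). A generic C++ sketch of the splitting step, independent of XNA:

        #include <cstdint>
        #include <unordered_map>
        #include <vector>

        struct Chunk {
            std::vector<std::uint32_t> vertexIds;   // which original vertices this chunk uses
            std::vector<std::uint16_t> indices;     // 16-bit indices into vertexIds
        };

        // Split a triangle list with 32-bit indices into chunks that each reference
        // at most maxVerts unique vertices, so 16-bit index buffers suffice.
        std::vector<Chunk> splitIntoChunks(const std::vector<std::uint32_t>& indices,
                                           std::size_t maxVerts = 65536) {
            std::vector<Chunk> chunks(1);
            std::unordered_map<std::uint32_t, std::uint16_t> remap;   // original -> local index

            for (std::size_t tri = 0; tri + 2 < indices.size(); tri += 3) {
                // How many of this triangle's vertices are new to the current chunk?
                std::size_t newVerts = 0;
                for (int k = 0; k < 3; ++k)
                    if (remap.find(indices[tri + k]) == remap.end()) ++newVerts;

                // Start a new chunk if adding the triangle would overflow the 16-bit range.
                if (chunks.back().vertexIds.size() + newVerts > maxVerts) {
                    chunks.emplace_back();
                    remap.clear();
                }

                Chunk& c = chunks.back();
                for (int k = 0; k < 3; ++k) {
                    std::uint32_t original = indices[tri + k];
                    auto it = remap.find(original);
                    if (it == remap.end()) {
                        std::uint16_t local = static_cast<std::uint16_t>(c.vertexIds.size());
                        remap[original] = local;
                        c.vertexIds.push_back(original);
                        c.indices.push_back(local);
                    } else {
                        c.indices.push_back(it->second);
                    }
                }
            }
            // Each chunk becomes its own vertex/index buffer pair and one draw call.
            return chunks;
        }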

  • VMWare Player pauses often

    - by pascal
    I'm using 64-bit Windows 8 inside VMware Player, with 2 virtual processor cores; the virtual hard disk resides on a fast local disc and is not preallocated; the host CPU is an Intel i7 3770, which should be capable of hardware virtualisation, but I don't know if VMware uses it; NAT networking; sound card connected, USB connected, accelerated 3D graphics (NVidia 313.30 on the host). My problem is that the VM often pauses for a few seconds, and then speeds up for a few seconds to reach real time again. Time in the VM actually moves faster after the pause; for example, all animations using timers speed up. When running, the vmware-vmx process shows ~150% CPU usage in top, but 0% when pausing (and it is in the D state, i.e. waiting for IO). iotop shows normal disk writes from vmware-vmx threads, but during pauses the flush kernel thread uses 99%. Are there some options to try so that VMware doesn't wait for IO? I've tried a few things available from the GUI but the issue never went away…

    Read the article
