Search Results

Search found 11395 results on 456 pages for 'intel integrated graphics'.


  • Intel keeps the suspense going on Sandy Bridge, releasing information drop by drop about the future micro-architecture of its processors

    Intel keeps the suspense going on Sandy Bridge, releasing information drop by drop about the future micro-architecture of its processors. Sandy Bridge is the successor to Nehalem, the micro-architecture Intel currently uses for its processors. Although little is known yet about this new technology, it is already generating plenty of discussion, and Intel's communication strategy of releasing information drop by drop certainly has something to do with it. At the latest Intel Developer Forum (IDF), which has just ended, David Perlmutter - Executive Vice President, Intel Architecture Group - nevertheless let a few clues slip:

    Read the article

  • C++ Intel TBB: version 3 of the open source library for parallel development is out

    Intel's open source TBB library for parallel programming has just been released in version 3. Intel announced today the release of the third version of its TBB (Threading Building Blocks) library. This C++ library, available as open source, aims to make parallel programming accessible so that applications can exploit the resources of today's multi-core machines. Quote: Today, Intel released Intel® Threading Building Blocks (Intel® TBB) 3.0, a high-level parallel programming toolkit that ...
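    As a rough illustration of what the library offers, here is a minimal sketch of a TBB parallel loop (the vector name and size are made up for the example, and a compiler with lambda support is assumed):

        #include <cstddef>
        #include <vector>
        #include "tbb/blocked_range.h"
        #include "tbb/parallel_for.h"

        int main() {
            std::vector<double> data(1000000, 2.0);
            // TBB splits the index range into chunks and runs them on the available cores.
            tbb::parallel_for(tbb::blocked_range<std::size_t>(0, data.size()),
                              [&](const tbb::blocked_range<std::size_t>& r) {
                                  for (std::size_t i = r.begin(); i != r.end(); ++i)
                                      data[i] *= data[i];  // square each element in parallel
                              });
            return 0;
        }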

    Read the article

  • Shouldn't WP8 Emulator Work With an Intel Q9650?

    - by Al Bundy
    My computer is as follows:
    - Windows 8 Pro
    - Visual Studio 12 Pro
    - Asus P5Q Pro Turbo
    - Intel Q9650 (Core 2 Quad 3.0 GHz)
    As far as I can tell, this setup should support the Windows Phone 8 Emulator, but when I installed the Windows Phone 8 SDK, it said that my computer doesn't support hardware virtualization. It says here that it does: http://ark.intel.com/products/35428/Intel-Core2-Quad-Processor-Q9650-(12M-Cache-3_00-GHz-1333-MHz-FSB)

    Read the article

  • Installing Heatsink (XIGMATEK Gaia SD1283) onto Motherboard (ASRock X79 Extreme9 LGA 2011 Intel X79)

    - by Mike Hagstrom
    I recently purchased a XIGMATEK Gaia SD1283; the instructions describe two different types of installation for Intel LGA 2011, listed as Type 1 and Type 2. Now, I have the Intel Core i7-3930K (http://www.newegg.com/Product/Product.aspx?Item=N82E16819116492) and the ASRock X79 Extreme9 LGA 2011 Intel X79 (http://www.newegg.com/Product/Product.aspx?Item=N82E16813157285). Which type do I use for the installation of this heatsink?

    Read the article

  • Hyper-V Blue Screens with Nvidia GeForce 8400 GS Graphics Card

    - by Mahmoud Saleh
    I am using Windows Server 2008 R2 Enterprise x64. After installing the Hyper-V role and restarting the machine, I get a blue screen error and an immediate reboot. I have Googled the issue and tracked it down to the graphics card, so I uninstalled it, and then Windows loads fine. However, after installing the graphics driver again, the Blue Screen returns. The graphics card is an Nvidia GeForce 8400 GS. Does anyone know how I can resolve this issue?

    Read the article

  • Motherboard: Intel S5520HCR s1366 SSI EEB

    - by Crazy_Bash
    I'm building a storage server for online video streaming. I'm thinking of adding two SSD drives for the OS; the other 15 drives (12 SATA and 3 SSD) I want to set up with aufs/XFS, with a 4GB/sec Ethernet network. But I'm a little confused. The S5520HCR board supports 6 SATA/300 ports with RAID 0, 1, 10 (Intel ICH10R). Does that mean I can use SATA III HDDs? I'm planning on buying the Seagate SV35 Series (3.5", 3TB, 64MB, SATA III-600). Also, my chassis supports up to 16 SATA drives but the motherboard only 6; what kind of SATA controller should I use? And what's better in terms of performance, socket 1366 or 2011? My server so far:
    AIC RSC-3EG-80R-SA1S-2 3U
    Motherboard: Intel S5520HCR s1366 SSI EEB
    Kingston DDR3 8192Mb PC3-10600 1333MHz (KVR1333D3N9/8G)
    Seagate 3000GB 64MB 3.5" 7200rpm SATAIII (ST3000DM001)
    Kingston 480GB SSD 2.5" SATAIII
    Intel E1G44HTBLK
    Intel Xeon E5606 2133MHz/L3-8192Kb/QPI s1366 tray
    SERVER ACC CARD SAS PCIE 16P HBA 9201-16I LSI00244 SGL LSI

    Read the article

  • How do I keep from running out of memory on graphics for an Android app?

    - by user279112
    I've been working on an Android app in Eclipse, and so far, my program hasn't really grown past midget size. However, I've already run into an issue with an Out of Memory error. You see, I've been using graphics comprised solely of bitmaps and PNGs in this program, and recently, when I tried to add a little bit more functionality (mainly including a few more bitmaps and causing an extra sprite to be created), it started crashing in the graphics thread's constructor - the sprite's constructor. When I tracked the problem down, it turned out to be an Out of Memory error that is seemingly caused by adding too many picture files to the program and creating Drawables out of them. This would be a problem, as I really don't have that many picture resources worked into the program... maybe 20 or so. I haven't even started to include sound yet. These images aren't all that fancy. My questions are these: 1) Are programs for the Android phone really that limited in how much memory they can employ, or is it probably something other than the 20-30 resource pictures causing that error? 2) If the memory for Android apps is so awful it can't even handle 20-30 picture resources being loaded into Drawables that exist at the same time, then how in the world are you supposed to make decent graphics and sound for that thing? Thanks.

    Read the article

  • Ubuntu backlight problem with Nvidia graphics

    - by Vladimir
    I have a laptop mySN QMG6 / Chiligreen Mobilitas NW, which is a Quanta TW9 barebone with an Intel i3 and an Nvidia GT 335M onboard. On the Ubuntu releases 10.04, 10.10, 11.04 and 11.10 I had problems changing the screen backlight with both the nouveau and nvidia drivers: the FN+F4/F5 buttons did not change the brightness. I tried to edit xorg.conf, adding Option "RegistryDwords" "EnableBrightnessControl=1". I also tried to add some lines to grub: acpi_osi="Linux" acpi_backlight=vendor. Neither worked for me. Today I installed Ubuntu 12.04 beta2 and... with the nouveau driver my FN key works and changes the brightness (is it the new 3.0.22 Linux kernel, or a patched nouveau driver? I don't know). This is a big step forward. But when installing the proprietary nvidia driver (295.33), the FN button stops working and I can't change the brightness. I also tried the workaround with xorg and grub, with no result. Tried to install acpi from apt - no result. Is there anything left to try? I really need that nvidia driver working with the FN keys, as I would like to have working 3D acceleration. P.S. Does the nouveau driver have 3D acceleration like the nvidia drivers? If there is a need to provide some log data, please write what I should print, as I'm a bit new to Ubuntu. P.P.S. I had the same problems with other Linux distros (Mint, Fedora and others). P.P.P.S. The other FN buttons work with both drivers (Mute, VOL UP/DOWN, WiFi on/off, Bluetooth, Sleep, Start/Pause, Stop, Next/Prev song).

    Read the article

  • Stable Ubuntu distribution for a Broadcom BCM4313 wireless driver and an Nvidia 630 graphics adapter

    - by Vivek Pradhan
    I have been trying to completely shift to Ubuntu or another Linux distro for almost 2 years now. I have tried all the Ubuntu releases from 10.04 through 11.10, but there have always been some bugs, with the display drivers or the wireless cards not being recognised, and the additional drivers suggested by the Ubuntu community not always doing the trick. I tried a lot to fix bugs, checked a lot of forums and Launchpad, got some of them fixed, but could not really get a neat and complete Ubuntu machine set up till now. Now, I really like the whole open source community and the Linux platform; my PC at home runs Ubuntu 11.04 perfectly, but there have always been some glitches with Ubuntu on my laptop that have forced me to stick with Windows only. I am currently on an HP dv6 laptop that has the Broadcom BCM4313 wireless chip and the Nvidia GT 630M (Optimus) graphics. I tried to do some research on the support of these drivers on Linux machines but could not get anywhere. So I would really appreciate it if you guys could suggest some Linux distro that has full support for these drivers, or has stable bug fixes for these kinds of issues. I tried Precise Pangolin (LTS) through a live CD but I still see a problem with wifi networks, which is a little frustrating. Please help me find the perfect match for my laptop :P I would gladly provide any other information necessary.

    Read the article

  • Kubuntu 12.10 will not boot on a Mac with a 2.93GHz Intel Core 2 Duo

    - by Jake Sweet
    I feel like I've tried it all and nothing is changing. I've tried booting from a live USB and a live DVD, and I've checked the md5; everything matches up. I've even tried different distros, with the same result on all of them (just for reference: Linux Mint 13 KDE and Fedora 17). I've also tried changing my live USB building software just in case; I've tried unetbootin and Linux USB builder, both with the same results. My opinion is that it is a hardware issue, since I'm having nearly the same result with all of these variables. So what is actually happening? I can boot up to a screen. I say "a screen" because the ways that DVDs and USBs boot differ somewhat. With the live USB I reach a black screen with white text. It says booting: done, then below it says loading ramdrive: done, then below that it says preparing to boot kernel, this may take a while, and buckle in, or something to that effect. Then nothing. That's it, the computer freezes. I've waited up to 8 hrs and still nothing. OK, for the live DVD: everything goes according to the instructions in the PDF files for every distro, until Linux starts. I can only run in compatibility mode. When any other option is tried, the computer seems to freeze/stall/be a pain in my butt... OK, well, that seems to wrap it up. Also, if I'm not explaining something well, I'm sorry; I can try to clear anything up. I'm not the best at descriptions. I'm leaving the tech specs of my Mac: 2.93GHz Intel Core 2 Duo, 4 GB RAM, NVIDIA GeForce GT 120 graphics, bought in late '09, it's the 24" model. Let me know if any more information will help. Also, thanks in advance.

    Read the article

  • Mixing XNA and Silverlight gives weird graphics

    - by Mech0z
    I'm making a small 3D game as a Silverlight and XNA app, but when I draw the sprites the graphics become all weird. All my primitive types are rendered correctly, but my 3D models are just weird. My Draw method is like this when Silverlight is set to draw:

    private void OnDraw(object sender, GameTimerEventArgs e)
    {
        // Render the Silverlight controls using the UIElementRenderer
        elementRenderer.Render();
        // Clear the screen to a solid color
        SharedGraphicsDeviceManager.Current.GraphicsDevice.Clear(Color.CornflowerBlue);
        switch (gameState)
        {
            case GameState.ChooseStarter:
                TextBlockStatus.Text = "Find Starting Player";
                break;
            case GameState.PlaceBrick:
                TextBlockPlayer.Text = (playerTurn == PlayerTurn.PlayerOne) ? "Player One" : "Player Two";
                TextBlockState.Text = "Place Brick";
                foreach (IGraphicObject obj in _3dObjects)
                {
                    obj.Draw(cameraPosition, e);
                }
                break;
            case GameState.GiveBrick:
                TextBlockState.Text = "Give Brick";
                break;
        }
        spriteBatch.Begin();
        // Using the texture from the UIElementRenderer,
        // draw the Silverlight controls to the screen
        spriteBatch.Draw(elementRenderer.Texture, cameraProjection, Color.White);
        spriteBatch.End();
    }

    This gives me this output. If I comment the spritebatch lines out I get the correct output, except the Silverlight text is of course not shown. I am not entirely sure what to look for, except that zero vector I am giving to the spritebatch; but if that's the source, I have no idea what I am supposed to set it to, especially when it's a 2D vector.

    Read the article

  • Intel 82576 Network card

    - by No1_Melman
    I have an Intel dual-port PCIe NIC with two 82576 interfaces, according to Ubuntu 12.04. I run the command sudo lshw -html > /home/melman/Documents/hardware.html and it shows both of the interfaces, but they're grayed out?! How can I enable them? ifconfig output:

    bond0     Link encap:Ethernet  HWaddr 00:00:00:00:00:00
              inet addr:192.168.100.2  Bcast:192.168.100.255  Mask:255.255.255.0
              UP BROADCAST MASTER MULTICAST  MTU:1500  Metric:1
              RX packets:0 errors:0 dropped:0 overruns:0 frame:0
              TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:0
              RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)

    eth0      Link encap:Ethernet  HWaddr e0:69:95:d1:db:ff
              inet addr:192.168.10.63  Bcast:192.168.10.255  Mask:255.255.255.0
              inet6 addr: fe80::e269:95ff:fed1:dbff/64 Scope:Link
              UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
              RX packets:2903 errors:0 dropped:0 overruns:0 frame:0
              TX packets:2627 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:1000
              RX bytes:1524738 (1.5 MB)  TX bytes:430196 (430.1 KB)
              Interrupt:20 Memory:f7f00000-f7f20000

    eth3      Link encap:Ethernet  HWaddr 00:50:b6:50:a7:f9
              BROADCAST MULTICAST  MTU:1500  Metric:1
              RX packets:0 errors:0 dropped:0 overruns:0 frame:0
              TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:1000
              RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)

    eth4      Link encap:Ethernet  HWaddr 00:1b:21:6e:99:77
              BROADCAST MULTICAST  MTU:1500  Metric:1
              RX packets:0 errors:0 dropped:0 overruns:0 frame:0
              TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:1000
              RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)
              Memory:f7c00000-f7c20000

    eth5      Link encap:Ethernet  HWaddr 00:1b:21:6e:99:76
              BROADCAST MULTICAST  MTU:1500  Metric:1
              RX packets:0 errors:0 dropped:0 overruns:0 frame:0
              TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:1000
              RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)
              Memory:f7c20000-f7c40000

    lo        Link encap:Local Loopback
              inet addr:127.0.0.1  Mask:255.0.0.0
              inet6 addr: ::1/128 Scope:Host
              UP LOOPBACK RUNNING  MTU:16436  Metric:1
              RX packets:246 errors:0 dropped:0 overruns:0 frame:0
              TX packets:246 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:0
              RX bytes:17584 (17.5 KB)  TX bytes:17584 (17.5 KB)

    Read the article

  • Intel Atom getting too hot

    - by user59565
    I own an Asus 1215N which is getting very hot:
    - Intel Atom D525 dual core
    - ION 2 (GeForce 220 + GMA 3150)
    - 4 GB RAM
    - Ubuntu 12.04
    It hits 86 C at idle. Sometimes (under load, or after being turned on for an hour) it shuts down due to heat; in Windows 7 it runs idle at 49 C. I tried an ACPI call to shut down the Nvidia chip, which is cooled together with the Atom chip. That didn't solve the problem. To check whether it really turned off, I looked at how much power the laptop consumed; it only went from 1400 mW to 860 mW, with no change in heat. I also tried reapplying the standard thermal paste; the old paste made it run at 97 C (it couldn't even get through a usable install of Ubuntu). This really annoys me, as Ubuntu is the OS of choice for me. Should I try compiling the kernel? Is it true that compiling for a P4 is the better choice for the Atom, when compiling the kernel for this processor architecture? I have now tried compiling the kernel for the Atom. Temperature is now 83 C (I think the drop has more to do with ambient temperature than the customized kernel). Help appreciated.

    Read the article

  • Keeping game model and graphics/animation separate but in sync

    - by AJM
    Suppose I'm building a chess game where I want to have animations. Pieces glide to their new squares when moved. Pieces perform attack animations when capturing other pieces. I'm not sure how to effectively separate the data and logic needed for these animations and the actual game model (in the MVC sense). The pieces themselves should ideally not have to worry about their pixel coordinates or current animation frame. At the same time, many changes to the model are effectively driven by animations. A moved piece changes its position after (before?) its sprite is done gliding. A piece is removed from the board after the capturing piece is finished its attack animation. How would you suggest I manage the game model, the graphics and animations, and their relationships? For example, where would the animations "live"? How would animations be created and managed in response to player moves? How would animations drive updates to the game model, or how would the game model drive animations?
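    One possible arrangement, sketched below purely as an illustration (the class and member names are invented, not taken from the question): keep the board model purely logical, and let the view own an animation queue whose completion callbacks are the only place the model is mutated, so a move is committed exactly when its glide or attack animation finishes.

        #include <cstdio>
        #include <functional>
        #include <queue>

        // Hypothetical sketch: the animation lives in the view layer and carries a
        // callback that commits the corresponding change to the game model when done.
        struct Animation {
            float duration = 0.0f;            // seconds the sprite needs to finish
            float elapsed = 0.0f;
            std::function<void(float)> tick;  // advances sprite position/frame (progress 0..1)
            std::function<void()> onDone;     // commits the model change (e.g. board.move)
        };

        class AnimationQueue {
        public:
            void push(Animation a) { pending_.push(std::move(a)); }

            // Called once per frame by the view; the model is only touched in onDone.
            void update(float dt) {
                if (pending_.empty()) return;
                Animation& a = pending_.front();
                a.elapsed += dt;
                a.tick(a.elapsed < a.duration ? a.elapsed / a.duration : 1.0f);
                if (a.elapsed >= a.duration) {
                    a.onDone();
                    pending_.pop();
                }
            }

            bool idle() const { return pending_.empty(); }

        private:
            std::queue<Animation> pending_;
        };

        int main() {
            AnimationQueue queue;
            Animation glide;
            glide.duration = 0.5f;
            glide.tick = [](float t) { std::printf("glide progress %.2f\n", t); };
            glide.onDone = []() { std::printf("model: piece committed to new square\n"); };
            queue.push(glide);
            for (int frame = 0; frame < 40 && !queue.idle(); ++frame)
                queue.update(1.0f / 60.0f);  // 60 fps fixed step
            return 0;
        }

    Whether onDone fires at the start or the end of the glide is exactly the before/after timing question raised above; keeping that decision in one place makes it easy to change.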

    Read the article

  • Designing generic render/graphics component in C++?

    - by s73v3r
    I'm trying to learn more about component entity systems. So I decided to write a Tetris clone. I'm using the "style" of component-entity system where the Entity is just a bag of Components, the Components are just data, a Node is a set of Components needed to accomplish something, and a System is a set of methods that operates on a Node. All of my components inherit from a basic IComponent interface. I'm trying to figure out how to design the Render/Graphics/Drawable Components. Originally, I was going to use SFML, and everything was going to be good. However, as this is an experimental system, I got the idea of being able to change out the render library at will. I thought that since the rendering would be fairly componentized, this should be doable. However, I'm having problems figuring out how I would design a common interface for the different types of Render Components. Should I be using C++ template types? It seems that having the RenderComponent somehow return its own mesh/sprite/whatever to the RenderSystem would be the simplest, but would be difficult to generalize. However, letting the RenderComponent just hold on to data about what it would render would make it hard to re-use this component for different renderable objects (background, falling piece, field of already fallen blocks, etc). I realize this is fairly over-engineered for a regular Tetris clone, but I'm trying to learn about component entity systems and making interchangeable components. It's just that rendering seems to be the hardest to split out for me.
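    As a hedged sketch of one way to cut this (the names and structure are invented for illustration, not taken from the question): keep the RenderComponent as plain backend-neutral data, and put the backend-specific drawing knowledge in a RenderSystem behind a small interface, so swapping SFML for another library means swapping the system, not the components.

        #include <string>
        #include <vector>

        struct IComponent { virtual ~IComponent() = default; };

        // Pure data: nothing here knows about SFML, SDL or any other backend.
        struct RenderComponent : IComponent {
            std::string textureId;  // resolved to a real texture by the active backend
            float x = 0.0f, y = 0.0f;
            int frame = 0;          // current animation frame
        };

        // The only thing a rendering backend has to implement.
        class IRenderSystem {
        public:
            virtual ~IRenderSystem() = default;
            virtual void draw(const std::vector<const RenderComponent*>& visible) = 0;
        };

        // One backend; an SDL or software renderer would implement the same interface.
        class SfmlRenderSystem : public IRenderSystem {
        public:
            void draw(const std::vector<const RenderComponent*>& visible) override {
                for (const RenderComponent* rc : visible) {
                    // look up the sprite for rc->textureId, move it to (rc->x, rc->y),
                    // select rc->frame, and issue the backend draw call here
                    (void)rc;
                }
            }
        };

        int main() {
            RenderComponent piece;
            piece.textureId = "falling_piece";
            SfmlRenderSystem renderer;
            renderer.draw({&piece});  // the game loop would pass every visible component
            return 0;
        }

    The trade-off mentioned in the question is still there, but it stays local: components that need different data (a tiled background versus a single sprite) can be different component types, while each backend only has to implement IRenderSystem once.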

    Read the article

  • NVIDIA Graphics - resolution problems with new 12.04 LTS installation

    - by Daveisuser56810
    I've been trying to install Ubuntu 12.04 LTS on my desktop most of the day. The desktop uses an NVIDIA GeForce 9800 (GT, I think) graphics card. I am unable to set the correct resolution (1680 x 1050) for the display. The first problem I had was the "black screen" during install. I overcame this by utilising the "nomodeset" switch in the install options (once I'd found out how to do that). The second problem, of course, was the black screen following the first reboot. Once again this was overcome by using "nomodeset", this time by editing the GRUB entry. This gave me a resolution of 1280x768, which the Displays GUI allowed me to change to 1280x720 (appears to fit on screen). I then tried to install the NVIDIA drivers:
    1) using Additional Drivers
    2) manually, by downloading the driver and installing it as root
    As soon as the NVIDIA drivers are installed, the resolution becomes restricted to 640x480 (max). At this resolution the Ubuntu GUI is not usable, as most screens are larger than the display. Removing the NVIDIA driver and removing the xorg.conf file does not lift this restriction. I have tried most things that I have found and that were vaguely intelligible, but nothing appears to get me closer to a resolution of 1680x1050. UPDATE: reinstalled Ubuntu 12.04 and used "nomodeset" in the GRUB entry to restore the resolution to 1280x720, which is at least usable. Will live with this for now.

    Read the article

  • Restart and/or graphics problem in Ubuntu 12.04

    - by kara
    I have been using 12.04 for a couple of months now, with very few problems. The other day I restarted my computer, and though I think it rebooted, the screen stayed black. I could not even get a visual from a live CD. Finally, I was able to get it to load, but the resolution has been completely off. The computer thinks I have a laptop screen, when I actually have a ViewSonic VP2330wb, and it detects only two resolutions. And still, I have a problem with rebooting: if the screen locks after I leave it for a while, I can't get a visual back, and then when I force a shutdown, it takes 3 tries for me to get a GRUB screen. Then I have to boot in recovery mode, and then finally in normal mode, but the screen is still always off. This is my video card:

    description: VGA compatible controller
    product: 2nd Generation Core Processor Family Integrated Graphics Controller
    vendor: Intel Corporation
    physical id: 2
    bus info: pci@0000:00:02.0
    version: 09
    width: 64 bits
    clock: 33MHz
    capabilities: msi pm vga_controller bus_master cap_list
    configuration: latency=0
    resources: memory:fe000000-fe3fffff memory:d0000000-dfffffff ioport:f000(size=64)

    I am a new Ubuntu user, and am at my wits' end. Any help would be greatly appreciated.

    Read the article

  • Hybrid Graphics on Ubuntu 12.04 switching to discrete

    - by cfstras
    I have a Sony Vaio VPCCB-27FX with hybrid graphics. Using vgaswitcheroo enables me to switch my discrete card off to save power. Now, when I want to switch to the discrete card for performance, my system freezes. I already tried logging out and killing X with service lightdm stop, but it still freezes as soon as I echo DIS > switch. Typing blindly, echo IGD > switch returns me to my console, where it reads [ 179.555171] i915: switched off, but it seems the discrete card never gets switched on... Running echo DDIS > switch gives me the following:

    [540....] [drm:atop_op_jump] *ERROR* atombios stuck in loop for more than 5secs aborting
    [540....] [drm:atom_execute_table_locked] *ERROR* atombios stuck executing CEE2 (len 62, WS 0, PS 0) @ 0xCEFE
    [540....] [drm:atom_execute_table_locked] *ERROR* atombios stuck executing BBF6 (len 1036, WS 4, PS 0) @ 0xBCF3
    [540....] [drm:atom_execute_table_locked] *ERROR* atombios stuck executing BB8C (len 76, WS 0, PS 0) @ 0xBB94
    [541....] [drm:r600_RING_TEST] *ERROR* radeon: ring test failed (scratch(0x8504)=0xFFFFFFFF)
    [541....] [drm:evergreen_resume] *ERROR* evergreen startup failed on resume

    After that, the atombios part repeats a few times. Also, the terminal locks up again and SysRq+REISUB is my only rescue. Does anybody have an idea how I can switch to my discrete card without the system locking up?

    # uname -srvmpio
    Linux 3.2.0-24-generic #39-Ubuntu SMP Mon May 21 16:52:17 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux
    # lsb_release -r
    Description: Ubuntu 12.04 LTS

    Read the article

  • AMD E-450 APU with HD-6320 graphics produces jerky videos

    - by user80424
    I am trying to get videos to play smoothly on a Lenovo E325 laptop equipped with an AMD E-450 APU. This processor has an ATI HD 6320 GPU integrated. I installed the ATI proprietary driver (Catalyst 12.04) as described here. Everything went fine and I got no errors. However, I cannot play HD videos smoothly: almost every second frame is dropped in VLC with hardware acceleration enabled. vainfo shows:

    libva: VA-API version 0.32.0
    Xlib: extension "XFree86-DRI" missing on display ":0".
    libva: va_getDriverName() returns 0
    libva: Trying to open /usr/lib/x86_64-linux-gnu/dri/fglrx_drv_video.so
    libva: va_openDriver() returns 0
    vainfo: VA-API version: 0.32 (libva 1.0.15)
    vainfo: Driver version: Splitted-Desktop Systems XvBA backend for VA-API - 0.7.8
    vainfo: Supported profile and entrypoints
      VAProfileH264High : VAEntrypointVLD
      VAProfileVC1Advanced : VAEntrypointVLD

    fglrxinfo says:

    display: :0  screen: 0
    OpenGL vendor string: Advanced Micro Devices, Inc.
    OpenGL renderer string: AMD Radeon HD 6320 Graphics
    OpenGL version string: 4.2.11631 Compatibility Profile Context

    and fgl_glxgears produces ~250 fps. Why are HD video frames dropped? The CPU doesn't go above 50% during playback.

    Read the article

  • Ubuntu 14.04 install fails with Via S3 UniChrome Pro graphics

    - by WizardNo.7
    I am trying to install Ubuntu 14.04 on a Fujitsu Siemens Amilo Pro laptop (it's quite old, yes; it has about a 30GB hard drive and I think 192MB of RAM) which currently has Windows XP installed (which I'd like to keep for the time being). I have downloaded the 32-bit desktop ISO and used unetbootin to create a live USB for this laptop. When I boot from USB, I arrive at the unetbootin grey and blue menu and pick either "Try Ubuntu without installing" or "Install Ubuntu". The menu goes away and an Ubuntu load screen appears, showing UBUNTU and four dots which progressively change between white and orange. At about the second colour-changing cycle, a white underscore symbol appears next to the fourth dot and flickers. There is some leftover text from the kernel boot still visible, but there is no graphical desktop. After this I have to hard reboot or shut down.

    $ lshw -c video
    *-display UNCLAIMED
      description: VGA compatible controller
      product: CN700/P4M800 Pro/P4M800 CE/VN800 Graphics [S3 UniChrome Pro]
      vendor: VIA Technologies, Inc.
      physical id: 0
      bus info: pci@0000:01:00.0
      version: 01
      width: 32 bits
      clock: 66MHz
      capabilities: pm agp agp-2.0 vga_controller bus_master cap_list
      configuration: latency=64 mingnt=2
      resources: memory:f0000000-f3ffffff memory:d1000000-d1ffffff

    Read the article

  • Rendering 8-bit graphics

    - by Matjaz Muhic
    I have a strong programming background, just not in game development. I only made some Pong and Snake in high school, and I did some OpenGL in college. I want to make my own game engine. Nothing fancy, just a simple 2D game engine. But because I'm kinda old school and feeling retro, I want graphics to look like old 8-bit games (Mega Man, Contra, Super Mario, ...). So how were the old games made back then? I want the simplest approach. Were they also using assets (images) like newer engines now do? How do you achieve this kind of rendering using OpenGL? Keep in mind: simplest solution. I want to know how it was made back then and how I can replicate that. It doesn't even have to be OpenGL; I can draw on a window canvas. I do want to make it from scratch, basically.
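    For the OpenGL side of the question, one widely used trick (sketched below purely as an illustration; the resolution constants and function names are made up for the example) is to render the whole scene into a small off-screen framebuffer at a console-like resolution and then blit it to the window with nearest-neighbour filtering, which keeps the pixels chunky:

        #include <GL/glew.h>  // or any loader that provides the framebuffer object entry points

        const int kLowW = 256, kLowH = 224;  // internal "console" resolution (assumed values)
        GLuint fbo = 0, tex = 0;

        void createLowResTarget() {
            // Colour texture that the scene will be drawn into.
            glGenTextures(1, &tex);
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, kLowW, kLowH, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

            // Framebuffer object that uses the texture as its colour attachment.
            glGenFramebuffers(1, &fbo);
            glBindFramebuffer(GL_FRAMEBUFFER, fbo);
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                   GL_TEXTURE_2D, tex, 0);
            glBindFramebuffer(GL_FRAMEBUFFER, 0);
        }

        void renderFrame(int windowW, int windowH) {
            // 1) draw the whole scene at the low internal resolution
            glBindFramebuffer(GL_FRAMEBUFFER, fbo);
            glViewport(0, 0, kLowW, kLowH);
            // ... draw sprites/tiles here ...

            // 2) scale the result up to the real window without smoothing
            glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
            glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
            glBlitFramebuffer(0, 0, kLowW, kLowH, 0, 0, windowW, windowH,
                              GL_COLOR_BUFFER_BIT, GL_NEAREST);
        }

    A small fixed palette and sprite sheets drawn at the internal resolution then give most of the 8-bit look; the original consoles instead composed tiles and sprites in hardware, which is what the "how was it done back then" part of the question is about.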

    Read the article

  • How can I switch between the HDMI and DVI outputs of my graphics card?

    - by Owen Melbourne
    I've got an Nvidia GTX 560 Ti card, and currently I have my 2 monitors hooked up to it using the 2 DVI ports. However, it's also got a mini-HDMI port, into which I've plugged an HDMI cable (with a mini adapter) and led it to my TV across the room. I'm hoping to be able to toggle between the HDMI output and the DVI outputs, but I'm not sure how I'd go about this. Could somebody please point me in the right direction? I'm not really worried about having all 3 on at the same time, so that isn't a problem, but if it's possible then I'll do that.

    Read the article

  • Can I get dual monitor support (2xDVI) from my (DVI+HDMI) graphics card?

    - by nray
    It seems that pure 2 x DVI dual-head video cards are becoming rare. Most cards feature something like 1 x DVI plus 1 x HDMI plus 1 x VGA, or some other combination of interfaces. The idea seems to be that you can just use an HDMI-to-DVI adapter. One result is that cards are seldom marked "2 x DVI" anymore, but does this mean that they support simultaneous output on all interfaces? Are all cards dual-head these days? Take Asus's Nvidia cards, for example: they routinely have 1 x DVI plus 1 x HDMI instead of 2 x DVI. So my question is, are these equivalent to a dual-head DVI card, or is there some detail required for dual-monitor support? I use dual-monitor stretched desktops for digital signage projects.

    Read the article
