Search Results

Search found 8165 results on 327 pages for '3d graphics'.

  • How to draw a spotlight in 3D

    - by RecursiveCall
    To be clear, I am not talking about the light's result (the lit area) but the visible spotlight beam itself. The two common suggestions that I tried are a 2D image and a 3D cone. The problem with the pre-generated 2D image is that it always looks 2D and flat no matter how it is rotated in world space. The cone, on the other hand, is next to impossible to control when it comes to fade distance; it doesn't look soft (smooth), and it is expensive to compute.
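
    A common compromise is a cone mesh drawn with additive blending whose alpha fades both along the beam and toward the rim, so the geometry never shows a hard edge. The snippet below is only a minimal sketch of that falloff in plain Python - the function name, parameters, and exact curves are illustrative assumptions, not something from the post - and the resulting values would be baked into vertex colors or evaluated in a fragment shader.

        import math

        def cone_vertex_alpha(dist_along_axis, cone_length, angle_from_axis, half_angle, rim_softness=0.2):
            """Soft alpha for one vertex of a translucent spotlight cone (illustrative only)."""
            # Fade along the beam; squaring gives a softer tail than a straight linear fade.
            axial = max(0.0, 1.0 - dist_along_axis / cone_length) ** 2
            # Fade from the cone's axis out to its rim.
            radial = max(0.0, 1.0 - angle_from_axis / half_angle)
            # Keep a bright core and soften only the outer band of the cone.
            radial = min(1.0, radial / rim_softness)
            return axial * radial

    Fading with distance then becomes a matter of tuning cone_length and the exponent rather than fighting the mesh itself.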

  • Is the cooling fan on a 2004 graphics card really necessary?

    - by Andrew
    I have an old GeForce 6600 card in my computer, circa 2004. Recently the fan has started playing up and making loud, irritating noises. I've tried oiling it with no luck. This is the second fan I've put on the card; the stock one broke ages ago. Is a card this old really likely to need a cooling fan, or can I remove it altogether? It has a decent heatsink on the chip, but there's not a lot of airflow in that part of the box. Edit: I should add that I seem to remember most mid-range graphics cards at the time I bought it didn't have fans (pretty sure they had heatsinks only), which is why I'm wondering.

  • How to solve ATI Radeon 7670M/ Intel HD Switchable Graphics recurring BSOD error?

    - by Ashin Mandal
    I recently bought a VAIO E Series laptop with an ATI Radeon 7670M and Intel HD Switchable Graphics. About a month ago, my computer started shutting down with an atikmpag.sys blue screen error, and the errors have kept getting more frequent over time. I am really worried that my VGA adapter has gone bad - it's a three-month-old laptop! I searched Google for the atikmpag.sys BSOD, and most results suggested that my driver probably needs to be updated. I checked the Sony website and the ATI website, and I already had the latest drivers installed. I uninstalled the driver and installed it again anyway, but the blue screens just keep coming back. It generally happens more often when I have a video running and have paused it to do something else. I hope it's only a driver issue!

  • Client PC not booting when certain TFT plugged in - TFT or graphics card failure?

    - by Chake
    Here comes something quite strange: on a client machine (Dell Vostro 420) we experienced problems when booting: when turned on, the machine beeps normally but doesn't display anything and doesn't boot. After some testing I found out that this only happens if one of the two monitors (an Iiyama ProLite E2472HDD) is plugged in while booting. If the other monitor (TFT 2) is plugged in, everything is fine. Here is a small illustration; TFT 1 is the bad guy:

    TFT 1 | TFT 2 | failure
      x   |   x   |    x
      x   |       |    x
          |   x   |

    After the BIOS phase I can safely plug TFT 1 in and everything works just fine. The question is what can be done to avoid this behavior: change the monitor (Iiyama ProLite E2472HDD)? Change the graphics card (GeForce 9800 GT)? Other suggestions?

  • Will I have an easier time learning OpenGL in Pygame or Pyglet? (NeHe tutorials downloaded)

    - by shadowprotocol
    I'm deciding between PyGame and Pyglet. Pyglet seems to be somewhat newer and more Pythonic, but its last release, according to Wikipedia, was January '10. PyGame seems to have more documentation, more recent updates, and more published books/tutorials on the web for learning. I downloaded both the Pyglet and PyGame versions of the NeHe OpenGL tutorials (Lessons 1-10), which cover this material:

    lesson01 - Setting up the window
    lesson02 - Polygons
    lesson03 - Adding color
    lesson04 - Rotation
    lesson05 - 3D
    lesson06 - Textures
    lesson07 - Filters, lighting, input
    lesson08 - Blending (transparency)
    lesson09 - 2D sprites in 3D
    lesson10 - Moving in a 3D world

    What do you guys think? Is my hunch that I'll be better off working with PyGame warranted?
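
    For a sense of what the lesson01-style setup looks like on the PyGame side, here is a minimal sketch that opens an OpenGL-capable window and runs an empty frame loop. It assumes pygame and PyOpenGL are installed and is not taken from the NeHe ports themselves.

        import pygame
        from OpenGL.GL import glClear, glClearColor, GL_COLOR_BUFFER_BIT, GL_DEPTH_BUFFER_BIT

        def main():
            pygame.init()
            # Ask SDL (via pygame) for a double-buffered OpenGL context.
            pygame.display.set_mode((640, 480), pygame.OPENGL | pygame.DOUBLEBUF)
            pygame.display.set_caption("lesson01-style window")
            glClearColor(0.0, 0.0, 0.0, 1.0)

            running = True
            while running:
                for event in pygame.event.get():
                    if event.type == pygame.QUIT:
                        running = False
                glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
                # Drawing calls for the later lessons would go here.
                pygame.display.flip()
            pygame.quit()

        if __name__ == "__main__":
            main()

    The Pyglet equivalent is roughly the same length, so the choice really comes down to documentation and maintenance rather than boilerplate.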

  • Where can I buy freely redistributable (creative commons) game assets?

    - by Erlend
    I'd like to know about any 3D asset shops out there that specialize in game assets and, most importantly, license their assets under an open license like Creative Commons or something similarly permissive. We are looking to buy some professional-looking assets for use and redistribution with our open source 3D game engine. The problem is that all the commercial 3D assets we've come across are sold under very restrictive licenses, which won't allow us to include the models in our code repository (since free code hosting services require that all your data, including media, be open source or otherwise copyleft), nor, in turn, to redistribute the assets as part of our downloadable SDK. I realize this sounds like a weak business idea, since users could just buy an asset and start sharing it with everyone. But somehow this has worked for hundreds of WordPress theme shops, so I was hoping maybe someone is trying similar things for commercial game assets.

  • What exactly is UV and UVW Mapping?

    - by Michael Stum
    I'm trying to understand some basic 3D concepts; at the moment I'm trying to figure out how textures actually work. I know that UV and UVW mapping are techniques that map 2D textures to 3D objects - Wikipedia told me as much. I googled for explanations but only found tutorials that assumed I already knew what it is. From my understanding, each 3D model is made out of points, and several points create a face? Does each point or face have a secondary coordinate that maps to an x/y position in the 2D texture? Or how does unwrapping manipulate the model? Also, what does the W in UVW really do? What does it offer over UV? As I understand it, W maps to the Z coordinate, but in what situation would I have different textures for the same X/Y and different Z - wouldn't the Z part be invisible? Or am I completely misunderstanding this?
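
    To make the per-vertex idea concrete, here is a minimal sketch in plain Python (the data layout and the sampler are purely illustrative, not any engine's API). Each vertex stores a position in model space and a (u, v) coordinate into the image; the renderer interpolates (u, v) across each face and looks the color up in the texture. A W coordinate is simply a third texture coordinate, used for things like 3D/volume textures, rather than a duplicate of the model's Z.

        # One textured quad: position (x, y, z) plus texture coordinate (u, v),
        # where u and v both run from 0.0 to 1.0 across the image.
        quad = [
            ((-1.0, -1.0, 0.0), (0.0, 0.0)),  # bottom-left corner -> bottom-left of image
            (( 1.0, -1.0, 0.0), (1.0, 0.0)),  # bottom-right       -> bottom-right
            (( 1.0,  1.0, 0.0), (1.0, 1.0)),  # top-right          -> top-right
            ((-1.0,  1.0, 0.0), (0.0, 1.0)),  # top-left           -> top-left
        ]

        def sample(texture, u, v):
            """Nearest-neighbour lookup: map (u, v) in [0, 1] to a pixel of a 2D image
            given as a list of rows."""
            height, width = len(texture), len(texture[0])
            x = min(int(u * width), width - 1)
            y = min(int(v * height), height - 1)
            return texture[y][x]

    Unwrapping is just the process of assigning those (u, v) pairs to every vertex so the flattened faces tile the image sensibly; it doesn't move the 3D geometry at all.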

  • 2D shapes in XNA 4.0?

    - by Lautaro
    I have some experience with XNA but none with 3D programming. I have an idea I want to realize, but I have not decided whether to do it in 3D or 2D, and I'm not sure which one will be easier in XNA. I want a blob-like shape that reshapes depending on input. The morphing does not need to be very advanced: it could be a circle (2D) or globe (3D) that just has one point that moves slightly in a random direction. In ASP.NET I have done this with the 2D drawing classes, where I can make lines, circles, squares, etc. and then modify the points that make them up. But it seems to me that XNA does not have classes for making 2D shapes (can I get this confirmed?). If it did, that would be the quickest solution for me.
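
    As far as I know, XNA has no direct equivalent of the System.Drawing shape classes; 2D shapes are usually drawn either as textured sprites or from your own vertex lists via the primitive-drawing APIs. The wobbling-blob idea itself is easy to prototype in any 2D API, though. Below is a small framework-neutral sketch in Python/Pygame (deliberately not XNA; every name is illustrative) that builds a circle from points and nudges one of them a little each frame.

        import math
        import random
        import pygame

        def blob_points(center, radius, wobble):
            """Points around a circle; wobble[i] is a per-point radial offset."""
            cx, cy = center
            n = len(wobble)
            return [(cx + (radius + wobble[i]) * math.cos(2 * math.pi * i / n),
                     cy + (radius + wobble[i]) * math.sin(2 * math.pi * i / n))
                    for i in range(n)]

        def main():
            pygame.init()
            screen = pygame.display.set_mode((400, 400))
            clock = pygame.time.Clock()
            wobble = [0.0] * 32
            running = True
            while running:
                for event in pygame.event.get():
                    if event.type == pygame.QUIT:
                        running = False
                # Nudge one random point slightly; this is the whole "morph".
                i = random.randrange(len(wobble))
                wobble[i] = max(-20.0, min(20.0, wobble[i] + random.uniform(-3.0, 3.0)))
                screen.fill((0, 0, 0))
                pygame.draw.polygon(screen, (90, 200, 120), blob_points((200, 200), 100, wobble))
                pygame.display.flip()
                clock.tick(60)
            pygame.quit()

        if __name__ == "__main__":
            main()

    The same point-list-plus-offsets approach carries over to XNA: the list of points becomes a vertex buffer, and the polygon fill becomes a set of triangles drawn with something like DrawUserPrimitives.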

  • Depth is disabled - how do I turn it back on?

    - by marc wellman
    In XNA 3.1, is there any way other than GraphicsDevice.RenderState.DepthBufferEnable = false; to disable depth in 3D worlds using DirectX models? The reason for my question is that I have quite a large program which offers a 3D world with a couple of 3D DirectX models inside. Depth was never an issue, since it always worked fine, but for a few days, after making some modifications, my models have all been depth-translucent, i.e. depth buffering and/or culling seems to be disabled. Yet in my whole source code I never touch any of the options related to depth or culling, which means I never turn these settings on explicitly nor turn them off anywhere. So I am searching for some other statement, maybe related to the GraphicsDevice, that implicitly turns depth off - but I can't find it. (Sorry that I don't post any source code, but there is too much of it and I simply don't know where to look.) UPDATE: [The screenshots in the original post compare a couple of simple objects rendered with correct depth against the same objects in their current state.]

  • .NET Graphics.ScaleTransform converts print job to bitmap. Any other way to scale text?

    - by Philip Dunaway
    I'm using Graphics.ScaleTransform to stretch lines of text so they fit the width of the page, and then printing that page. However, this converts the print job to a bitmap - for a print with many pages this causes the size of the print job to rise to obscene proportions, and slows down printing immensely. If I don't scale like this, the print job remains very small, as it is just sending text print commands to the printer. My question is: is there any way other than Graphics.ScaleTransform to stretch the width of the text? Sample code to demonstrate this is below (it would be called with Print.Test(True) and Print.Test(False) to show the effect of scaling on the print job):

        Imports System.Drawing
        Imports System.Drawing.Printing
        Imports System.Drawing.Imaging

        Public Class Print
            Dim FixedFont As Font
            Dim Area As RectangleF
            Dim CharHeight As Double
            Dim CharWidth As Double
            Dim Scale As Boolean

            Const CharsAcross = 80
            Const CharsDown = 66
            Const TestString = "!""#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~"

            Private Sub PagePrinter(ByVal sender As Object, ByVal e As PrintPageEventArgs)
                Dim G As Graphics = e.Graphics
                If Scale Then
                    Dim ws = Area.Width / G.MeasureString(Space(CharsAcross).Replace(" ", "X"), FixedFont).Width
                    G.ScaleTransform(ws, 1)
                End If
                For CurrentLine = 1 To CharsDown
                    G.DrawString(Mid(TestString & TestString & TestString, CurrentLine, CharsAcross), FixedFont, Brushes.Black, 0, Convert.ToSingle(CharHeight * (CurrentLine - 1)))
                Next
                e.HasMorePages = False
            End Sub

            Public Shared Sub Test(ByVal Scale As Boolean)
                Dim OutputDocument As New PrintDocument
                With OutputDocument
                    Dim DP As New Print
                    .PrintController = New StandardPrintController
                    .DefaultPageSettings.Landscape = False
                    DP.Area = .DefaultPageSettings.PrintableArea
                    DP.CharHeight = DP.Area.Height / CharsDown
                    DP.CharWidth = DP.Area.Width / CharsAcross
                    DP.Scale = Scale
                    DP.FixedFont = New Font("Courier New", DP.CharHeight / 100, FontStyle.Regular, GraphicsUnit.Inch)
                    .DocumentName = "Test print (with" & IIf(Scale, "", "out") & " scaling)"
                    AddHandler .PrintPage, AddressOf DP.PagePrinter
                    .Print()
                End With
            End Sub
        End Class

  • 3D in Windows 7 on Thinkpad R51

    - by John W.
    I installed Windows 7 on my laptop, and now 3D doesn't work. According to dxdiag, DirectX 11 is installed, but when I open World of Warcraft, this error comes up:

    World of Warcraft
    World of Warcraft was unable to start up 3D acceleration. Please make sure DirectX 9.0c is installed and your video drivers are up-to-date.

    There are no alerts from Windows to update any drivers. Why doesn't the 3D work?

  • Ubuntu 12.04.3 Graphics Issues: Broken Pipes, Reinstalled Xorg and Bumblebee

    - by user190488
    It seems I have a problem, and I am only making it worse by following what I find online. I have a new Asus N550JV-D71 (not sure about the part after the dash, though I definitely know it includes 71). I decided to downgrade Windows 8 to 7, then dual boot Ubuntu 12.04 with it (there were issues with Windows 8, and I had a Windows 7 disk handy). It did work, and after installing Bumblebee in a tty (because the system wouldn't boot when Bumblebee was first installed), it worked marvelously for a little less than a week.

    However, I restarted it last night and got the "Could not write bytes: Broken pipe" error. (I see it's a very common error, but I've already looked at the majority of the suggested similar questions.) I followed what I could find online (making sure not to install any graphics drivers other than what Bumblebee provides), and it just seems to go further and further downhill. I'm afraid I didn't write down the exact steps that got me to this point (it was late by the time I gave up the night before), but they involved reinstalling lightdm, xorg (and xserver?), and Bumblebee. I then changed the Bumblebee.conf file so that Device=nvidia.

    I'm pretty new to Linux in general (I've used it since 10.04, but I hadn't had issues up until this computer, so it let me stay a newbie), so I'm not exactly sure which log files to look at to find the errors to look up. However, I did look at lshw and noticed that the display was marked as unassigned. Also, if I try to start lightdm from the command line, it always stops at "Stopping Mount network filesystems". I should note that there isn't an xorg.conf file, and no .Xauthority.

    I would really, really prefer not to reinstall 12.04 if possible. I managed to get GRUB to display only a short time ago, and I can't boot from the DVD drive unless I go into the BIOS settings and manually change the boot order (that was an issue from the beginning, before the Ubuntu install), and getting into those settings often means rebooting several times because the window for reaching them is extremely small. I have most of what I need backed up, however, in case it does come to that. If I really have to, I can just use the latest Ubuntu version instead of the LTS, but the reason I chose 12.04 in the first place is that I need something stable-ish, and Windows isn't suitable for what I need to do.

    I should note that the reason I restarted last night in the first place was that it wasn't charging the battery, and the wifi kept going out.

    Hardware: Nvidia GeForce GT 750M, Intel HD Graphics 4600

  • Software emulated OpenGL with higher version than my graphics card supports

    - by leemes
    I have an Intel GMA 950 chipset in my netbook. I want to learn how to write OpenGL shader programs with this fantastic tutorial, and for that I need OpenGL 3.3. Sadly, my graphics card only supports OpenGL 1.4. I think that Mesa can emulate OpenGL in software, so I'm wondering if it can emulate OpenGL 3.3 without any hardware acceleration (performance is not a problem at all, since this is only for learning and testing purposes). Is there any way to do this?
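
    For what it's worth, Mesa's software rasterizer can be forced, and asked to advertise a higher GL/GLSL version, through environment variables. The snippet below is a minimal Python sketch of launching a program that way; the program name is hypothetical, the variables are Mesa's LIBGL_ALWAYS_SOFTWARE and version-override knobs, and whether the advertised 3.3 features actually work depends on the Mesa build in use.

        import os
        import subprocess

        # Force Mesa's software rasterizer and ask it to expose a 3.3 context
        # and GLSL 3.30 (assumes a Mesa build that honours these overrides).
        env = dict(os.environ,
                   LIBGL_ALWAYS_SOFTWARE="1",
                   MESA_GL_VERSION_OVERRIDE="3.3",
                   MESA_GLSL_VERSION_OVERRIDE="330")

        # "./shader_tutorial" is a placeholder for whatever program you are testing.
        subprocess.run(["./shader_tutorial"], env=env, check=True)

    The same variables can of course be set in the shell before launching the program by hand.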

  • ATI Radeon HD7000 Series (Laptop) - Switch Mode Between ATI & Intel Integrated GPU. Stuck on Boot Screen On Intel GPU Selection Mode

    - by Monkey Drone
    Laptop specs: HP Pavilion G6-2020SE
    GPUs: 1) ATI HD7000 series, 2) Intel integrated
    OS installed: Ubuntu 12.04 (64-bit), with the ATI graphics card drivers installed from the AMD website.

    Note: the graphics card drivers work fine in 3D mode. It runs a little hot, as a GPU should.

    Observation: AMD Catalyst Control Centre lets me choose whether I want to run the system on the high-end (ATI) GPU or the Intel integrated GPU (for better battery life). While I am on the high-end GPU choice, Ubuntu works fine.

    Problem: when I switch to Intel mode in the AMD CCC and reboot the machine, Ubuntu goes into 'low graphics mode'. The problem is not that it goes into low graphics mode - that is completely expected, since I am no longer using the ATI GPU but the integrated Intel GPU. The problem starts with the selection of the options: during that screen I have no mouse (I even tried plugging in an external USB mouse) and no keyboard functionality. Thus I am left completely unable to choose any option and load into Ubuntu. The only thing I can do is switch to a terminal and enable the ATI GPU through the command line, and Ubuntu works fine again.

    Is it a bug that there is no mouse/keyboard available to me during the startup of Ubuntu when it's launched in low-graphics mode? Any suggestions on how to get past that? My palms are sweating as I write this because the ATI GPU is really heating up my laptop. I don't want to boot into Windows or keep it around any longer than necessary. Please advise. Sincerely, MonkeyD

    Edit 1: The answer by Celso has helped me switch to Intel, giving me sufficient battery life. Kudos to Celso. Now I can at least use my laptop for the time being without it burning the hair off my skin. I am still looking for an answer to my original question: why is lightdm not working properly when I switch to the Intel GPU using the ATI HD7000 series official drivers provided by AMD?

  • When to unload a graphics object from main memory?

    - by piotrek
    I'm writing my resource manager, and I'm thinking about how it should work for graphics objects (like textures and meshes). I want to load a texture like this (in pseudocode): Texture t = resMgr.GetTex("image.png"); and GetTex would do something like this:

    1. load the texture from disk into main memory
    2. create the texture object (upload it to GPU memory)
    3. unload the texture from main memory

    My question is about step 3: do the game engines you know unload meshes/textures from main memory after loading them into GPU memory?
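
    For reference, here is a minimal Python sketch of that GetTex flow; the loader and GPU interface are hypothetical stand-ins for whatever image library and graphics API are actually used. The point is simply that, once the pixels live in a GPU texture object, the manager keeps only the handle and lets the CPU-side copy be freed.

        class ResourceManager:
            """Caches GPU texture handles by path; the CPU-side pixel data is
            dropped as soon as the upload is done (illustrative sketch only)."""

            def __init__(self, load_image, gpu):
                self._load_image = load_image   # hypothetical: path -> pixel data in main memory
                self._gpu = gpu                 # hypothetical graphics-API wrapper
                self._textures = {}             # path -> GPU texture handle

            def get_tex(self, path):
                if path in self._textures:      # cache hit: main memory is never touched
                    return self._textures[path]
                pixels = self._load_image(path)            # 1. disk -> main memory
                handle = self._gpu.create_texture(pixels)  # 2. main memory -> GPU memory
                del pixels                                 # 3. drop the CPU-side copy
                self._textures[path] = handle
                return handle

    Engines that need to restore textures after a lost device, or stream them in and out, sometimes keep the CPU copy (or at least the file path) around instead of discarding it, so step 3 is a trade-off rather than a rule.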

  • How can I get Windows 7 to work with two Nvidia graphics cards with different drivers?

    - by Max
    This is similar to this question, but I am using more similar cards with Windows 7. I just purchased a Zotac Nvidia GeForce 7200 GS. I have a motherboard with two PCI Express x16 slots. There is already an MSI Nvidia GeForce 8800 GTS being used as the primary card, driving two LCD monitors. I would like the Zotac to output to a TV via DVI-out. Unfortunately, when Windows detects the Zotac and installs its drivers, or I manually install them, Windows stops being able to boot up. If I remove them and reinstall the MSI 8800 drivers, I can boot again, but Windows can no longer see the Zotac 7200 - it shows up as a yellow triangle in Device Manager. I've read conflicting reports about this. Some people claim that Windows 7 will support multiple heterogeneous graphics card drivers, as long as they are all using the same driver API ("WDDM"?). Others say that they have to be using the exact same driver, or it won't work. Others claim that you have to use the exact same card. Which is it, exactly? I know I can run the MSI 8800 in SLI if I purchase another, but I don't need that kind of power - I just need HD-out to my television. I read somewhere that running two cards in SLI precludes you from using 100% of their output ports, so I'm not sure if that's an option. I suppose I could also run two MSI 8800s without SLI, but again, that's more power than I need (and more money than I'd like to spend). Also, I don't think this exact model is even manufactured anymore. Any ideas?

  • Did Adobe Photoshop just kill my graphics card for good?

    - by user6004
    I was working in Adobe Photoshop - just some regular work - when I went to edit a PSD file and change the text of a layer, and all of a sudden the PC froze. No mouse, the screen frozen, keystrokes not doing anything, no Task Manager, nada. So I rebooted my PC, and then something quite terrifying appeared before my eyes. It was not the Checkdisk utility being launched that terrified me (by the way, that reboot damaged the partition table of an external HDD that was connected to my PC at the time, but that's another story); it was the screen itself, covered in dots. After Checkdisk finished and Windows loaded, I noticed that the resolution was not right: instead of the 1440x900 I had set, it was 1280x1024. When I went to change it back, I had no option to return to my old resolution and had only three other generic resolutions, as if my video card (a GeForce 8800 GTS, by the way) was not recognized. And what do you know, in Device Manager it appeared with an exclamation mark. Inside the hardware properties, it said this: Windows has stopped this device because it has reported problems. (Code 43) Uninstalling the drivers, then downloading and installing the newest drivers from NVIDIA, did not work; it always comes back to this. So, do you have any advice before I go out and buy a new graphics card? I thought this was the only option left, but maybe the experts at Super User can help me out. By the way, the dotted screen appears after every reboot, and I see the dots when the ASUS motherboard screen shows up at boot. Thanks in advance.
