Search Results

Search found 13534 results on 542 pages for 'gpu programming'.

Page 18/542

  • Bad texture on model with different GPU

    - by Pacha
    I have some kind of distortion on the texture of my 3D model. It works perfectly well on an AMD GPU, but when testing on an integrated Intel HD graphics card it has a weird issue. I don't have a problem with the rest of my entities, as they are not scaled. The models with the problems are scaled, as my engine supports different sizes for the platforms. I am using Ogre3D as the rendering engine and GLSL as the shader language. Vertex shader: #version 120 varying vec2 UV; void main() { UV = gl_MultiTexCoord0; gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; } Fragment shader: #version 120 varying vec2 UV; uniform sampler2D diffuseMap; void main(void) { gl_FragColor = texture(diffuseMap, UV); } A screenshot shows the error on the right and left sides, while the top and bottom render perfectly well.

    Read the article

  • How to render Minecraft on the GPU?

    - by l0b0
    Hardware: Intel i7, AMD Radeon HD 6970, SSD with plenty of space, 6 GB RAM. Software: OpenJDK 6, 7, and Oracle Java 7 (reproducible with all three); AMD Catalyst 12.8 and the open source driver (reproducible with both); Ubuntu 12.04 x86_64 and older; Minecraft 1.3.2 vanilla and older. On this setup I am getting rubbish frame rates after a short while of playing, dropping from about 45-55 to 15 in a couple of minutes. CPU use is 40-45% even when rendering the opening screen at 1920x1280, and gameRenderer is using about 90% CPU when playing. Rather than trying to eke a few more FPS out of an obviously broken rendering pipeline, I really hope to find a solution that makes the GPU render Minecraft.

    Read the article

  • What do you mean by expressiveness in a programming language?

    - by prosseek
    I see the word 'expressiveness' a lot when people want to stress that one language is better than another, but I don't see exactly what they mean by it. Is it verboseness/succinctness? I mean, if one language can write something down more briefly than another, does that mean expressiveness? Please refer to my other question - http://stackoverflow.com/questions/2411772/article-about-code-density-as-a-measure-of-programming-language-power Is it the power of the language? Paul Graham says that one language is more powerful than another in the sense that it can do things the other language can't (for example, LISP can do things with macros that other languages can't). Is it just something that makes life easier? Regular expressions could be one example. Is it a different way of solving the same problem, something like SQL for the search problem? What do you think about the expressiveness of a programming language? Can you show expressiveness using some code? What is the relationship between expressiveness and DSLs? Do people come up with DSLs to get expressiveness?

    Read the article
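
    Since the question above asks for code that shows expressiveness, here is a minimal, hypothetical Python sketch (not taken from the linked article): the same task written as an explicit loop and as a single declarative expression.

        # Task: collect the squares of the even numbers in a list.

        # Imperative version: every step is spelled out.
        def squares_of_evens_imperative(numbers):
            result = []
            for n in numbers:
                if n % 2 == 0:
                    result.append(n * n)
            return result

        # More "expressive" version: one comprehension states the intent directly.
        def squares_of_evens_expressive(numbers):
            return [n * n for n in numbers if n % 2 == 0]

        print(squares_of_evens_imperative([1, 2, 3, 4]))  # [4, 16]
        print(squares_of_evens_expressive([1, 2, 3, 4]))  # [4, 16]

    Both functions compute the same thing; what people usually call expressiveness is how closely the second version mirrors the intent rather than the mechanics.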

  • Why do GPUs overheat?

    - by JAD
    About a year ago, I added a 9800GT (1 GB version) and a Corsair CX500 PSU to an HP M8000N computer. A few weeks ago, the HDD overheated and I decided to transfer the GPU & PSU to a new build, which consists of: i3 @ 3.3 GHz, Gigabyte H61 micro-ATX motherboard, 4GB RAM, 500GB WD HDD, DVD-RW drive, Cooler Master Elite 430 tower. Once I had Win7 up and running, I installed all the essential drivers that came with the Gigabyte motherboard CD. However, whenever I tried installing the Graphics Media Accelerator driver, the computer would crash and enter an endless boot sequence on the next startup. I skipped installing this driver and installed the CD driver for the 9800GT, which by now is a year old. Everything was working fine; WEI rated my GPU at 6.6 for graphics & Aero performance. However, after updating my Nvidia drivers to the latest, the WEI dropped my rating to 3.3 for Aero and 4.7 for graphics performance. Just to make sure that everything was ok, I ran Bad Company 2 on medium settings. The first few minutes ran just fine at a smooth framerate, so I dismissed this as Windows being Windows. About 6 hours later, I ran BC2 again. This time I averaged anywhere from 2-5 FPS. I checked the GPU temperature through GPU-Z, and it came back as 120°C. The problem with this is that the computer had been on for six hours up to that point. Wouldn't the card have experienced a reactor core meltdown a lot sooner than that? Granted, the computer was "sleeping" some of the time, but still... The next day I took out a temperature gun and ran some tests. I would point the laser at a very specific area on the reverse side of the card (not the fan or "front"), and compare the temp reading with GPU-Z. After leaving the system on idle for a few minutes, I ran BC2 twice. Here are the results (GPU-Z reading / temp gun reading / time): null / 22.3°C / comp is off; 53°C / 33.5°C / 1:49; 78°C / 46°C / 1:53 (first BC2 run; good framerate); 102°C / 64.6°C / 2:01 (system is again on idle); 113°C / 64.8°C / 2:10; 119°C / 71.8°C / 2:17 (second BC2 run; poor framerate). I should also mention that I took a temp recording of another part of the GPU from 2:01-2:17. The temp in this area jumped from 75°C to 82.9°C in that time frame. This pretty much confirms that GPU-Z is reporting the temperature accurately, and the card is overheating. But I'd like to know why; the card is doing nothing and still the temperature climbs at a steady rate. I thoroughly cleaned the GPU and PSU with a can of compressed air when I salvaged them from the old HP M8000N, so dust can't be the issue. Similarly, the rest of the computer is brand new. I installed various Nvidia drivers, but no luck. It seems strange to me that a year-old card is suddenly failing on me; aren't they supposed to last at least two years? Could this be a driver issue? Is the motherboard faulty? Could the PSU be overfeeding the card on voltage? Neither seems likely, as the CPU, RAM and the rest of the computer have worked flawlessly and stayed well within respectable temp ranges (the i3 lingers around 50°C, the HDD stays at 30°C, and so does the PSU). How can I pinpoint the issue?

    Read the article

  • How to determine if a programming language is verbose or terse?

    - by sunpech
    Programming languages can often be described as verbose or terse. From my understanding, a verbose language is easy to read and understand, while a terse language is concise and neat, but more difficult to read. Should other things be considered in the definitions? It seems many of the popular programming languages of today are verbose, and these two terms are only used to describe a language as being more or less verbose relative to another language. How do we determine if a programming language is more verbose/terse than another? Example: Is C# more verbose than Java?

    Read the article
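
    There is no agreed-upon measure, but one crude way to compare two snippets that solve the same task is to count lexical tokens rather than characters or lines, so identifier length and formatting do not dominate. Here is a rough Python sketch of such a metric; the two snippet strings are made-up examples, not benchmarks.

        import re

        # Very rough lexer: identifiers/numbers, or any single non-space symbol.
        TOKEN = re.compile(r"[A-Za-z_]\w*|\d+|\S")

        def token_count(snippet: str) -> int:
            """Count lexical tokens as a crude verbosity measure."""
            return len(TOKEN.findall(snippet))

        java_like = "for (int i = 0; i < xs.length; i++) { total += xs[i]; }"
        terse_like = "total = sum(xs)"

        print(token_count(java_like))   # noticeably larger
        print(token_count(terse_like))  # noticeably smaller

    A count like this only means something when both snippets solve the same fixed task; comparing whole languages in the abstract still comes down to judgment.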

  • What are the options for setting up a UNIX environment to learn C using Kernighan and Ritchie's The C Programming Language?

    - by ssbrewster
    I'm a novice programmer and have been experimenting with JavaScript, jQuery and PHP, but felt I wasn't getting a real depth of understanding of what I was doing. So, after reading Joel Spolsky's response to a question on this site (which I can't find now!), I took it back to basics, read Charles Petzold's 'Code', and am about to move on to Kernighan and Ritchie's The C Programming Language. I want to learn this in a UNIX environment but only have access to a Windows system. I have Ubuntu 12.04 running on a virtualised machine via VMware Player, and have done some coding in the terminal. Is using a Linux distro the only option for programming in a UNIX environment on Windows? And what are the next steps to start programming in C on UNIX, and where do I get a compiler from?

    Read the article

  • What would you do if your client required you not to use object-oriented programming?

    - by gunbuster363
    Would you try to persuade your client that using object-oriented programming is much cleaner, or would you try to follow what he required and give him crappy code? I am currently writing a program to simulate the activity of ants on a grid. The ants can move around, pick things up and drop things. The problem is that, while the actions of the ants and the position of each ant can easily be tracked by class attributes (and we can easily create many instances of such ants), my client said that since he has a background in functional programming he would like the simulation to be written using functional programming. What would you do?

    Read the article
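
    For what it's worth, the simulation described above does not require mutable objects: each ant can be an immutable record, and each action a pure function that returns a new ant (and a new grid) instead of modifying anything. A minimal Python sketch with hypothetical names, only to show the shape a functional version might take.

        from typing import NamedTuple, Optional, Tuple

        class Ant(NamedTuple):
            x: int
            y: int
            carrying: Optional[str] = None  # what the ant holds, if anything

        def move(ant: Ant, dx: int, dy: int) -> Ant:
            """Pure function: returns a new ant at the new position."""
            return ant._replace(x=ant.x + dx, y=ant.y + dy)

        def pick_up(ant: Ant, grid: dict) -> Tuple[Ant, dict]:
            """Returns a new (ant, grid) pair; the originals are untouched."""
            item = grid.get((ant.x, ant.y))
            if item is None or ant.carrying is not None:
                return ant, grid
            new_grid = {pos: it for pos, it in grid.items() if pos != (ant.x, ant.y)}
            return ant._replace(carrying=item), new_grid

        ant = Ant(0, 0)
        grid = {(1, 0): "food"}
        ant = move(ant, 1, 0)
        ant, grid = pick_up(ant, grid)
        print(ant, grid)  # Ant(x=1, y=0, carrying='food') {}

    Whether this is what the client actually wants is worth asking him, but it shows that tracking many ants does not depend on classes with mutable state.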

  • Generic programming - where did it originate?

    - by user997112
    I'm trying to work out whether generic programming was a functional programming feature that was then introduced into Java, C++ and C#, or whether the latter copied it from functional programming languages like Haskell, Lisp, OCaml etc. Google is giving me lots on what generic programming is, but not where it originated. All I can see is that Ada implemented it early on. Would you class it as a functional programming technique?

    Read the article
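
    As a point of reference for what the question is about, "generic programming" usually means writing code parameterized over types, so one definition works for many element types. A tiny illustrative sketch using Python's typing module; it is only meant to fix terminology, not to answer the history question.

        from typing import List, TypeVar

        T = TypeVar("T")  # a type parameter, like <T> in Java/C# or template<typename T> in C++

        def first_two(items: List[T]) -> List[T]:
            """Works for a list of any element type; the logic is written once."""
            return items[:2]

        print(first_two([1, 2, 3]))        # [1, 2]
        print(first_two(["a", "b", "c"]))  # ['a', 'b']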

  • Is programming for me? It seems too rigid and unforgiving.

    - by AM
    This question is a follow-up to: Should I continue to pursue programming based on my experience? I am currently majoring in CS in college and was thinking along similar lines as the above question. I'm fine at math and logic, but I haven't yet found programming to be enjoyable. Although I like the idea of being able to build software, too much of it seems to consist of figuring out tiny details or dealing with annoying bugs. So far I've only done small school projects and the like. Does programming become more enjoyable once you have more experience? How can someone know if a career in it is for them?

    Read the article

  • How to deal with the need to know multiple programming languages? When to stop learning new languages?

    - by Raphael
    I am a relatively young programmer. I am 23 and I have been programming professionally for about 5 years. As most programmers do, I started with C, learned some x86 assembly for fun and then found C++, which turned out to be my greatest passion in the programming world. Programming with C and C++ forces you to learn platform-specific APIs, libs and frameworks, all of which require constant study and experimentation. After some time I had to move on to Java and C# as the demand in my region is basically for these languages. With these languages I entered the world of web development and then I had to learn JavaScript. Developing for the .NET Framework was exciting at first, but I constantly felt as if I was getting tied up by Microsoft (and of course the .NET Framework was driving me away from Linux). For desktop development I could do pretty much everything I did with .NET using C++ with Qt, but for web development I had to look for an alternative. Quickly I found Django and then I proceeded to learn Python so I could use Django. Nowadays I am learning iOS development with Objective-C. So far it has been pretty easy to learn all these languages (C++ trained me well), but I am worried that someday I won't be able to keep track of them all. Just to clarify: the only languages I learned because I had to were C# and Java. All of the others I learned for fun, because I love programming and learning new things. Also I like to keep my skills sharp in desktop, web and mobile development. My question is: How do you keep track of multiple programming languages? (I mean, keep track of changes to these languages and keep your skills sharp.) And: Is there such a thing as enough programming languages?

    Read the article

  • Why hasn't functional programming taken over yet?

    - by pankrax
    I've read some texts about declarative/functional programming (languages), tried out Haskell, and written one myself. From what I've seen, functional programming has several advantages over the classical imperative style: stateless programs and no side effects; concurrency - it plays extremely nicely with the rising multi-core technology; programs are usually shorter and in some cases easier to read; productivity goes up (example: Erlang); imperative programming is a very old paradigm (as far as I know) and possibly not suitable for the 21st century. Why are companies using functional languages, or programs written in them, still so "rare"? Why, when looking at the advantages of functional programming, are we still using imperative programming languages? Maybe it was too early for it in 1990, but today?

    Read the article
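
    To make the "no side effects, plays nicely with multi-core" point concrete, here is a minimal illustrative Python sketch: a pure function can be mapped over a pool of worker processes without any locking, because no call touches shared state.

        from multiprocessing import Pool

        def score(n: int) -> int:
            """Pure: the result depends only on the argument; no shared state is read or written."""
            return n * n + 1

        if __name__ == "__main__":
            with Pool(4) as pool:
                # Safe to parallelize blindly: calls cannot interfere with each other.
                print(pool.map(score, range(10)))

    The flip side, and part of any answer to the question, is that real programs also need I/O, state and existing imperative ecosystems, which is where purely functional designs meet friction.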

  • ASUS K55VM Laptop unexpectedly shuts down

    - by Abhishek Sha
    I've read quite a few questions on SuperUser about people having laptop shutdown problems, but mine is different. My laptop specs: Intel Core i7 3610QM (Ivy Bridge), NVIDIA GT630M 2GB and Intel GMA4000, 8GB RAM, Windows 7 64-bit. My laptop occasionally shuts down when playing Far Cry 3. It's around 5 months old. I've played games like Crysis and it never shut down unexpectedly. Since I experienced this shutdown recently, I decided to have GPU-Z log the temperatures. The final log values at the time of shutdown were: GPU Core Clock [MHz]: 797.3; GPU Memory Clock [MHz]: 896.8; GPU Temperature [°C]: 89.0; GPU Load [%]: 99; Memory Controller Load [%]: 36; Video Engine Load [%]: 0; Memory Usage (Dedicated) [MB]: 535; Memory Usage (Dynamic) [MB]: 53; VDDC [V]: 1.0620. My drivers are up to date and I didn't encounter any BSODs at the time of shutdown. It simply turns off.

    Read the article

  • Assembling a number-crunching computer [closed]

    - by tugrul büyükisik
    What is needed to keep a GPU fully fed by the CPU? Is comparing their FLOPS enough? For example, if I paired a very old (Pentium III) CPU with an Nvidia Fermi GPU, it would not be able to feed the GPU with enough data per second. What are the criteria for matching a CPU to a GPU when OpenCL or similar work is needed? Of course the RAM and bus would be chosen in a similar way, but how exactly? Assume each GPU core will calculate a sqrt, a division and an addition 100 times per iteration. Thanks.

    Read the article
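
    One rough way to think about "can the CPU feed the GPU" is to compare how fast the GPU consumes input data with how fast the host can deliver it (PCIe bandwidth and CPU-side preparation), rather than comparing raw FLOPS. A back-of-envelope Python sketch follows; every number below is a made-up assumption to show the method, not a measurement.

        # Assumed figures (replace with real ones for your hardware):
        gpu_flops = 1.0e12          # sustained GPU throughput, FLOP/s (assumption)
        flops_per_item = 300.0      # ~100 iterations of sqrt + div + add per work-item (assumption)
        bytes_per_item = 16.0       # input bytes each work-item needs from the host (assumption)
        pcie_bandwidth = 8.0e9      # host-to-device bandwidth, bytes/s (assumption)
        cpu_prep_rate = 2.0e9       # bytes/s the CPU can generate or prepare (assumption)

        items_per_sec_gpu = gpu_flops / flops_per_item
        bytes_needed_per_sec = items_per_sec_gpu * bytes_per_item
        feed_rate = min(pcie_bandwidth, cpu_prep_rate)

        print(f"GPU wants {bytes_needed_per_sec / 1e9:.1f} GB/s of input")
        print(f"Host can supply {feed_rate / 1e9:.1f} GB/s")
        print("GPU is starved" if bytes_needed_per_sec > feed_rate else "GPU can be kept busy")

    In this made-up example the kernel does so little arithmetic per transferred byte that no host could keep the GPU busy; raising the arithmetic intensity per byte matters more than matching FLOPS figures.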

  • Is there a programming language that performs currying when named parameters are omitted?

    - by Adam Gent
    Many functional programming languages have support for curried parameters. To support currying, a function's parameters are essentially a tuple where the last parameter can be omitted, making a new function that requires a smaller tuple. I'm thinking of designing a language that always uses records (aka named parameters) for function parameters. Thus simple math functions in my make-believe language would be: add { left : num, right : num } = ... minus { left : num, right : num } = .. You can pass in any record to those functions so long as they have those two named parameters (they can have more than just "left" and "right"). If they have only one of the named parameters, it creates a new function: minus5 :: { left : num } -> num minus5 = minus { right : 5 } I borrow some of Haskell's notation above. Has anyone seen a language that does this?

    Read the article
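
    I don't know of a language where omitting a named parameter curries implicitly, but the behaviour described above can be approximated explicitly in several languages; for example, Python's functools.partial does the "fix one named parameter, get a smaller function back" step, just not automatically.

        from functools import partial

        def minus(left: float, right: float) -> float:
            return left - right

        # Explicitly fix the named parameter "right"; the result still needs "left".
        minus5 = partial(minus, right=5)

        print(minus5(left=12))  # 7
        print(minus5(left=3))   # -2

    The difference from the make-believe language above is that the partial application has to be requested explicitly rather than happening whenever a parameter is missing.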

  • Ubuntu Dual Screen Using Virtual Machine - AMD GPU

    - by Chris
    I've been searching online and reading tutorials etc. about how to make my Ubuntu VM dual screen (x86_64). I first tried to run this command: sudo aticonfig --initial -f which gave me the output: sudo: aitconfig: command not found I then googled the output and followed instructions that tell me to install the ATI drivers on my Ubuntu system: wget http://www2.ati.com/drivers/linux/ati-driver-installer-11-5-x86.x86_64.run sudo sh ati-driver-installer-11-5-x86.x86_64.run --buildpkg Ubuntu/natty sudo dpkg -i *.deb sudo apt-get -f install sudo aticonfig -f --initial --adapter=all sudo reboot It all works well until I input sudo apt-get -f install which gives me the following output: sudo apt-get -f install Reading package lists... Done Building dependency tree Reading state information... Done 0 upgraded, 0 newly installed, 0 to remove and 25 not upgraded. 3 not fully installed or removed. After this operation, 0 B of additional disk space will be used. Setting up fglrx (2:8.850-0ubuntu1) ... update-alternatives: error: alternative link /usr/bin/aticonfig is already managed by x86_64-linux-gnu_gl_conf. dpkg: error processing fglrx (--configure): subprocess installed post-installation script returned error exit status 2 dpkg: dependency problems prevent configuration of fglrx-amdcccle: fglrx-amdcccle depends on fglrx; however: Package fglrx is not configured yet. dpkg: error processing fglrx-amdcccle (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of fglrx-dev: fglrx-dev depends on fglrx; however: Package fglrx is not configured yet. dpkg: error processing fglrx-dev (--configure): dependency problems - leaving unconfigured No apport report written because the error message indicates its a followup error from a previous failure. No apport report written because the error message indicates its a followup error from a previous failure. Errors were encountered while processing: fglrx fglrx-amdcccle fglrx-dev E: Sub-process /usr/bin/dpkg returned an error code (1) At this point, I don't know what to do since running: gksudo amdcccle For the record, I have 3D acceleration turned on. The following is the GPU for my VM: lspci | grep VGA 00:02.0 VGA compatible controller: InnoTek Systemberatung GmbH VirtualBox Graphics Adapter Any help on how I can make my VM dual screen with Ubuntu would be great. Thank you in advance.

    Read the article

  • GPU hung when switching graphics card

    - by Lie Ryan
    I have a laptop (Dell Inspiron N4110) with switchable graphics. $ lspci | grep VGA 00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09) 01:00.0 VGA compatible controller: ATI Technologies Inc NI Whistler [AMD Radeon HD 6600M Series] (rev ff) Normally, my laptop starts with both graphics cards enabled, which caused the laptop to run very hot and the fan to become very noisy. I have been using a small script to disable the Radeon card. For some time, I was quite happy with this arrangement. However, I have been having some issues with the Intel card (IGD); it often randomly hangs when running OpenGL apps, so I want to give the Radeon card (DIS) another chance. I have never been able to switch to the Radeon card, but recently I found out that if I do a "delayed switching" (DDIS): # echo "DDIS" > /sys/kernel/debug/vgaswitcheroo/switch root@lieryan-dell-ubuntu:/sys/kernel/debug/vgaswitcheroo# cat switch 0:IGD:+:Pwr:0000:00:02.0 1:DIS: :Pwr:0000:01:00.0 and then log off (i.e. to restart X), the screen switches to a pseudo-tty and then gets stuck there, frozen. In this situation the mouse and keyboard stop working, so I can't switch to another tty. I tried ssh-ing in from another computer to salvage logs (dmesg at that point) and whatnot; I found out that when it freezes, the active graphics card is the AMD card: -- this is from ssh -- # cat switch 0:IGD: :Off:0000:00:02.0 1:DIS:+:Pwr:0000:01:00.0 but the GPU is apparently hung; looking at dmesg gives: ... [ 1411.649974] vga_switcheroo: client 0 refused switch [ 1411.649985] vga_switcheroo: setting delayed switch to client 1 [ 1423.911759] vga_switcheroo: processing delayed switch to 1 [ 1424.006564] fbcon: Remapping primary device, fb1, to tty 1-63 [ 1424.006799] i915: switched off [ 1424.840351] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id [ 1425.718088] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id [ 1426.622377] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id [ 1427.355683] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id [ 1428.193549] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id ... the invalid framebuffer id error is repeated many times over ... I was able to recover successfully by switching back to the Intel card and restarting X from ssh, indicating that only the Radeon card has problems switching. System info: $ uname -a Linux lieryan-dell-ubuntu 3.0.0-14-generic #23-Ubuntu SMP Mon Nov 21 20:28:43 UTC 2011 x86_64 x86_64 x86_64 GNU/Linux $ lsb_release -a No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 11.10 Release: 11.10 Codename: oneiric The laptop also does not have an option to select the graphics card in the BIOS, and the proprietary driver, fglrx, has never worked; when I installed it through jockey ("Additional Drivers"), glxinfo showed that rendering was still being done by Mesa, the /sys/kernel/debug/vgaswitcheroo directory went missing, and the driver crashed with a traceback if I used xorg.conf to tell X to use fglrx. Does anyone have any idea whether it is possible to use this AMD card with either the radeon or the fglrx driver? logs: dmesg

    Read the article

  • What languages are the kids of today actually programming in? Does anyone have real data?

    - by Gaz Davidson
    Back in the 80s, colleges were teaching Pascal because it was easy to learn, while I and many others like me were learning BASIC because it was not only easy to learn but accessible and also fashionable (for an extremely liberal definition of fashion). It has just occurred to me that empirical data on the actual programming languages kids are choosing to use would be a good indicator of which language would be the ideal first choice for educators. Please note that this question is not "what do you think is a good programming language for kids?"

    Read the article

  • Why aren't web frameworks simple, elegant and fun like programming languages? [on hold]

    - by Ryan
    When I think of pretty much any programming language - like C, C++, PHP, SQL, JavaScript, Python, ActionScript, Haskell, Lua, Lisp, Java, etc. - I'm like: awesome, I would love to develop a computer application using any of those languages. But when I think of web frameworks (I do mostly PHP) - like Cake, CI, Symfony, Laravel, Zend, Drupal, Joomla, Wordpress, Rails, Django, etc. - I'm like: god no. Why aren't there web frameworks that provide me with simple, fun and powerful constructs the way a programming language does?

    Read the article

  • Are closures with side-effects considered "functional style"?

    - by Giorgio
    Many modern programming languages support some concept of closure, i.e. of a piece of code (a block or a function) that: Can be treated as a value, and therefore stored in a variable, passed around to different parts of the code, be defined in one part of a program and invoked in a totally different part of the same program. Can capture variables from the context in which it is defined, and access them when it is later invoked (possibly in a totally different context). Here is an example of a closure written in Scala: def filterList(xs: List[Int], lowerBound: Int): List[Int] = xs.filter(x => x >= lowerBound) The function literal x => x >= lowerBound contains the free variable lowerBound, which is closed (bound) by the argument of the function filterList that has the same name. The closure is passed to the library method filter, which can invoke it repeatedly as a normal function. I have been reading a lot of questions and answers on this site and, as far as I understand, the term closure is often automatically associated with functional programming and functional programming style. The definition of functional programming on Wikipedia reads: In computer science, functional programming is a programming paradigm that treats computation as the evaluation of mathematical functions and avoids state and mutable data. It emphasizes the application of functions, in contrast to the imperative programming style, which emphasizes changes in state. and further on [...] in functional code, the output value of a function depends only on the arguments that are input to the function [...]. Eliminating side effects can make it much easier to understand and predict the behavior of a program, which is one of the key motivations for the development of functional programming. On the other hand, many closure constructs provided by programming languages allow a closure to capture non-local variables and change them when the closure is invoked, thus producing a side effect on the environment in which they were defined. In this case, closures implement the first idea of functional programming (functions are first-class entities that can be moved around like other values) but neglect the second idea (avoiding side-effects). Is this use of closures with side effects considered functional style, or are closures considered a more general construct that can be used both for a functional and a non-functional programming style? Is there any literature on this topic? IMPORTANT NOTE: I am not questioning the usefulness of side-effects or of having closures with side effects. Also, I am not interested in a discussion about the advantages / disadvantages of closures with or without side effects. I am only interested to know whether using such closures is still considered functional style by the proponents of functional programming or whether, on the contrary, their use is discouraged when using a functional style.

    Read the article
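
    For contrast with the Scala example above, here is a sketch in Python of the case the question is really about: a closure that captures a non-local variable and mutates it on every call, so its output no longer depends only on its arguments.

        def make_counter():
            count = 0  # captured by the closure below

            def next_id() -> int:
                nonlocal count      # the side effect: mutate the enclosing variable
                count += 1
                return count

            return next_id

        tick = make_counter()
        print(tick(), tick(), tick())  # 1 2 3 -- same call, different results

    It is a closure in the mechanical sense (a first-class, environment-capturing function), yet its use is not referentially transparent, which is exactly the tension the question describes.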

  • Why do most programming languages only support returning a single value from a function?

    - by M4N
    Is there a reason or an explanation why functions in most(?) programming languages are designed to support any number of input parameters but only one return value? In most languages, it is possible to "work around" that limitation, e.g. by using out-parameters, returning pointers or by defining/returning structs/classes. But it seems strange that programming languages were not designed to support multiple return values in a more "natural" way.

    Read the article
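
    As an illustration of the workarounds the question mentions, here is a small Python sketch showing a tuple return and a named-record return, which is about as close to "multiple return values" as most languages get.

        from typing import NamedTuple, Tuple

        def divmod_pair(a: int, b: int) -> Tuple[int, int]:
            return a // b, a % b            # a tuple: effectively two return values

        class Stats(NamedTuple):
            mean: float
            spread: float

        def stats(xs) -> Stats:
            mean = sum(xs) / len(xs)
            return Stats(mean, max(xs) - min(xs))   # named fields instead of positional

        q, r = divmod_pair(17, 5)           # unpacking reads like multiple returns
        print(q, r)                         # 3 2
        print(stats([1, 2, 3, 10]).spread)  # 9

    Languages like Go and Python make this idiom feel native; elsewhere it becomes the struct or out-parameter workaround described above.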

  • Will there be any more books in the Game Programming Gems series?

    - by Laurent Couvidou
    It's been more than three years now since the last Game Programming Gems book was published. The official website isn't updated anymore, and this page of Mark DeLoura's website seems to imply that the series is over. Was there ever an official statement about this? Was number 8 the last book? The Game Programming Gems books were one of the most (if not the most) important resources for me and probably thousands of developers around the globe; did the Internet kill them?

    Read the article
