Search Results

Search found 21470 results on 859 pages for 'computer graphics'.

  • Can I get the font information from a System.Drawing.Graphics in C#?

    - by Bahgat Mashaly
    Hello. I get a Graphics object from Graphics g = System.Drawing.Graphics.FromHwnd(button1.Handle); Can I get the font information from this Graphics? I tried to get the font using the GetTextFace API function, but it returns "System", meaning the default OS font. I also tried SendMessage(button1.Handle, WM_GETFONT, 0, 0); but it returns 0, which also means the default OS font. I have since found the cause of the problem: it is due to the FlatStyle property. See this link: http://blogs.msdn.com/b/michkap/archive/2008/09/26/8965526.aspx Thanks
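
    A Graphics object describes a drawing surface, not a font: text calls such as DrawString take the font as a parameter, so the font has to be read from the control itself. Below is a minimal sketch of that idea; the helper name is my own and it assumes a WinForms control. As the linked post explains, WM_GETFONT returns 0 for controls with FlatStyle.System because no WM_SETFONT was ever sent, so the sketch falls back to Control.Font.

        using System;
        using System.Drawing;
        using System.Runtime.InteropServices;
        using System.Windows.Forms;

        static class FontProbe
        {
            [DllImport("user32.dll")]
            static extern IntPtr SendMessage(IntPtr hWnd, int msg, IntPtr wParam, IntPtr lParam);

            const int WM_GETFONT = 0x0031;

            // Returns the font the control actually draws its text with.
            public static Font GetEffectiveFont(Control control)
            {
                IntPtr hFont = SendMessage(control.Handle, WM_GETFONT, IntPtr.Zero, IntPtr.Zero);

                // 0 means the window draws with the default system font
                // (typical for a Button with FlatStyle.System), so fall back
                // to the managed Font property.
                return hFont == IntPtr.Zero ? control.Font : Font.FromHfont(hFont);
            }
        }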

  • Someone used or hacked my computer to commit a crime. What defense do I have?

    - by srguws
    Hello, I need IMMEDIATE help with a computer crime that I was arrested for. It may involve my computer, my IP address, and my ex-girlfriend being the true criminal. The police do not tell you much; they are very vague. But I was charged. So my questions are: if someone used my computer at my house or business to post a rude Craigslist ad about a friend of my then-girlfriend from a fake email address, how can I be the ONLY suspect, and how can I be charged? I have noticed over the last few days that there are many ways to use other people's computers and connections. Here are a few things I found: you can steal or illegally use an IP address or MAC address; a dynamic IP is less secure and more vulnerable than a static one; people can sidejack and spoof your MAC, IP, etc.; and there is also something called ARP spoofing. I am sure there are more, but how can I prove whether this happened to me or not? The police contacted Craigslist, the victim, AOL, and the two ISPs. They say they traced the IPs to my business and my home. My ex, who I lived with and had a business with, had access to the computers and the keys to both buildings. My brother also lives and works with me. My business has many teenagers who use the computer and Wi-Fi. My brother is a college kid and also has friends over at the house who use the computer freely. So how can they say it was me, rather than an angry ex-girlfriend?

  • Strange Ubuntu Random Display [Video]

    - by d4v1dv00
    I have had this random display issue ever since Ubuntu 11.04, and now, running Ubuntu 11.10, the problem still persists. It is very hard for me to explain, so I uploaded a video to show it. Before I converted from Windows 7, this issue never happened. The symptom is so random that I cannot reproduce it or tell precisely when it will happen again. My wild guess is that it is related to the driver. Below is my detailed system information:

    $ lspci
    00:00.0 Host bridge: Intel Corporation 2nd Generation Core Processor Family DRAM Controller (rev 09)
    00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
    00:16.0 Communication controller: Intel Corporation 6 Series/C200 Series Chipset Family MEI Controller #1 (rev 04)
    00:1a.0 USB Controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #2 (rev 05)
    00:1b.0 Audio device: Intel Corporation 6 Series/C200 Series Chipset Family High Definition Audio Controller (rev 05)
    00:1c.0 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 1 (rev b5)
    00:1c.3 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 4 (rev b5)
    00:1d.0 USB Controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #1 (rev 05)
    00:1f.0 ISA bridge: Intel Corporation H67 Express Chipset Family LPC Controller (rev 05)
    00:1f.2 SATA controller: Intel Corporation 6 Series/C200 Series Chipset Family 6 port SATA AHCI Controller (rev 05)
    00:1f.3 SMBus: Intel Corporation 6 Series/C200 Series Chipset Family SMBus Controller (rev 05)
    02:00.0 Ethernet controller: Broadcom Corporation NetLink BCM57788 Gigabit Ethernet PCIe (rev 01)

    Is there any other information I need to post, and how do I get it?

  • Ubuntu, Bumblebee, and Intel problem

    - by LnxSlck
    I have a brain teaser that I can't solve. When 12.04 (64-bit) came out, I did a fresh install and then installed Bumblebee with:

    sudo add-apt-repository ppa:bumblebee/stable
    sudo apt-get install bumblebee bumblebee-nvidia

    Ubuntu recognized the Intel card as the primary card (System Settings - Details), and I could launch applications with optirun. I have:

    00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
    01:00.0 VGA compatible controller: NVIDIA Corporation GF119 [GeForce GT 520MX] (rev ff)

    Now, a few days back, I did the exact same thing: installed from the same DVD, everything the same, and installed Bumblebee the same way. NVIDIA with optirun works fine, but Ubuntu doesn't have 3D effects:

    root@deathstar:~# optirun /usr/lib/nux/unity_support_test -p
    OpenGL vendor string: NVIDIA Corporation
    OpenGL renderer string: GeForce GT 520MX/PCIe/SSE2
    OpenGL version string: 4.2.0 NVIDIA 295.40
    Not software rendered: yes
    Not blacklisted: yes
    GLX fbconfig: yes
    GLX texture from pixmap: no
    GL npot or rect textures: yes
    GL vertex program: yes
    GL fragment program: yes
    GL vertex buffer object: yes
    GL framebuffer object: yes
    GL version is 1.4+: yes
    Unity 3D supported: no

    I never did anything to install Intel drivers (except installing mesa-utils). Before, Bumblebee got everything working, but now I can't get Ubuntu to run Unity with 3D. Can someone please help me get Unity 3D working? Your help will be much appreciated.

  • Two folders in /sys/class/backlight?

    - by zebrapie
    Issue: backlight brightness does not change. More detail: brightness will not change using either 'System Settings - Screen' or the Fn keys (the brightness bar shows and moves, but the screen brightness does not change). I noticed a post in this thread (http://ubuntuforums.org/showthread.php?t=1866283) about having multiple folders in /sys/class/backlight. I have two folders too: 'intel_backlight' and 'acpi_video0'. Using the function keys alters the value in acpi_video0's 'brightness' file, but doesn't actually alter the brightness of the screen. If I add 'backlight=vendor' in GRUB, my function keys then edit the value in the intel_backlight 'brightness' file, but again this doesn't actually change the brightness of the screen. Computer: Fujitsu Siemens Pi2515, Intel integrated graphics, no HDD partition. Already tried:
    - Editing GRUB to contain: acpi_osi=Linux acpi_backlight=vendor
    - http://ubuntuguide.net/change-screen-brightness-with-fn-key-in-ubuntu-11-0410-10
    - sudo apt-get install acpi
    - sudo setpci -s 00:02.0 F4.B=20
    - Brightness does not adjust in fallback mode either.
    - Reinstalling the OS; using Linux Mint (same problem).
    - Upgrading and downgrading the BIOS.
    Many thanks for reading. I understand this problem may need a bit of a Linux pro to sort out. If anyone is up for the challenge, I'll spend any amount of time being walked through this and posting results. I don't want to give up here!
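
    One way to find out which of the two folders actually drives the panel is to write values into each interface's brightness file by hand (as root) and watch the screen. Here is a minimal sketch of that test, written in C# only to match the other examples on this page; the sysfs file names (brightness, max_brightness) are the standard ones, everything else is illustrative:

        using System;
        using System.IO;

        // Lists every backlight interface and, if asked, writes a raw value.
        // Equivalent to: echo N | sudo tee /sys/class/backlight/<dev>/brightness
        class BacklightProbe
        {
            static void Main(string[] args)
            {
                foreach (var dir in Directory.GetDirectories("/sys/class/backlight"))
                {
                    var cur = File.ReadAllText(Path.Combine(dir, "brightness")).Trim();
                    var max = File.ReadAllText(Path.Combine(dir, "max_brightness")).Trim();
                    Console.WriteLine($"{dir}: {cur}/{max}");
                }
                if (args.Length == 2)   // usage: BacklightProbe <device> <value>, run as root
                    File.WriteAllText($"/sys/class/backlight/{args[0]}/brightness", args[1]);
            }
        }

    If writing to intel_backlight changes the screen while the Fn keys only touch acpi_video0, the fix is usually pointing the ACPI layer at the interface that works.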

  • Computer Science: Arts or Science?

    - by sunpech
    Various colleges and universities may offer a degree in Computer Science as either a Bachelor of Arts or a Bachelor of Science. What differences are there between the two? Would recruiters and those who conduct interviews favor one over the other? Update: I just wanted to add this link to Joel Spolsky's site to give a better frame of reference: BA or BS in Computer Science

  • The Best Ways to Make Use of an Idle Computer

    - by Lori Kaufman
    If you leave your computer on when you are not using it, there are ways you can put your computer to use while it's sitting idle. It can do scientific research, back up your data, and even look for signs of extraterrestrial life.

  • If Computer Problems were Physical Life Events [Video]

    - by Asian Angel
    Things can be bad (and frustrating) enough when you have problems with your computer, but what if those events actually crossed over into physical reality? Note: the video contains some language that may be considered inappropriate. If Computer Problems Were Real – Awkward Spaceship [via Fail Desk]

  • How to manage two video cards on a laptop that runs Ubuntu 10.10?

    - by Marc-François Cochaux-Laberge
    I have a laptop with two video cards, one ATI and one integrated Intel. On Windows, I can choose which video card I want to use. For example, I use the Intel card for normal use, and for gaming I switch to the ATI card for better performance but a shorter battery life. In Ubuntu 10.10, only the Intel driver is installed, the ATI driver for my card doesn't work at all, and there's heat coming out of my computer all the time, like when I'm playing video games on Windows. I think both cards are active, but only the Intel one is useful. How can I solve this by making sure Ubuntu is aware of the two video cards and by disabling the ATI? Or am I wrong about all this?

  • Are two lines of push/pop code for each pre-draw state too many?

    - by Griffin
    I'm trying to simplify vector graphics state management in XNA, currently by incorporating state preservation. 2X lines of push/pop code for X states feels like too many, and it just feels wrong to have two lines of code that look identical except for one being push() and the other being pop(). The goal is to eradicate this repetitiveness, and I hoped to do so by creating an interface through which a client can pass refs to the classes/structs it wants restored after the rendering calls. Also note that many beginner programmers will be using this, so forcing lambda expressions or other advanced C# features on client code is not a good idea. I attempted to accomplish my goal by using Daniel Earwicker's Ptr class:

    public class Ptr<T>
    {
        Func<T> getter;
        Action<T> setter;
        public Ptr(Func<T> g, Action<T> s) { getter = g; setter = s; }
        public T Deref
        {
            get { return getter(); }
            set { setter(value); }
        }
    }

    an extension method:

    // doesn't work for structs, since this is just syntactic sugar
    public static Ptr<T> GetPtr<T>(this T obj)
    {
        return new Ptr<T>(() => obj, v => obj = v);
    }

    and a Push function:

    // returns a Pop action for later calling
    public static Action Push<T>(ref T structure) where T : struct
    {
        T pushedValue = structure; // copies the struct data
        Ptr<T> p = structure.GetPtr();
        return new Action(() => { p.Deref = pushedValue; });
    }

    However, this doesn't work, as stated in the code comments. How might I accomplish my goal? Example of code to be refactored:

    protected override void RenderLocally(GraphicsDevice device)
    {
        if (!(bool)isCompiled) { Compile(); }
        // TODO: make sure state settings don't implicitly delete any buffers/resources
        RasterizerState oldRasterState = device.RasterizerState;
        DepthFormat oldFormat = device.PresentationParameters.DepthStencilFormat;
        DepthStencilState oldBufferState = device.DepthStencilState;
        {
            // Rendering code
        }
        device.RasterizerState = oldRasterState;
        device.DepthStencilState = oldBufferState;
        device.PresentationParameters.DepthStencilFormat = oldFormat;
    }
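
    One common way to collapse each push/pop pair into a single line, without exposing lambdas to client code, is a disposable state scope: construction saves the device state, Dispose restores it, and a using block does the pairing. Below is a minimal sketch of that idea, assuming XNA 4.0's GraphicsDevice properties; the class name is my own, and which states it snapshots is a design choice, not a fixed list.

        using System;
        using Microsoft.Xna.Framework.Graphics;

        // Saves selected device states on construction and restores them on
        // Dispose, so one using-line replaces each matched save/restore pair.
        public sealed class DeviceStateScope : IDisposable
        {
            readonly GraphicsDevice device;
            readonly RasterizerState rasterizer;
            readonly DepthStencilState depthStencil;
            readonly BlendState blend;

            public DeviceStateScope(GraphicsDevice device)
            {
                this.device = device;
                rasterizer = device.RasterizerState;
                depthStencil = device.DepthStencilState;
                blend = device.BlendState;
            }

            public void Dispose()
            {
                device.RasterizerState = rasterizer;
                device.DepthStencilState = depthStencil;
                device.BlendState = blend;
            }
        }

    Client code then reads:

        using (new DeviceStateScope(device))
        {
            // rendering code that changes state freely
        }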

  • Computer Science Degrees and Real-World Experience

    - by Steven Elliott Jr
    Recently, at a family-reunion-type event, I was asked by a high school student how important it is to get a computer science degree in order to get a job as a programmer, versus actual programming experience. The kid has been working with Python and the Blender project, as he's into making games and the like; it sounds like he has some decent programming chops. Now, as someone who has gone through a computer science degree, my initial response to this question was to say, "You absolutely MUST get a computer science degree in order to get a job as a programmer!" However, as I thought about this, I was unsure whether my initial reaction was due in part to my own suffering as a CS student or because I feel that this is actually the case. Now, for me, I can say that I rarely use anything I learned in college, in terms of the extremely hard math, algorithms, etc., but I did come away with a decent attitude and the willingness to work through tough problems. I just don't know what to tell this kid; I feel like I should tell him to do the CS degree, but I have hired so many programmers who majored in things like English, Philosophy, and other liberal-arts-type degrees, and even some who never went to college. In fact, my best developer falls into this latter category. He got started writing software for his church or something, and then it took off into a passion. So, while I know this is one of those juicy potential downvote questions, I am just curious what everyone else thinks about this topic. Would you tell a high school kid about this? Perhaps if he/she already knows a good deal of programming and loves it, he doesn't need a CS degree and could expand his horizons with a liberal arts degree. I know one of the creators of the Django web framework was an American Literature major, and he is obviously a pretty gifted developer. Anyway, thanks for the consideration.

  • Identifying a stolen computer: getting hardware identification info from Launchpad bugs and comparing

    - by Kangarooo
    I sold my old laptop to neighbours, and it was stolen from them. I think I have found the thief, so I want to check his computer's hardware IDs and compare them to the ones in my old Launchpad bugs. What can I find in Launchpad from my bugs that would help identify the machine: the motherboard? the HDD? something else? Maybe also how to recover or find some overwritten files (because there is Windows on it now). I found that one of my Launchpad bugs has an auto-generated lspci log, from bug 682846: https://launchpadlibrarian.net/70611231/Lspci.txt, but I don't see any ID there that would identify specifically my computer; it could match many machines of the same model. Or did I miss something in there? And what commands should I use to get all the identification info on that computer in one go, fast? Just lspci? How do I get the same lspci output as in that Launchpad link? Testing lspci on my computer now, I don't get that much info. I am also searching my external HDD, where I have many backups, and maybe I have an lspci result there. What keywords would help me search for the short and full reports I may have made? I might have done sudo lshw > somefilename.
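
    Serial numbers, not PCI device lists, are what distinguish one machine of a model from another. On Linux they are exposed under /sys/class/dmi/id (the serial files need root to read), and disk serials show up in ls -l /dev/disk/by-id. Here is a small sketch that prints the DMI identifiers, in C# to match the other examples on this page; the sysfs file names are the standard ones, the program name is illustrative:

        using System;
        using System.IO;

        // Prints the motherboard and product identifiers Linux exposes via DMI.
        class HardwareId
        {
            static void Main()
            {
                const string dmi = "/sys/class/dmi/id";
                foreach (var name in new[] { "board_vendor", "board_name",
                                             "board_serial", "product_serial" })
                {
                    var path = Path.Combine(dmi, name);
                    if (File.Exists(path))
                        Console.WriteLine($"{name}: {File.ReadAllText(path).Trim()}");
                }
            }
        }

    Note that a standard apport/Launchpad lspci attachment generally contains device models, not serials, so it can narrow the search to a model but rarely prove identity on its own.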

  • OpenGL - have object follow mouse

    - by kevin james
    I want to have an object follow my mouse around the screen in OpenGL. (I am also using GLEW, GLFW, and GLM.) The best idea I've come up with is: get the coordinates within the window with glfwGetCursorPos. The window was created with

    window = glfwCreateWindow(1024, 768, "Test", NULL, NULL);

    and the code to get the coordinates is

    double xpos, ypos;
    glfwGetCursorPos(window, &xpos, &ypos);

    Next, I use GLM's unProject to get the coordinates in "object space":

    glm::vec4 viewport = glm::vec4(0.0f, 0.0f, 1024.0f, 768.0f);
    glm::vec3 pos = glm::vec3(xpos, ypos, 0.0f);
    glm::vec3 un = glm::unProject(pos, View*Model, Projection, viewport);

    There are two potential problems I can already see. The viewport is fine, as the x, y coordinates of the lower left are indeed 0, 0, and it's indeed a 1024x768 window. However, the position vector I create doesn't seem right. The z coordinate should probably not be zero. However, glfwGetCursorPos returns 2D coordinates, and I don't know how to go from there to the 3D window coordinates, especially since I am not sure what the third dimension of the window coordinates even means (since computer screens are 2D). Then, I am not sure if I am using unProject correctly. Assume the View, Model, and Projection matrices are all OK. If I passed in the correct position vector in window coordinates, does the unProject call give me the coordinates in object space? I think it does, but the documentation is not clear. Finally, for each vertex of the object I want to follow the mouse around, I just increment the x coordinate by un[0], the y coordinate by -un[1], and the z coordinate by un[2]. However, since the position vector being unprojected is likely wrong, this is not giving good results; the object does move as my mouse moves, but it is offset quite a bit (i.e. moving the mouse a lot doesn't move the object that much, and the z coordinate is very large). I actually found that the z coordinate un[2] is always the same value no matter where my mouse is, probably because the position vector I pass into unProject always has a z value of 0.0. Edit: the (incorrectly) unprojected x-values range from about -0.552 to 0.552, and the y-values from about -0.411 to 0.411.
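
    Two things usually bite here. First, GLFW reports the cursor with a top-left origin while the GL viewport origin is bottom-left, so y must be flipped (ypos = 768 - ypos) before unprojecting. Second, the window-space z is the depth-buffer value in [0, 1]: z = 0 unprojects onto the near plane, which is why un[2] never changes; you either read the depth under the cursor (glReadPixels with GL_DEPTH_COMPONENT) or intersect the resulting ray with a plane of your choosing. The sketch below shows the math glm::unProject performs, written in C# with System.Numerics to match the other examples on this page (so the matrix convention is row-vector, unlike GLM's column-vector style); all names are illustrative.

        using System.Numerics;

        static class Unprojector
        {
            // Window coords (x right, y down, depth in [0,1]) -> world space.
            public static Vector3 WindowToWorld(
                float winX, float winY, float depth,
                Matrix4x4 viewProjection,   // combined view * projection matrix
                float vpWidth, float vpHeight)
            {
                // Window -> normalized device coordinates in [-1, 1].
                // The y flip converts a top-left mouse origin to GL's bottom-left.
                float ndcX = 2f * winX / vpWidth - 1f;
                float ndcY = 1f - 2f * winY / vpHeight;
                float ndcZ = 2f * depth - 1f;

                Matrix4x4.Invert(viewProjection, out Matrix4x4 inverse);
                Vector4 world = Vector4.Transform(new Vector4(ndcX, ndcY, ndcZ, 1f), inverse);

                // Perspective divide: w is not 1 after an inverse projection.
                return new Vector3(world.X, world.Y, world.Z) / world.W;
            }
        }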

  • Problems reviving an old PC, including a graphics card issue

    - by Mick
    I have a PC that seemed to die years ago that I am trying to revive. It has a dual-core Athlon processor and a Gigabyte motherboard. It had two dual-output graphics cards, and I have long since forgotten which output would print the diagnostic information as the PC starts up. I also suspect that the resolution set on all the monitors was probably higher than my current single monitor is capable of displaying. The motherboard also has a built-in graphics card, so I thought it might be simplest to remove both graphics cards and plug my monitor into the onboard graphics just while I get things going. Does that seem sensible? Now the other problem: the PC has two hard drives, and I have no idea which one is the primary one it attempts to boot from. When I power up, the fan comes on and I hear some chugga-chugga-pause, chugga-chugga-pause, repeating indefinitely. I'm not sure which device is making the noise. There are no beeps at any time, and I see nothing on the screen at any time, not even for a second. Any suggestions? Edit: if I start up the PC without the power connected to the CD-ROM, there is no chugga-chugga noise.

  • Turing Machine & Modern Computer

    - by smwikipedia
    I have heard a lot that modern computers are based on the Turing machine. I'd like to share my understanding and hear your comments. I think the computer is a big general-purpose Turing machine, and each program we write is a small specific-purpose Turing machine. The classical Turing machine does its job based on its input and its current internal state, and so do our programs. Let's take a running program (a process) as an example. We know that in the process's address space there are areas for the stack, the heap, and the code. A classical Turing machine doesn't have the ability to remember many things, so we borrow the concept of a stack from the push-down automaton. The heap and stack areas contain the state of our specific-purpose Turing machine (our program); the code area represents the logic of this small Turing machine; and the various I/O devices supply input to it. The above is my naive understanding of the working paradigm of the modern computer. I can't wait to hear your comments. Thanks very much.
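
    To make the analogy concrete, here is a minimal sketch of a classical Turing machine in C#; the machine encoding and the unary-increment example are my own illustrative choices. The transition table plays the role of the code area, while the tape contents, head position, and current state together are the machine's entire "memory".

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class TuringMachine
        {
            // (state, symbol) -> (next state, symbol to write, head move: -1 left, +1 right)
            readonly Dictionary<(string, char), (string, char, int)> rules;

            public TuringMachine(Dictionary<(string, char), (string, char, int)> rules)
            {
                this.rules = rules;
            }

            public string Run(string input, string state, string haltState)
            {
                var tape = new Dictionary<int, char>(); // unbounded tape, '_' = blank
                for (int i = 0; i < input.Length; i++) tape[i] = input[i];
                int head = 0;

                while (state != haltState)
                {
                    char symbol = tape.TryGetValue(head, out char c) ? c : '_';
                    var (next, write, move) = rules[(state, symbol)];
                    tape[head] = write;
                    state = next;
                    head += move;
                }
                return new string(tape.OrderBy(p => p.Key).Select(p => p.Value).ToArray());
            }
        }

        class Demo
        {
            static void Main()
            {
                // Unary increment: skip over the 1s, append one more 1, halt.
                var rules = new Dictionary<(string, char), (string, char, int)>
                {
                    [("scan", '1')] = ("scan", '1', +1),
                    [("scan", '_')] = ("halt", '1', 0),
                };
                Console.WriteLine(new TuringMachine(rules).Run("111", "scan", "halt")); // prints 1111
            }
        }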

  • Learning computer architecture as a programmer

    - by Samaursa
    I typically run across gurus at SO and other places (instructors, book authors, etc.) who say things along the lines of "this will cause alignment issues" or other low-level tidbits. I want to learn about all these tidbits that are relevant to programming. Now, usually when I see low-level books (computer architecture books, for example), they are too low-level and geared towards people whose primary area of interest is computer architecture rather than software design. Do you have recommendations for books that go through the low-level stuff that is relevant to programmers?

  • Graphics driver problem, ATI Radeon HD 3200: small screen size and everything slows down

    - by Arvind Jangid
    Regards. I am using a 2009 Compaq Presario CQ40-415AU notebook:
    - AMD Athlon X2 dual-core processor, 2.1 GHz, 1024 KB L2 cache
    - 3 GB DDR2 RAM
    - ATI Radeon HD 3200 Graphics, 256 MB
    - 14-inch widescreen at 1280x800
    I installed Ubuntu 12.04 LTS 32-bit on my laptop. It worked brilliantly until I installed the graphics driver. When I installed the driver, the graphics became slow: everything slowed down, and even the splash screen resolution changed to something like 640x480. I have liked Ubuntu since 9.10, for the freedom it provides and its versatility, but the graphics problem remains the same. I installed Ubuntu on a 50 GB partition with a 6 GB swap partition; my HDD is 320 GB. Please tell me what is wrong.

  • Computer vision algorithms (how is this possible?)

    - by Maxim Gershkovich
    I recently stumbled across a company that has created what appears to be computer vision technology capable of detecting shoplifting automatically and alerting its users. LINK. Watching some of the videos and examples provided by the company has left me completely baffled and amazed as to how on earth they may have achieved this functionality. I understand that no one here will be able to tell me exactly how it was done, but is anyone aware of (and could point me to) research in this field, or alternatively provide details as to how something like this could be implemented, or guidance on where one might start? My understanding was that computer vision algorithms were many years away from being this sophisticated. Is this sort of application really possible? Is anyone willing to hazard a guess at how they achieved it?
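
    Systems like this are typically pipelines rather than one algorithm: background subtraction to find moving pixels, person detection and tracking, then a classifier over the resulting motion tracks. As a toy illustration of the very first stage only, here is a hedged C# sketch of frame differencing over grayscale frames stored as byte arrays; everything about it (names, the threshold) is illustrative and is certainly not the company's actual method.

        using System;

        static class MotionDetector
        {
            // Fraction of pixels whose grayscale intensity changed by more than
            // `threshold` between two same-sized frames. A spike in this value
            // is the crudest possible "something moved here" signal, on which
            // real systems layer detection, tracking, and behaviour analysis.
            public static double ChangedFraction(byte[] previous, byte[] current, int threshold = 25)
            {
                if (previous.Length != current.Length)
                    throw new ArgumentException("frames must be the same size");

                int changed = 0;
                for (int i = 0; i < current.Length; i++)
                    if (Math.Abs(current[i] - previous[i]) > threshold)
                        changed++;

                return (double)changed / current.Length;
            }
        }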

  • Master's degree in MIS for a computer science student

    - by tnhan07
    I'm a junior student in computer science. After taking half of my major-related courses, I found that I don't like the theoretical side of IT. As a result, I decided that I would devote my career to CIS/MIS because it is more interesting to me. However, some veteran programmers on this forum have said that a strong computer science foundation helps a lot in CIS. Therefore, I think it's better for me to complete my CS degree and then get a Master's degree in MIS than to take a minor in MIS. After some internet searching, I found that the top universities (within my reach) offering a Master's degree in CIS/MIS are all business schools. Is there any obstacle for a CS student like me, who lacks business knowledge, studying at these schools? Do you have any advice for me?

  • Good intro to Computer Science book for a front-end developer

    - by Squirkle
    I am a JavaScript developer/architect who, like many developers these days, did not come from a Computer Science background (I studied Philosophy at a liberal arts college), but instead learned development by actually building applications, and by reading books explaining language grammars, design patterns, and best practices. I have never felt that my ignorance of CS concepts has hurt my ability to build great apps or find employment. Recently, however, I have felt the itch to grow in this direction. Do you have any suggestions for some good introductory CS resources/books? I know that Computer Science is a huge field and my question is very general, but I am looking for a 101-type survey of the high-level concepts, from which I can branch off into more specific areas of study. Thanks!

  • Are VM-based languages becoming viable for Graphics since the move to GPU computing?

    - by skiwi
    Perhaps the title is not the clearest, so let me elaborate: by VM-based languages I mean languages that run on a virtual machine, such as Java on the JVM, or C#. Also, I am talking about 3D graphics, just to be clear. Lately the trend has been that most computing is done on the GPU rather than the CPU, and the long-standing issue with programming games in a VM-based language is that garbage collection may happen at any time. So let's look at which part is responsible for what:
    - Showing the graphics: GPU
    - Uploading graphics to the GPU: CPU? (Does this need to be done every frame?)
    - Calculating physics constraints: GPU
    - Doing the real game logic (determining when to move objects independently of physics calculations, processing AI): CPU
    Is my list actually correct? And if it is, is, for example, Java becoming more viable? Or is uploading the graphics (vertices) still the most expensive operation? I would like to get more insight into this.

  • Does running Nexuiz put extra pressure on the processor if you don't have a dedicated graphics card?

    - by Curious Apprentice
    It's a rather stupid question, though I want to be sure: can having a dedicated graphics card lower the stress on the processor? What kinds of graphics cards does Ubuntu support? I'm planning to buy a graphics card for Windows 7, as I have started learning Adobe Premiere Pro. Which card should I buy? Should I consider the card itself, or the availability of drivers for the card on Ubuntu Linux? And if I install a graphics card but do not install its drivers, can I leave it unused on Ubuntu? I don't think there's much need for a graphics card on Ubuntu, though.

  • Is Ubuntu recognizing and/or using my NVIDIA graphics card?

    - by user212860
    This is my first post here, and I'm pretty new to Ubuntu/Linux. I currently have no OS other than Ubuntu 13.10 (I had Win7 until I got a new terabyte hard drive). My current PC build, if any of this helps:
    - CPU: Intel i5 quad-core
    - Graphics: NVIDIA GeForce GTX 650
    - RAM: 8 GB
    - HDD: 1 TB SATA 3
    - Motherboard: MSI Z77A-G41
    - OS: Ubuntu 13.10
    So I recently installed Ubuntu 13.10 and put Steam on it, and I'm seeing that my games run a lot slower than they did on Win7. I figured it was a graphics problem, so I checked System Settings - Details - Overview. It says under "Graphics" that I have "Gallium 0.4 on NVE7" (I don't really know what that is). Does this mean that Ubuntu is not using my graphics card? In System Settings - Software & Updates - Additional Drivers, it clearly shows:
    NVIDIA Corporation: GK107 [GeForce GTX 650] - This device is using an alternative driver
    (and then it shows a list of drivers I can switch between). So this is a bit confusing. Software & Updates clearly shows that my NVIDIA card is installed and that a driver is selected for it, but System Settings shows some Gallium 0.4 thing. I did a bit of research and ended up typing the command "lspci | grep VGA" in the terminal. It showed this in response:
    VGA compatible controller: NVIDIA Corporation GK107 [GeForce GTX 650] (rev a1)
    The terminal seems to recognize my graphics card. What it looks like to me is that I don't have the proper driver and might be using my CPU's integrated graphics. When I switch between the drivers in that list, System Settings still does not show my card, and some of the drivers give me an OpenGL error when I try to run a game. It might just be that my games run slowly because the developers have not optimized them well for Ubuntu, but that still doesn't change the fact that System Settings is not showing my NVIDIA card.
    TL;DR version: How do I know if my video card is being recognized and used? If it is not being used, what is the best way to fix that? Please make your answers easy to understand; I don't mind wordy responses, as long as I can follow what you're saying. Any help would be greatly appreciated! Thanks, Jabber5
