Search Results

Search found 8831 results on 354 pages for 'intel graphics'.


  • Quartz composition created in Snow Leopard (10.6) doesn't work in Leopard (10.5) despite passing the runtime test

    - by adib
    Hi, I have a reasonably advanced (many patches and subpatches) Quartz composition that was created in Snow Leopard but doesn't run well in Leopard: many elements are not rendered. The composition tested OK via Quartz Composer's Test in Runtime option for both Leopard 32-bit and Leopard 64-bit (menu item "File | Test in Runtime | Leopard 32-bits"). On an actual Leopard (32-bit) system, however, a lot of elements are not rendered. Below is an excerpt from the log when the composition is run in QuickTime Player under Leopard:

        QuickTime Player[134] *** <QCNodeManager | namespace = "com.apple.QuartzComposer" | 335 nodes>: Patch with name "/units to pixels" is missing
        QuickTime Player[134] *** Message from <QCPatch = 0x06D82880 "(null)">: Cannot create node of class "/units to pixels" and identifier "(null)"
        QuickTime Player[134] *** Message from <QCPatch = 0x06D7C130 "(null)">: Cannot create node of class "/resize image to target" and identifier "(null)"
        QuickTime Player[134] *** Message from <QCPatch = 0x06D7C130 "(null)">: Cannot create connection from ["outputValue" @ "Math_1"] to ["Target_Pixels" @ "Patch_2"]

    The patch "units to pixels" is a system patch in Snow Leopard, whereas "resize image to target" is a custom virtual patch located in my home directory. It seems we can rule out the composition referencing a missing virtual patch: I tested the composition under another user's account and it ran fine, which shows it already embeds the "resize image to target" virtual patch from my home directory. I'm really puzzled why the composition passes the Leopard runtime test yet fails to run on an actual Leopard OS. Is there a post-processing step I need to run on the composition file? Is there any way to make this composition more compatible with Leopard? Thanks in advance.

    Read the article

  • Which are the best tools for graphic design?

    - by Jen
    Hello, I want to take up graphic design as my profession. I would be designing logos, icons, stationery, brochures, handouts, book covers, etc. But I am thoroughly confused as to which tools are best, and which books/resources will help me learn those tools and graphic design like a professional. I am ready to shell out money to purchase the resources. Please help me out! Thanks, Jen

    Read the article

  • CATiledLayer blanking tiles before drawing contents

    - by Greg Plesur
    All, I'm having trouble getting the behavior I want from CATiledLayer. Is there a way to trigger the tiles to redraw without the side effect that their areas are cleared to white first? I've already subclassed CATiledLayer so that fadeDuration returns 0. To be more specific, here are the details of what I'm seeing and what I'm trying to achieve:

    - I have a UIScrollView with a big content size, roughly 12000x800. Its content view is a UIView backed by a CATiledLayer.
    - The UIView is rendered with a lot of custom-drawn lines.
    - Everything works fine, but the contents of the UIView sometimes change. When that happens, I'd like to redraw the tiles as seamlessly as possible.
    - When I use setNeedsDisplay on the view, the tiles redraw, but they are first cleared to white and there's a fraction-of-a-second delay before the new content is drawn.

    The behavior I want seems like it should be possible: when you zoom in on the scroll view and the content is redrawn at a higher resolution, there's no blanking before the redraw; the new content is drawn right on top of the old. That's what I'm looking for. Thanks; I appreciate your ideas.

    Update: Just to follow up - I realized that the tiles weren't being cleared to white before the redraw; they were being removed entirely. The white I was seeing is the color of the view beneath my CATiledLayer-backed view. As a quick hack/fix, I put a UIImageView beneath the UIScrollView; before triggering a redraw of the CATiledLayer-backed view, I render its visible section into the UIImageView and let it show. This smooths out the redraw significantly. If anyone has a better solution, like keeping the redraw-targeted tiles from disappearing before being redrawn in the first place, I'd still love to hear it.
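
    For reference, a minimal sketch of the fadeDuration-zero subclass described above (the class names NonFadingTiledLayer and TiledContentView are illustrative, not from the original post):

        #import <UIKit/UIKit.h>
        #import <QuartzCore/QuartzCore.h>

        // CATiledLayer cross-fades fresh tiles in over fadeDuration;
        // returning 0 makes new tiles appear immediately.
        @interface NonFadingTiledLayer : CATiledLayer
        @end

        @implementation NonFadingTiledLayer
        + (CFTimeInterval)fadeDuration { return 0.0; }
        @end

        // The content view vends the subclass as its backing layer.
        @interface TiledContentView : UIView
        @end

        @implementation TiledContentView
        + (Class)layerClass { return [NonFadingTiledLayer class]; }
        @end

    As the update notes, this removes the cross-fade but not the blanking itself, since the old tiles are discarded before the new ones are drawn.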

    Read the article

  • How to fill a path with gradient in drawRect:?

    - by Derrick
    Filling a path with a solid color is easy enough:

        CGPoint aPoint;
        for (id pointValue in points) {
            aPoint = [pointValue CGPointValue];
            CGContextAddLineToPoint(context, aPoint.x, aPoint.y);
        }
        [[UIColor redColor] setFill];
        [[UIColor blackColor] setStroke];
        CGContextDrawPath(context, kCGPathFillStroke);

    I'd like to draw a gradient instead of solid red, but I am having trouble. I've tried the code listed in this question/answer: http://stackoverflow.com/questions/422066/gradients-on-uiview-and-uilabels-on-iphone which is:

        CAGradientLayer *gradient = [CAGradientLayer layer];
        [gradient setFrame:rect];
        [gradient setColors:[NSArray arrayWithObjects:(id)[[UIColor blueColor] CGColor], (id)[[UIColor whiteColor] CGColor], nil]];
        [[self layer] setMasksToBounds:YES];
        [[self layer] insertSublayer:gradient atIndex:0];

    However, this paints the entire view with the gradient, covering up my original path.
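
    The usual Core Graphics approach (sketched here as an assumption, not taken from the original post) is to clip the context to the path and then draw a CGGradient, which confines the gradient to the path's interior. Note that CGContextClip consumes the current path, so the stroke must happen before the clip or the path must be re-added afterwards:

        // Sketch, assuming `context` and `rect` from drawRect: and the
        // path already built in the context as above.
        CGContextSaveGState(context);
        CGContextClip(context);                // restrict drawing to the path

        CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
        NSArray *colors = [NSArray arrayWithObjects:
            (id)[[UIColor redColor] CGColor],
            (id)[[UIColor blackColor] CGColor], nil];
        CGGradientRef gradient =
            CGGradientCreateWithColors(space, (CFArrayRef)colors, NULL);

        // Vertical gradient spanning the view's rect.
        CGContextDrawLinearGradient(context, gradient,
            CGPointMake(CGRectGetMidX(rect), CGRectGetMinY(rect)),
            CGPointMake(CGRectGetMidX(rect), CGRectGetMaxY(rect)), 0);

        CGGradientRelease(gradient);
        CGColorSpaceRelease(space);
        CGContextRestoreGState(context);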

    Read the article

  • Problem drawing lines with Quartz 2D when alpha < 1.0 on iPhone

    - by The Khanh
    Hello everybody! This is the code I use to draw in my app. The problem: if I draw with alpha = 1 the result is very good, but if I change alpha to 0.2 the paint looks bad. How can I make it look better with alpha = 0.2? http://www.flickr.com/photos/9601621@N05/page1/ (Draw with alpha = 1: it is good. Draw with alpha = 0.2: it is bad.)

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            if ([self.view superview] && (headerView.frame.origin.y == -30)) {
                mouseSwiped = YES;
                UITouch *touch = [touches anyObject];
                CGPoint currentPoint = [touch locationInView:self.view];
                currentPoint.y -= 20;
                UIGraphicsBeginImageContext(self.view.frame.size);
                CGContextRef context = UIGraphicsGetCurrentContext();
                [drawImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
                CGContextSetLineCap(context, kCGLineCapRound);
                CGContextSetLineWidth(context, currentBrushProperty.brushSize);
                CGContextSetRGBStrokeColor(context, [self red], [self green], [self blue], currentBrushProperty.brushTransparency);
                // note: the next call overrides the brush settings above with opaque red
                CGContextSetRGBStrokeColor(context, 1.0, 0.0, 0.0, 1.0);
                CGContextBeginPath(context);
                CGContextMoveToPoint(context, lastPoint.x, lastPoint.y);
                CGContextAddLineToPoint(context, currentPoint.x, currentPoint.y);
                CGContextStrokePath(context);
                drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
                UIGraphicsEndImageContext();
                lastPoint = currentPoint;
            }
        }

    Help me, please.
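
    One plausible cause (an observation added here, not from the original post): each segment is stroked as its own path, so the round end caps of consecutive segments overlap, and at alpha 0.2 the overlapping strokes stack darker at every joint. A hedged sketch of the usual fix is to accumulate one continuous path in a hypothetical ivar strokePath and stroke it in a single operation; Quartz paints a single stroke as one region, so self-overlaps within it don't compound:

        // Hypothetical ivar, created in touchesBegan:
        //     CGMutablePathRef strokePath = CGPathCreateMutable();
        if (CGPathIsEmpty(strokePath))
            CGPathMoveToPoint(strokePath, NULL, lastPoint.x, lastPoint.y);
        CGPathAddLineToPoint(strokePath, NULL, currentPoint.x, currentPoint.y);

        // Redraw the whole path in one stroke instead of one segment per
        // touch: overlapping caps then blend as a single 0.2-alpha stroke
        // rather than stacking on top of each other.
        CGContextAddPath(context, strokePath);
        CGContextStrokePath(context);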

    Read the article

  • Draw a parallel line

    - by VOX
    I have x1,y1 and x2,y2, which form a line segment. How can I get another line x3,y3 - x4,y4 that is parallel to the first, as in the picture? I could simply add n to x1 and x2 to get a parallel line, but that is not what I want: I want the lines to be parallel the way they are in the picture.
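
    A sketch of the standard construction (the function name and signature are illustrative): offset both endpoints along the segment's unit normal, so the new line sits at a fixed perpendicular distance d whatever the segment's slope:

        #include <math.h>

        /* Offsets segment (x1,y1)-(x2,y2) by perpendicular distance d;
           assumes the segment has nonzero length. A negative d offsets
           to the other side. */
        void parallelOffset(double x1, double y1, double x2, double y2,
                            double d,
                            double *x3, double *y3, double *x4, double *y4)
        {
            double dx = x2 - x1, dy = y2 - y1;
            double len = sqrt(dx * dx + dy * dy);
            double nx = -dy / len, ny = dx / len;   /* unit normal */
            *x3 = x1 + d * nx;  *y3 = y1 + d * ny;
            *x4 = x2 + d * nx;  *y4 = y2 + d * ny;
        }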

    Read the article

  • gluLookAt vectors and FPS-style camera

    - by Kevin Pamplona
    I am attempting to implement an FPS-style camera by updating three vectors: EYE, DIR, UP. These are the same vectors used by gluLookAt (since gluLookAt is specified by the position of the camera, the direction it is looking in, and an up vector). I have already implemented the left-right and up-down strafing movements, but I'm having a lot of trouble understanding the math behind making the camera look around while remaining stationary. In this case the EYE vector stays the same, while I must update DIR and UP. Below is the code I tried, but it doesn't seem to work properly. Any suggestions? Thanks.

        void Transform::left(float degrees, vec3& dir, vec3& up) {
            vec3 axis = glm::normalize(up);
            mat3 R = rotate(-degrees, axis);
            dir = R*dir;
            dir = R*up;   // note: this assigns the rotated up to dir, clobbering the line above
        }

        void Transform::up(float degrees, vec3& dir, vec3& up) {
            vec3 axis = glm::normalize(glm::cross(dir, up));
            mat3 R = rotate(-degrees, axis);
            dir = R*dir;
            up = R*up;
        }
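
    For reference, a plain-C sketch of the axis-angle rotation (Rodrigues' formula) that the rotate(degrees, axis) helper above presumably implements; the vec3 struct here is illustrative:

        #include <math.h>

        typedef struct { float x, y, z; } vec3;

        /* Rotates v by `degrees` about the unit axis a:
           v' = v cos(t) + (a x v) sin(t) + a (a . v)(1 - cos(t)) */
        vec3 rotateAboutAxis(vec3 v, vec3 a, float degrees) {
            float t = degrees * (float)M_PI / 180.0f;
            float c = cosf(t), s = sinf(t);
            float dot = a.x*v.x + a.y*v.y + a.z*v.z;
            vec3 cr = { a.y*v.z - a.z*v.y,
                        a.z*v.x - a.x*v.z,
                        a.x*v.y - a.y*v.x };
            vec3 out = {
                v.x*c + cr.x*s + a.x*dot*(1.0f - c),
                v.y*c + cr.y*s + a.y*dot*(1.0f - c),
                v.z*c + cr.z*s + a.z*dot*(1.0f - c),
            };
            return out;
        }

    Note that rotating up about itself (as in Transform::left) leaves it unchanged, which is why only dir needs updating there.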

    Read the article

  • Shape tool creates a vector mask every time; cannot seem to fix it in CS3

    - by Bryan
    Every time I create a shape using the shape tool, it places a vector mask on top of it. I don't know how I enabled this, but it does not happen on my laptop copy, only on my desktop, and I can't seem to disable it. Even reinstalling and restoring defaults did not stop it. Very frustrating; does anyone have a fix for this problem? Thanks in advance!

    Read the article

  • Draw 2 parallel lines

    - by Ben Martin
    How can I calculate the points to draw two parallel lines? I know the start and end points for the centre of the parallel lines. To make things a little harder, it needs to support straight and Bezier curved lines.
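
    For the straight case this is the same normal-offset construction shown in the earlier parallel-line question. For the Bezier case the exact offset of a Bezier is not itself a Bezier, so a common approximation (a sketch; the Pt struct and function names are illustrative) is to sample the centre curve and push each sample out along the local unit normal, +d and -d for the two sides:

        #include <math.h>

        typedef struct { double x, y; } Pt;

        /* Cubic Bezier point: B(t) for control points p0..p3. */
        static Pt bezierPoint(Pt p0, Pt p1, Pt p2, Pt p3, double t) {
            double u = 1.0 - t;
            Pt r = { u*u*u*p0.x + 3*u*u*t*p1.x + 3*u*t*t*p2.x + t*t*t*p3.x,
                     u*u*u*p0.y + 3*u*u*t*p1.y + 3*u*t*t*p2.y + t*t*t*p3.y };
            return r;
        }

        /* Cubic Bezier derivative B'(t); its normal gives the offset direction. */
        static Pt bezierTangent(Pt p0, Pt p1, Pt p2, Pt p3, double t) {
            double u = 1.0 - t;
            Pt r = { 3*u*u*(p1.x-p0.x) + 6*u*t*(p2.x-p1.x) + 3*t*t*(p3.x-p2.x),
                     3*u*u*(p1.y-p0.y) + 6*u*t*(p2.y-p1.y) + 3*t*t*(p3.y-p2.y) };
            return r;
        }

        /* Fills left[] and right[] with n >= 2 samples offset by +/- d;
           assumes the tangent never vanishes on the curve. */
        void offsetCurves(Pt p0, Pt p1, Pt p2, Pt p3, double d,
                          Pt *left, Pt *right, int n)
        {
            for (int i = 0; i < n; i++) {
                double t = (double)i / (n - 1);
                Pt c = bezierPoint(p0, p1, p2, p3, t);
                Pt tg = bezierTangent(p0, p1, p2, p3, t);
                double len = sqrt(tg.x*tg.x + tg.y*tg.y);
                Pt nrm = { -tg.y/len, tg.x/len };      /* unit normal */
                left[i]  = (Pt){ c.x + d*nrm.x, c.y + d*nrm.y };
                right[i] = (Pt){ c.x - d*nrm.x, c.y - d*nrm.y };
            }
        }

    The sampled points can then be drawn as polylines or re-fitted to Bezier segments.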

    Read the article

  • Intel Matrix Storage Manager not showing on Asus P5W DH Deluxe?

    - by Leon
    I have set, under "Main" - "IDE Configuration", "Configure SATA as" to "RAID" and "Onboard Serial-ATA BootRom" to "Enabled", but on POST I still do not see the Intel Matrix Storage Manager screen where I can press Ctrl+I to set up my RAID. I have the latest BIOS version. EDIT: Although I had set the ROM to "Enabled", I then went and reset the BIOS settings to default. I then set the BootRom setting to "Disabled", restarted, and set it to "Enabled" again, and that seemed to work.

    Read the article

  • When do Intel/AMD plan to use new CPU sockets? [closed]

    - by psihodelia
    It is always expensive to use the most modern hardware, especially buying a new mainboard when only a new CPU is desired. It would be much better if one knew whether and when the major CPU producers plan to change CPU sockets. Do you know when the sockets are next planned to change? I am particularly interested in not buying an Intel i7 CPU if a new CPU with an incompatible socket will be released soon.

    Read the article

  • Professional graphics cards a "must" for rendering static environments?

    - by Imhotep
    I'm not sure the title is clear, so in more words: I'm building a PC for a decorator whose main work is rendering photorealistic images of house interiors. For that she uses 3dsMax and AutoCAD with Accurender, plus Photoshop. Is there a need for a professional graphics card like the Quadro or FireGL series? Do these cards offer any improvement in rendering time, or are they only useful for real-time rendering?

    Read the article

  • Checking if Intel VT-x acceleration is enabled from inside a VMware virtual machine?

    - by user269950
    My (Fortune 500) company just rolled out new VMs and everyone is complaining they are dog slow. Is there any way I could verify, from inside a VM, whether Intel virtualization (VT-x) acceleration has been properly enabled? The processor claims to be a Xeon E7-2830 but the experience has been more like a first-gen Atom. I'd ask IT directly but I get the impression they're unlikely to respond to any suggestion that they are, in fact, drooling imbeciles.

    Read the article

  • Can a PCI graphics card and AGP be used together?

    - by Everyone
    The question pretty much says it all. I use an old 845GBV board, to date reliant upon the integrated graphics processor; all slots on the board are unused. Lately I've been thinking of changing to a dual-monitor setup so that I can use one screen for documentation/help/sample code/whatever and the other to play with code. Assuming this board can handle a PCI GPU, can an AGP 4x card coexist with a PCI GPU?

    Read the article

  • Regarding compatibility of Intel Pentium D 805 CPU with new motherboard

    - by aniruddhabhide
    I currently have an old configuration with an Intel Pentium D 805 CPU and Intel D101GGC chipset. Now I am planning to upgrade my system except for the CPU and hard disk, since those don't fit in the budget. QUESTION: I am planning to get a Gigabyte GA-B75M-D3H motherboard, which has an LGA1155 socket, but my processor has a PLGA775 socket. Will my CPU fit in the new motherboard's socket? LINKS: CPU specs (Intel site): http://ark.intel.com/products/27511/Intel-Pentium-D-Processor-805-2M-Cache-2_66-GHz-533-MHz-FSB New motherboard specs (vendor site): http://www.flipkart.com/gigabyte-ga-b75m-d3h-motherboard/p/itmdacp36gegyeqt?pid=MBDDACP2GUBGFPFM

    Read the article

  • Graphics trouble after resuming from hibernate or suspend

    - by Voyagerfan5761
    I have a Dell Inspiron 2650 (with NVidia graphics, using nouveau drivers) that I'm using to try out Ubuntu. It's all great, except that Hibernate and Suspend aren't usable. Yes, I know that questions about power-save issues are rampant in the Linux support universe, but it seems that every time I find a solution it's for a very specific hardware combination and doesn't apply to me. So anyway, here goes.

    When I resume from either power-saving mode, I'll get graphics problems anywhere on the range from a few scattered random-colored pixels that won't change, all the way to full-screen patterns that don't change as I move the mouse, hit keys on the keyboard, or even bring up the shutdown dialog using the power button. Those full-screen issues (which may involve stripes with random pixels, partial black screen, or both) always end in me forcing the machine to shut down by holding the power button. I haven't done much testing yet to determine what severity level is most commonly associated with each mode, but I do avoid using either power-save option because of these issues. I'll add info on my hardware as I can gather it (no home internet connection, and this laptop is tethered to my desk by a dead battery and casing degradation). Please feel free to request something specific in the question comments.

    Hardware Info: See this hardinfo report for my system's hardware configuration. (No, my username is not "myuser"; I sanitized hardinfo's output before publishing it.)

    Screenshots: These screenshots are from a relatively mild occurrence, which happened after the second hibernation I took that session. The first one worked great, though I used the wireless card and Firefox heavily between the two hibernation attempts. Take a look at what happened when I opened my home directory in Nautilus and scrolled it. See below for the situations I've tested so far. The real trouble comes when the machine resumes to an unusable state; in such cases I can't even unlock the screen or properly reboot, much less take a screenshot. I have a hunch that putting a CD in the drive will cause such major failures, and I will try that at some point; see related question.

    Situations Tested

    Maverick (10.10)

    Suspend:
    - Seems to suspend nicely with nothing running
    - Seems to suspend nicely with flash drive plugged in
    - On resume from suspend with no flash drive, Terminal and gedit running: funky graphics on top of log output, then blank screen with pixelated cursor; no response to power button (normally will shut down 60 seconds later)

    Hibernate:
    - Seems to hibernate nicely with nothing running
    - Seems to hibernate nicely with a few apps (Terminal, Mouse preferences) running
    - Seems to not hibernate when flash drive plugged in
    - Seems to not hibernate when System Monitor is running
    - Have encountered failed hibernation (after several hours and one successful hibernate/thaw cycle) with no external media connected and no programs running except normal background stuff

    Natty LiveCD (11.04_2010-12-22)

    When I tested it, Natty wouldn't stay logged in. It played part of the login sound and then [ OK ] appeared in the top right corner (white-on-black terminal text) for a few seconds. Then it kicked me back to the Unlock screen. It did that four times before I gave up and just tested suspend from the Unlock screen.

    Suspend:
    - Resumed to vertical gray and black lines 2px (?) wide, then shifted to vertical "jail bars" of black over a black screen with the above-described random pixels and mouse pointer. No apparent response to input from the mouse (clicking randomly); keyboard and touchpad unrecognized.

    Read the article

  • Unity stuck in 2D mode, Nvidia Quadro graphics "unknown", Nvidia-Current active but not in use

    - by Jordan Lund
    I've seen this problem reported under several questions, but I haven't been able to resolve any of it, so I thought I'd bring it all in under one umbrella. I started a new job and was given a Dell Precision M6400 laptop with an Nvidia Quadro FX 2700M graphics card. It had a previous version of Ubuntu on it, but nobody had any passwords for it, so I wiped the drive and did a fresh install of 11.10 from scratch. I didn't do any updates during installation, preferring to do them after boot. Updates ran fine and the system works... except Unity is in 2D mode.

    System Settings - Additional Drivers shows that Nvidia-Current is active but not in use. System Settings - System Info shows Graphics Driver: Unknown, Experience: Standard. Nvidia X Server Settings is installed and working; rewriting the xorg.conf did nothing.

        /usr/lib/nux/unity_support_test -p
        OpenGL vendor string: NVIDIA Corporation
        OpenGL renderer string: Quadro FX 2700M/PCI/SSE2
        OpenGL version string: 3.3.0 NVIDIA 285.05.09
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: yes

    One suggestion was to do a sudo apt-get --purge remove nvidia*, and that resulted in a scrambled screen on boot and a non-bootable installation. Pressing the Delete key on boot allowed me to access the recovery console and do a sudo apt-get install nvidia-current, which brought me back to a working, bootable system. Another suggestion was to edit /etc/default/grub and change the line reading GRUB_CMDLINE_LINUX_DEFAULT="quiet splash" to read GRUB_CMDLINE_LINUX_DEFAULT="quiet splash vmalloc=192MB", thus allocating more video RAM. I did that, followed by a sudo update-grub and a reboot. No change. Created a brand new standard user and logged on with that account; no change.

    Read the article

  • How to make Unity 3D work with Bumblebee using the Intel chipset

    - by EboMike
    I have a Sony VAIO S laptop with the dreaded Optimus and finally managed to get Bumblebee to work fully on Ubuntu 12.04, so that I can utilize both the hardware acceleration of the Intel chipset as well as the Nvidia one via optirun and/or bumble-app-settings. However, the desktop effects don't work. But they should; I vaguely remember that they worked for a while before I had Bumblebee installed. This is what I get with the support test:

        :~$ /usr/lib/nux/unity_support_test -p
        Xlib: extension "NV-GLX" missing on display ":0".
        OpenGL vendor string: Tungsten Graphics, Inc
        OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Mobile
        OpenGL version string: 1.4 (2.1 Mesa 8.0.2)
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: no
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: no

    First of all, I kind of doubt that the chipset doesn't support VBOs (essentially a standard feature in GL). Neither Xorg.0.log nor Xorg.8.log shows any particular errors.

    As for the Nvidia drivers: in order to get them to work, I had to install the 304.22 drivers (older ones wouldn't work). They clobbered libglx.so, so I reinstated the xserver-xorg-core libglx.so in its original place, moved Nvidia's libglx.so to an nvidia-specific folder, and specified that folder in the bumblebee.config. That seems to work and shouldn't cause the problem I see here.

    For fun, I tried to use the Nvidia chipset for Unity, but that didn't fly either:

        ~$ optirun /usr/lib/nux/unity_support_test -p
        OpenGL vendor string: NVIDIA Corporation
        OpenGL renderer string: GeForce GT 640M LE/PCIe/SSE2
        OpenGL version string: 4.2.0 NVIDIA 304.22
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: no
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: no

    Read the article

  • Intel 5100 AGN disconnects and then disables my Verizon FiOS Actiontec router

    - by Anthony
    I am a new user (my first trial) of Ubuntu and booted up my Windows Vista laptop with the Ubuntu CD. I was able to connect wirelessly to my router and stay connected for about a minute. After that it would disconnect and my router would be disabled: none of my other wireless devices would see the signal any longer, as if it had disappeared. I would have to cycle the router's power toggle off/on to get it to come back on and put out a signal again. This happened repeatedly. I experienced no other problems trying the software (i.e., I accessed my files without issue). I did not attempt to connect with an Ethernet cable. Here are the specs on my system:

    - Laptop: HP Pavilion dv5 Notebook PC
    - System type: 64-bit operating system
    - CPU: Intel Core 2 Duo P8600 2.4GHz
    - RAM: 4 GB
    - Network card: Intel WiFi Link 5100 AGN

    I read elsewhere in this forum that 64-bit systems may not be compatible with Ubuntu. Can anyone help me with this? I'd really like to be able to use this op system.

    Read the article

  • Thinkpad W510 with default graphics drivers shows weird brightness issues

    - by Chantz
    Hey guys, I am currently running 10.10 (32-bit) on a new ThinkPad W510 with an nVidia Quadro FX 880M graphics card, using the default graphics drivers that installed with Ubuntu. My problem is that while I am logging in, the screen acts normally as far as brightness is concerned: I can increase/decrease brightness with the Fn keys. But a few seconds after I log in, the screen goes pitch dark. Hitting Fn+Home flickers the screen to all the way bright, then all the way dark. This behavior continues until I reach maximum brightness, in which case the screen stays all the way bright for a few more seconds and then again goes dark if there is no activity, and the cycle continues. Have you guys faced any of these issues? If so, any pointers on how to resolve it? I am not alone; on the Ubuntu forum I saw another person having the same issue (link) but no solution. Please help!

    UPDATE: I followed the instructions that htorque mentions in his answer and it worked.

    Read the article

  • Nvidia GeForce GT-520M-CN on Intel DH61WW, Ubuntu 12.04

    - by j goseeped
    Hi people, I hope you can help a little bit; I appreciate your time. I have this desktop: i7 2600, 8 GB DDR3 RAM, Intel DH61WW board, Nvidia GeForce GT 520-CN 2 GB DDR3. I just installed Ubuntu 12.04 64-bit (kernel 3.2.0-23-generic), and I want to get the video card working and set up two Samsung 22" LED monitors.

    1) I downloaded and installed Nvidia driver 295.59 (and also tried 302.17): apt-get update and upgrade, apt-get install build-essential linux-headers-$(uname -r), apt-get remove --purge nvidia*, apt-get remove --purge xserver-xorg-video-nouveau, then added to /etc/modprobe.d/blacklist.conf:

        blacklist vga16fb
        blacklist nouveau
        blacklist rivafb
        blacklist nvidiafb
        blacklist rivatv

    followed by sh NVIDIA.run, sudo service lightdm start, reboot, nvidia-xconfig.

    2) After reboot I get 800x600, and nvidia-settings says: "You do not appear to be using the NVIDIA X driver. Please edit your X configuration file (just run nvidia-xconfig as root), and restart the X server."

    3) I changed xorg.conf a little to set up a resolution that works properly.

    4) I don't get any image on the monitor, and I don't have any options in Nvidia X Server Settings.

        lspci | grep VGA
        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: NVIDIA Corporation GF119 [GeForce GT 520] (rev a1)

        egrep -i 'glx|nvidia' /var/log/Xorg.0.log
        [ 12.005] (II) LoadModule: "glx"
        [ 12.005] (II) Loading /usr/lib/xorg/modules/extensions/libglx.so
        [ 12.575] (II) Module glx: vendor="NVIDIA Corporation"
        [ 12.585] (II) NVIDIA GLX Module 302.17 Tue Jun 12 16:22:45 PDT 2012
        [ 12.585] (II) Loading extension GLX
        [ 13.037] (EE) Failed to initialize GLX extension (Compatible NVIDIA X driver not found)
        [ 13.044] (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=3 (/dev/input/event10)
        [ 13.044] (II) config/udev: Adding input device HDA NVidia HDMI/DP,pcm=7 (/dev/input/event9)

        glxinfo | grep direct
        Xlib: extension "GLX" missing on display ":0.0".
        Error: couldn't find RGB GLX visual or fbconfig

    Sorry, my English is not very good. Thanks, guys.

    Read the article
