Search Results

Search found 8172 results on 327 pages for 'vector graphics'.

  • C++ int vector to c#

    - by Stefan Koenen
    I'm doing a C# project and I want to call next_permutation from the algorithm library in C++. I found a way to call C++ functions in C#, but I don't know how to get vectors from C++ and use them in C# (because next_permutation requires an int vector...). This is what I'm trying at the moment: extern void NextPermutation(vector<int>& permutation) { next_permutation (permutation.begin(),permutation.end()); } [DllImport("PEDLL.dll", CallingConvention = CallingConvention.Cdecl)] private static extern void NextPermutation(IntPtr test);
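
    A common way around this (a sketch only: the exported name, the int*/length signature, the int return value and the MSVC-style export macro are assumptions, not part of the original post) is to avoid passing std::vector across the P/Invoke boundary altogether and instead export a plain C function that wraps std::next_permutation over a pointer range:

        // C++ side: C-style export wrapping std::next_permutation.
        // Returning int rather than bool sidesteps bool-marshalling surprises.
        #include <algorithm>

        extern "C" __declspec(dllexport)
        int NextPermutation(int* data, int length)
        {
            // std::next_permutation accepts any random-access range, so a raw
            // pointer range is enough; returns 1 while more permutations exist.
            return std::next_permutation(data, data + length) ? 1 : 0;
        }

    On the C# side the import could then be declared against an int[] parameter, which the marshaller pins and passes as a pointer, e.g. private static extern int NextPermutation(int[] data, int length); with the same DllImport attribute as in the excerpt.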

  • Creating a series of vectors from a vector

    - by bluetongue
    I have a simple two-vector dataframe (length=30) that looks something like this: > mDF Param1 w.IL.L 1 AuZgFw 0.5 2 AuZfFw 2 3 AuZgVw 74.3 4 AuZfVw 20.52 5 AuTgIL 80.9 6 AuTfIL 193.3 7 AuCgFL 0.2 8 ... I'd like to use each of the rows to form 30 single-value numeric vectors, with the name of each vector taken from mDF$Param1, so that: > AuZgFw [1] 0.5 etc. I've tried melting and casting, but I suspect there may be an easier way? Thanks in advance, BT

  • AS3 Embed Image Instead of drawing graphics

    - by David
    I've got a custom AS3 scrollbar that I need to modify. The scroll thumb (the draggable part of the scrollbar) is currently just drawn as a rectangle, but I need it to look more like a real scrollbar. I'm not sure how to modify the code below to import/use a scroll thumb image: scrollThumb = new Sprite(); scrollThumb.graphics.lineStyle(); scrollThumb.graphics.beginFill(0x0066ff); scrollThumb.graphics.drawRect(0, 0, 1, 1); addChild(scrollThumb); I know that I would embed an image by doing something like this: [Embed(source="images/image1.png")] private static var Image1Class:Class; But then how do I set scrollThumb to the image? Thanks!

  • How can let Qt Graphics View Framework support custom layers

    - by jnblue
    Qt's Graphics View framework is very powerful, but I have not found a way to support custom layers. In Qt there is a QGraphicsScene::ItemLayer, but QGraphicsScene renders all items as if they were in this one layer. I want to manage the items with several layers, just like Illustrator and CorelDraw: only the items in the current layer should receive events, be selectable, or get keyboard focus, and the other (non-current) layers should not receive any scene events. The main reason for using layers is that I could catalogue a large number of items more clearly, and without needing to deliver events to every layer's items I think the framework would be more efficient. A final question: does QGraphicsView support rendering several stacked graphics scenes at the same time? If it does, I think the "custom layers" problem could be solved that way. Thanks very much!
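
    Graphics View has no built-in layer object, but one common workaround (a minimal sketch, not a Qt feature: the helper names are invented, and it does not address the stacked-scenes question) is to give each layer its own QGraphicsItemGroup. Parenting items to a group provides per-layer stacking via the group's z-value, and disabling the non-current groups stops their children from receiving scene events, selection or keyboard focus:

        // Sketch: layers as QGraphicsItemGroup parents.
        #include <QGraphicsScene>
        #include <QGraphicsItemGroup>
        #include <QList>

        QGraphicsItemGroup* addLayer(QGraphicsScene* scene, qreal z)
        {
            QGraphicsItemGroup* layer = new QGraphicsItemGroup;
            layer->setZValue(z);                  // stacking order between layers
            layer->setHandlesChildEvents(false);  // let child items receive events individually
            scene->addItem(layer);
            return layer;
        }

        void setCurrentLayer(const QList<QGraphicsItemGroup*>& layers,
                             QGraphicsItemGroup* current)
        {
            for (QGraphicsItemGroup* layer : layers)
                layer->setEnabled(layer == current);  // disabled groups and their children ignore events
        }

    Items are then assigned to a layer with item->setParentItem(layer), and hiding a whole layer is just layer->setVisible(false).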

  • Playing with Graphics in Flex

    - by Anoop
    Hi all, I was just going through some code used to draw a chart. This code is written in the updateDisplayList of the item renderer of a column chart. I am not good at the graphics part of Flex. Can anybody please explain to me what this code is doing? I can see the final output, but I am not sure how it is achieved. var rc:Rectangle = new Rectangle(0, 0, width , height ); var g:Graphics = graphics; g.clear(); g.moveTo(rc.left,rc.top); g.beginFill(fill); g.lineTo(rc.right,rc.top); g.lineTo(rc.right,rc.bottom); g.lineTo(rc.left,rc.bottom); g.lineTo(rc.left,rc.top); g.endFill(); Regards, PK

  • as3 this.graphics calls do nothing

    - by zzz
    class A: [SWF(width='800',height='600',frameRate='24')] public class A extends MovieClip { private var b:B; public function A(){ super(); b = new B(); addChild(b); addEventListener(Event.ENTER_FRAME, update); } private function update(e:Event):void { b.draw(); } } class B: public class B extends MovieClip { public function draw():void { //! following code works well if put in constructor, but not here this.graphics.beginFill(0xff0000); this.graphics.drawCircle(200,200,50); } } The this.graphics calls do nothing in the draw method, but work fine inside B's constructor. What am I doing wrong?

  • Why is my second monitor not working?

    - by StampedeXV
    Since I got my new computer, I have had a very weird problem. Facts: New computer: Motherboard: ASRock Z77 Pro 3 Graphics card: Asus 1GB D5 X EN GTX560 DCII OC/2DI R CPU: Intel i5-3570 Windows 7 64bit 500W beQuiet special edition (92% efficiency) 8GB 1333MHz DDR3 Corsair RAM (CL9) Scythe Mugen 2 2 magnetic HDDs + 1 SSD 1 DVD-R Old computer: Motherboard: Asus P55 something Graphics card: Asus 1GB D5 X EN GTX560 DCII OC/2DI R CPU: Intel i7-870 Windows 7 64bit 550W Corsair 8GB 1333MHz DDR3 Corsair RAM (CL9) Scythe Mugen 3 2 magnetic HDDs + 1 SSD 1 DVD-R On the old computer it worked fine with two monitors. Moving to the new one (I kept the same graphics card), it only works with one monitor - no matter which one. If I connect both, only one is available. There is no reaction at startup (where normally, at least if I remember correctly, the monitor briefly goes from "standby" to "on"). Windows does not recognize a second monitor in the Device Manager. I have the latest drivers for the motherboard and graphics card. I have the latest BIOS. I am out of ideas. Edit: completed computer setup

  • If I run two monitors from two different graphic cards, can I still have Twinview?

    - by rumtscho
    I am planning to get a second 2560x1440 monitor for home. The trouble is, I only have 1xDVI, 1xVGA on my graphics card (a 250 GT). I don't want to buy a new graphics card until the prices for the 500 series have stabilized, so probably not before summer (or will it happen earlier? I don't remember how it went for other series, and I couldn't find long-term price history for video cards). The solution I had in mind is to take the 7600 GS from my old PC, which also has 1xDVI, 1xVGA, and run each monitor on a separate card. I have never done that, and I was wondering 1. whether I will be able to run the monitors in TwinView then, or will I be stuck with separate X sessions, and 2. whether there are other disadvantages compared to a single-head graphics card. (I am using the proprietary driver because I need Compiz.) As an aside, how do I find out whether the DVI port on the old graphics card is dual-link?

  • Unity desktop "smears" (doesn't refresh) and shows no wallpaper

    - by Cedric Reichenbach
    Since a couple of days now, my unity desktop background smears everything, just like what old Windows versions were famous for: Of course, I tried rebooting a couple of times. Also, I switched graphics driver and I tried to change wallpaper and theme, but none of them solved the problem. What could be causing that problem, and where can I search on for its source? Infomation update I'm using Ubuntu 13.04 (not updated to 13.10 yet). The following command were all run from cinnamon (on the same Ubuntu installation). sudo lsb_release -a: No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 13.04 Release: 13.04 Codename: raring sudo uname -a: Linux cedric-MacBookPro 3.8.0-32-generic #47-Ubuntu SMP Tue Oct 1 22:35:23 UTC 2013 x86_64 x86_64 x86_64 GNU/Linux sudo dpkg -l | grep xserver-xorg-video: ii xserver-xorg-video-all 1:7.7+1ubuntu4 amd64 X.Org X server -- output driver metapackage ii xserver-xorg-video-ati 1:7.1.0-0ubuntu2 amd64 X.Org X server -- AMD/ATI display driver wrapper ii xserver-xorg-video-cirrus 1:1.5.2-0ubuntu1 amd64 X.Org X server -- Cirrus display driver ii xserver-xorg-video-fbdev 1:0.4.3-0ubuntu1 amd64 X.Org X server -- fbdev display driver ii xserver-xorg-video-intel 2:2.21.6-0ubuntu4.3 amd64 X.Org X server -- Intel i8xx, i9xx display driver ii xserver-xorg-video-mach64 6.9.3-0ubuntu1 amd64 X.Org X server -- ATI Mach64 display driver ii xserver-xorg-video-mga 1:1.6.2-0ubuntu1 amd64 X.Org X server -- MGA display driver ii xserver-xorg-video-modesetting 0.7.0-0ubuntu2 amd64 X.Org X server -- Generic modesetting driver ii xserver-xorg-video-neomagic 1:1.2.7-0ubuntu1 amd64 X.Org X server -- Neomagic display driver ii xserver-xorg-video-nouveau 1:1.0.7-0ubuntu1 amd64 X.Org X server -- Nouveau display driver ii xserver-xorg-video-openchrome 1:0.3.1-0ubuntu1.13.04.1 amd64 X.Org X server -- VIA display driver ii xserver-xorg-video-qxl 0.1.0-0ubuntu3 amd64 X.Org X server -- QXL display driver ii xserver-xorg-video-r128 6.9.1-0ubuntu1 amd64 X.Org X server -- ATI r128 display driver ii xserver-xorg-video-radeon 1:7.1.0-0ubuntu2 amd64 X.Org X server -- AMD/ATI Radeon display driver ii xserver-xorg-video-s3 1:0.6.5-0ubuntu3 amd64 X.Org X server -- legacy S3 display driver ii xserver-xorg-video-savage 1:2.3.6-0ubuntu1 amd64 X.Org X server -- Savage display driver ii xserver-xorg-video-siliconmotion 1:1.7.7-0ubuntu1 amd64 X.Org X server -- SiliconMotion display driver ii xserver-xorg-video-sis 1:0.10.7-0ubuntu1 amd64 X.Org X server -- SiS display driver ii xserver-xorg-video-sisusb 1:0.9.6-0ubuntu1 amd64 X.Org X server -- SiS USB display driver ii xserver-xorg-video-tdfx 1:1.4.5-0ubuntu1 amd64 X.Org X server -- tdfx display driver ii xserver-xorg-video-trident 1:1.3.6-0ubuntu2 amd64 X.Org X server -- Trident display driver ii xserver-xorg-video-vesa 1:2.3.2-0ubuntu1 amd64 X.Org X server -- VESA display driver ii xserver-xorg-video-vmware 1:12.0.2+git.e5ac80d8-0ubuntu1 amd64 X.Org X server -- VMware display driver sudo lspci | grep VGA: 01:00.0 VGA compatible controller: NVIDIA Corporation GT216M [GeForce GT 330M] (rev a2)

  • GPU hung when switching graphic card

    - by Lie Ryan
    I have a laptop (Dell Inspiron N4110) with a switchable graphic. $ lspci | grep VGA 00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09) 01:00.0 VGA compatible controller: ATI Technologies Inc NI Whistler [AMD Radeon HD 6600M Series] (rev ff) Normally, my laptop starts with both graphic cards enabled, which caused the laptop to turn very hot and the fan to become very noisy. I have been using a small script to disable the Radeon card. For some time, I'm quite happy with this arrangement. However, I have been having some issues with the Intel card (IGD), the Intel card often randomly hang when running OpenGL apps; and so I want to give the Radeon card (DIS) another chance. I have never been able to switch to the Radeon card, but recently, I found out that if I do a "delayed switching" (DDIS): # echo "DDIS" > /sys/kernel/debug/vgaswitcheroo/switch root@lieryan-dell-ubuntu:/sys/kernel/debug/vgaswitcheroo# cat switch 0:IGD:+:Pwr:0000:00:02.0 1:DIS: :Pwr:0000:01:00.0 then I logoff (i.e. to restart X), the screen switch to pseudo-tty and then it stuck there freezing. At this situation, mouse and keyboard stops working so I can't switch to another ptty. I tried ssh-ing from another computer to salvage logs (dmesg at that point) and whatnot; I found out that when freezing, the active graphic card is the AMD card: -- this is from ssh -- # cat switch 0:IGD: :Off:0000:00:02.0 1:DIS:+:Pwr:0000:01:00.0 but the GPU is apparently hung, looking at dmesg gives: ... [ 1411.649974] vga_switcheroo: client 0 refused switch [ 1411.649985] vga_switcheroo: setting delayed switch to client 1 [ 1423.911759] vga_switcheroo: processing delayed switch to 1 [ 1424.006564] fbcon: Remapping primary device, fb1, to tty 1-63 [ 1424.006799] i915: switched off [ 1424.840351] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id [ 1425.718088] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id [ 1426.622377] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id [ 1427.355683] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id [ 1428.193549] [drm:drm_mode_getfb] *ERROR* invalid framebuffer id ... the invalid framebuffer id error is repeated for many times over ... I were able to successfully recover by switching back to the Intel card and restarting X from ssh; indicating that only the Radeon card has problems switching. System info: $ uname -a Linux lieryan-dell-ubuntu 3.0.0-14-generic #23-Ubuntu SMP Mon Nov 21 20:28:43 UTC 2011 x86_64 x86_64 x86_64 GNU/Linux $ lsb_release -a No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 11.10 Release: 11.10 Codename: oneiric The laptop also do not have the option to set graphic card at BIOS and the proprietary driver, fglrx, also have never worked; when I installed it through jockey ("Additional Drivers"), glxinfo showed that it still being rendered by Mesa, the /sys/kernel/debug/vgaswitcheroo directory has gone missing, and the driver crashes with a traceback if I use xorg.conf to tell X to use fglrx. Anyone had any idea if it is possible to use this AMD card either with the radeon or the fglrx driver? logs: dmesg

  • Cannot get 3D OpenGL support in Vmware guests, how can I fix this?

    - by jjapol
    I have been working at this problem for 2 days now. I cannot for the life of me enable 3D support in VMWare 9 guests. My specifications are: Hardware: Dell Latitude E5520 laptop. Processor: Intel i7-2620M CPU @ 2.70GHz × 4. Memory: 8GB. Video: Intel Sandybridge Mobile x86/MMX/SSE2 OS: Ubuntu 12.04.1 LTS, 32 bit. Vmware Workstation: 9.0.1 build-894247 Glxgears functions fine. Frame rate is ~60fps. Vmware guest: Windows 7 Starting the Windows 7 guest in VMware throws the following errors: No 3D support is available from the host. and Hardware graphics acceleration is not available. I've read through this VMware forum thread, but again the hardware in the post is different (nVidia). I've followed the instructions at this Ask Ubuntu post as closely as possible as the question is nearly the same as mine although my hardware is different. Answer 1 regarding setting mks.gl.allowBlacklistedDrivers = TRUE; in my vmx configuration file causes the VM to crash when it starts. The second answer I followed as closely as possible. I uninstalled VMware, Did sudo apt-get install build-essential linux-headers-$(uname -r) at a terminal, Added the PPA https://launchpad.net/~glasen/+archive/intel-driver, Then at a terminal did sudo apt-get update && sudo apt-get upgrade -y I reinstalled VMware and have the same results: no 3D in guests. I'm getting the feeling that something is awry with the Sandy Bridge driver, but I can't seem to come up with any solutions. Has anyone out there run across this problem also? By the way, the operation of the likes of Solidworks and AutoCad within a Windpws 7 guest does appear to be improved in VMware 9 vs VMware 8 in spite of the fact that 3D support is lacking in the Windows 7 guest. I'd also add that my glxinfo file was nearly identical to the glxinfo file posted at askubuntu.com/questions/181829/…. I had a total of seven minor differences per a comparison using Meld. –

  • vga_switcheroo and Intel HD 3000 on Ubuntu 12.04

    - by Ikalou
    I'm trying to get vga_switcheroo to enable my integrated Intel HD 3000 instead of my ATI card. My problem is that there is no vgaswitcheroo directory in /sys/kernel/debug/ on my system. > grep -i switcheroo /boot/config-3.2.0-26-generic CONFIG_VGA_SWITCHEROO=y And yet: > sudo ls /sys/kernel/debug/ acpi bdi bluetooth dri extfrag gpio ieee80211 kprobes mce mmc0 regmap regulator sched_features suspend_stats tracing usb wakeup_sources x86 I am NOT using the fglrx driver. Here is the output of lspci; glxinfo | grep renderer: 00:00.0 Host bridge: Intel Corporation 2nd Generation Core Processor Family DRAM Controller (rev 09) 00:01.0 PCI bridge: Intel Corporation Xeon E3-1200/2nd Generation Core Processor Family PCI Express Root Port (rev 09) 00:16.0 Communication controller: Intel Corporation 6 Series/C200 Series Chipset Family MEI Controller #1 (rev 04) 00:19.0 Ethernet controller: Intel Corporation 82579LM Gigabit Network Connection (rev 04) 00:1a.0 USB controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #2 (rev 04) 00:1b.0 Audio device: Intel Corporation 6 Series/C200 Series Chipset Family High Definition Audio Controller (rev 04) 00:1c.0 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 1 (rev b4) 00:1c.1 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 2 (rev b4) 00:1c.2 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 3 (rev b4) 00:1c.3 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 4 (rev b4) 00:1c.7 PCI bridge: Intel Corporation 6 Series/C200 Series Chipset Family PCI Express Root Port 8 (rev b4) 00:1d.0 USB controller: Intel Corporation 6 Series/C200 Series Chipset Family USB Enhanced Host Controller #1 (rev 04) 00:1f.0 ISA bridge: Intel Corporation QM67 Express Chipset Family LPC Controller (rev 04) 00:1f.2 SATA controller: Intel Corporation 6 Series/C200 Series Chipset Family 6 port SATA AHCI Controller (rev 04) 01:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Seymour [Radeon HD 6400M Series] 01:00.1 Audio device: Advanced Micro Devices [AMD] nee ATI Caicos HDMI Audio [Radeon HD 6400 Series] 24:00.0 FireWire (IEEE 1394): JMicron Technology Corp. IEEE 1394 Host Controller (rev 30) 24:00.1 System peripheral: JMicron Technology Corp. SD/MMC Host Controller (rev 30) 24:00.2 SD Host controller: JMicron Technology Corp. Standard SD Host Controller (rev 30) 25:00.0 Network controller: Intel Corporation Centrino Advanced-N 6205 (rev 34) 26:00.0 USB controller: NEC Corporation uPD720200 USB 3.0 Host Controller (rev 04) OpenGL renderer string: Gallium 0.4 on AMD CAICOS Both xserver-xorg-video-intel and xserver-xorg-video-radeon packages are installed. I know there are tons of posts about hybrid-graphics already but I couldn't quite find a solution to my problem. Does anyone know why is /sys/kernel/debug/vgaswitcheroo not showing?

  • Insert SVG directly into Microsoft Publisher without converting file format

    - by nhinkle
    How can I insert an SVG image into a Microsoft Publisher 2010 document as a vector image without having to first convert it to a bitmap format like PNG? Copying and pasting an SVG file into a Publisher document does not work. I am aware that one can convert an SVG to EPS, and insert that, since Publisher accepts EPS files. The problem is that it is time-consuming to convert, and often the colors come out wrong. If this is the only way to get vector graphics into Publisher, then is there a one-step method to convert an SVG to EPS and paste it into Publisher in one fell swoop?

  • vector quality of svg and pdf

    - by Kasper
    I'm converting PDF files to SVG, as it is easier to use SVG files on web pages. I first thought the quality of SVG must be similar to PDF, as they are both vector graphics. However, now that I have looked at it more closely, it seems that PDF is a bit superior: (https://dl.dropboxusercontent.com/u/58922976/Photos/1.png) I wonder if I could change this in some way. Is this because PDF vectors are just better quality? Or is this because Chrome renders SVG in lower quality than Adobe Reader renders PDF? Is this a setting in the SVG file that I could change? Here is the PDF file: https://dl.dropboxusercontent.com/u/58922976/syllabusLinAlg2012.59.pdf And here is the SVG file: (https://dl.dropboxusercontent.com/u/58922976/syllabusLinAlg2012.59.svg) I've made this SVG file in Illustrator, and only Chrome is able to use the embedded SVG fonts, so Firefox and Internet Explorer won't give the expected result.

  • PHP Zend Hash Vulnerability Exploitation Vector [closed]

    - by Resurrected Laplacian
    Possible Duplicate: CVE-2007-5416 PHP Zend Hash Vulnerability Exploitation Vector (Drupal) According to exploit-db, http://www.exploit-db.com/exploits/4510/, it says the following: Example: http://www.example.com/drupal/?_menu[callbacks][1][callback]=drupal_eval&_menu[items][][type]=-1&-312030023=1&q=1/ What are "[callbacks]", "[1]" and all of these parts? What should I put into them? Can anyone present a realistic example? I wasn't asking for a real website; I was asking for a possible example! So what would the address look like - what should I put into these parts, as the question says?

  • How to find vector for the quaternion from X Y Z rotations

    - by can poyrazoglu
    I am creating a very simple project on OpenGL and I'm stuck with rotations. I am trying to rotate an object indepentdently in all 3 axes: X, Y, and Z. I've had sleepless nights due to the "gimbal lock" problem after rotating about one axis. I've then learned that quaternions would solve my problem. I've researched about quaternions and implementd it, but I havent't been able to convert my rotations to quaternions. For example, if I want to rotate around Z axis 90 degrees, I just create the {0,0,1} vector for my quaternion and rotate it around that axis 90 degrees using the code here: http://iphonedevelopment.blogspot.com/2009/06/opengl-es-from-ground-up-part-7_04.html (the most complicated matrix towards the bottom) That's ok for one vector, but, say, I first want to rotate 90 degrees around Z, then 90 degrees around X (just as an example). What vector do I need to pass in? How do I calculate that vector. I am not good with matrices and trigonometry (I know the basics and the general rules, but I'm just not a whiz) but I need to get this done. There are LOTS of tutorials about quaternions, but I seem to understand none (or they don't answer my question). I just need to learn to construct the vector for rotations around more than one axis combined. UPDATE: I've found this nice page about quaternions and decided to implement them this way: http://www.cprogramming.com/tutorial/3d/quaternions.html Here is my code for quaternion multiplication: void cube::quatmul(float* q1, float* q2, float* resultRef){ float w = q1[0]*q2[0] - q1[1]*q2[1] - q1[2]*q2[2] - q1[3]*q2[3]; float x = q1[0]*q2[1] + q1[1]*q2[0] + q1[2]*q2[3] - q1[3]*q2[2]; float y = q1[0]*q2[2] - q1[1]*q2[3] + q1[2]*q2[0] + q1[3]*q2[1]; float z = q1[0]*q2[3] + q1[1]*q2[2] - q1[2]*q2[1] + q1[3]*q2[0]; resultRef[0] = w; resultRef[1] = x; resultRef[2] = y; resultRef[3] = z; } Here is my code for applying a quaternion to my modelview matrix (I have a tmodelview variable that is my target modelview matrix): void cube::applyquat(){ float& x = quaternion[1]; float& y = quaternion[2]; float& z = quaternion[3]; float& w = quaternion[0]; float magnitude = sqrtf(w * w + x * x + y * y + z * z); if(magnitude == 0){ x = 1; w = y = z = 0; }else if(magnitude != 1){ x /= magnitude; y /= magnitude; z /= magnitude; w /= magnitude; } tmodelview[0] = 1 - (2 * y * y) - (2 * z * z); tmodelview[1] = 2 * x * y + 2 * w * z; tmodelview[2] = 2 * x * z - 2 * w * y; tmodelview[3] = 0; tmodelview[4] = 2 * x * y - 2 * w * z; tmodelview[5] = 1 - (2 * x * x) - (2 * z * z); tmodelview[6] = 2 * y * z - 2 * w * x; tmodelview[7] = 0; tmodelview[8] = 2 * x * z + 2 * w * y; tmodelview[9] = 2 * y * z + 2 * w * x; tmodelview[10] = 1 - (2 * x * x) - (2 * y * y); tmodelview[11] = 0; glMatrixMode(GL_MODELVIEW); glPushMatrix(); glLoadMatrixf(tmodelview); glMultMatrixf(modelview); glGetFloatv(GL_MODELVIEW_MATRIX, tmodelview); glPopMatrix(); } And my code for rotation (that I call externally), where quaternion is a class variable of the cube: void cube::rotatex(int angle){ float quat[4]; float ang = angle * PI / 180.0; quat[0] = cosf(ang / 2); quat[1] = sinf(ang/2); quat[2] = 0; quat[3] = 0; quatmul(quat, quaternion, quaternion); applyquat(); } void cube::rotatey(int angle){ float quat[4]; float ang = angle * PI / 180.0; quat[0] = cosf(ang / 2); quat[1] = 0; quat[2] = sinf(ang/2); quat[3] = 0; quatmul(quat, quaternion, quaternion); applyquat(); } void cube::rotatez(int angle){ float quat[4]; float ang = angle * PI / 180.0; quat[0] = cosf(ang / 2); quat[1] = 0; quat[2] = 0; 
quat[3] = sinf(ang/2); quatmul(quat, quaternion, quaternion); applyquat(); } I call, say rotatex, for 10-11 times for rotating only 1 degree, but my cube gets rotated almost 90 degrees after 10-11 times of 1 degree, which doesn't make sense. Also, after calling rotation functions in different axes, My cube gets skewed, gets 2 dimensional, and disappears (a column in modelview matrix becomes all zeros) irreversibly, which obviously shouldn't be happening with a correct implementation of the quaternions.
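
    For reference, a minimal sketch (invented names, not taken from the linked tutorials) of the usual conversion from a single accumulated, normalized orientation quaternion {w, x, y, z} to a column-major matrix for glLoadMatrixf, with the standard sign pattern:

        // q = {w, x, y, z}, assumed normalized; m is column-major, as OpenGL expects.
        void quatToColumnMajorMatrix(const float q[4], float m[16])
        {
            const float w = q[0], x = q[1], y = q[2], z = q[3];

            m[0] = 1 - 2*y*y - 2*z*z;  m[4] = 2*x*y - 2*w*z;      m[8]  = 2*x*z + 2*w*y;      m[12] = 0;
            m[1] = 2*x*y + 2*w*z;      m[5] = 1 - 2*x*x - 2*z*z;  m[9]  = 2*y*z - 2*w*x;      m[13] = 0;
            m[2] = 2*x*z - 2*w*y;      m[6] = 2*y*z + 2*w*x;      m[10] = 1 - 2*x*x - 2*y*y;  m[14] = 0;
            m[3] = 0;                  m[7] = 0;                  m[11] = 0;                  m[15] = 1;
        }

    The usual per-frame flow is: multiply the incremental axis-angle quaternion into the stored one (as quatmul above does), normalize, then rebuild the rotation matrix from that single quaternion and load it. One thing worth double-checking in the excerpt's applyquat is that it multiplies the newly built matrix into the previous modelview and stores the product back, so each call appears to reapply all earlier rotations on top of themselves, which would make small per-degree steps compound far too quickly.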

  • Nothing drawing on screen OpenGL with GLSL

    - by codemonkey
    I hate to be asking this kind of question here, but I am at a complete loss as to what is going wrong, so please bear with me. I am trying to render a single cube (voxel) in the center of the screen, through OpenGL with GLSL on Mac I begin by setting up everything using glut glutInit(&argc, argv); glutInitDisplayMode(GLUT_RGBA|GLUT_ALPHA|GLUT_DOUBLE|GLUT_DEPTH); glutInitWindowSize(DEFAULT_WINDOW_WIDTH, DEFAULT_WINDOW_HEIGHT); glutCreateWindow("Cubez-OSX"); glutReshapeFunc(reshape); glutDisplayFunc(render); glutIdleFunc(idle); _electricSheepEngine=new ElectricSheepEngine(DEFAULT_WINDOW_WIDTH, DEFAULT_WINDOW_HEIGHT); _electricSheepEngine->initWorld(); glutMainLoop(); Then inside the engine init camera & projection matrices: cameraPosition=glm::vec3(2,2,2); cameraTarget=glm::vec3(0,0,0); cameraUp=glm::vec3(0,0,1); glm::vec3 cameraDirection=glm::normalize(cameraPosition-cameraTarget); cameraRight=glm::cross(cameraDirection, cameraUp); cameraRight.z=0; view=glm::lookAt(cameraPosition, cameraTarget, cameraUp); lensAngle=45.0f; aspectRatio=1.0*(windowWidth/windowHeight); nearClippingPlane=0.1f; farClippingPlane=100.0f; projection=glm::perspective(lensAngle, aspectRatio, nearClippingPlane, farClippingPlane); then init shaders and check compilation and bound attributes & uniforms to be correctly bound (my previous question) These are my two shaders, vertex: #version 120 attribute vec3 position; attribute vec3 inColor; uniform mat4 mvp; varying vec3 fragColor; void main(void){ fragColor = inColor; gl_Position = mvp * vec4(position, 1.0); } and fragment: #version 120 varying vec3 fragColor; void main(void) { gl_FragColor = vec4(fragColor,1.0); } init the cube: setPosition(glm::vec3(0,0,0)); struct voxelData data[]={ //front face {{-1.0, -1.0, 1.0}, {0.0, 0.0, 1.0}}, {{ 1.0, -1.0, 1.0}, {0.0, 1.0, 1.0}}, {{ 1.0, 1.0, 1.0}, {0.0, 0.0, 1.0}}, {{-1.0, 1.0, 1.0}, {0.0, 1.0, 1.0}}, //back face {{-1.0, -1.0, -1.0}, {0.0, 0.0, 1.0}}, {{ 1.0, -1.0, -1.0}, {0.0, 1.0, 1.0}}, {{ 1.0, 1.0, -1.0}, {0.0, 0.0, 1.0}}, {{-1.0, 1.0, -1.0}, {0.0, 1.0, 1.0}} }; glGenBuffers(1, &modelVerticesBufferObject); glBindBuffer(GL_ARRAY_BUFFER, modelVerticesBufferObject); glBufferData(GL_ARRAY_BUFFER, sizeof(data), data, GL_STATIC_DRAW); glBindBuffer(GL_ARRAY_BUFFER, 0); const GLubyte indices[] = { // Front 0, 1, 2, 2, 3, 0, // Back 4, 6, 5, 4, 7, 6, // Left 2, 7, 3, 7, 6, 2, // Right 0, 4, 1, 4, 1, 5, // Top 6, 2, 1, 1, 6, 5, // Bottom 0, 3, 7, 0, 7, 4 }; glGenBuffers(1, &modelFacesBufferObject); glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, modelFacesBufferObject); glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW); glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0); and then the render call: glClearColor(0.52, 0.8, 0.97, 1.0); glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); glEnable(GL_DEPTH_TEST); //use the shader glUseProgram(shaderProgram); //enable attributes in program glEnableVertexAttribArray(shaderAttribute_position); glEnableVertexAttribArray(shaderAttribute_color); //model matrix using model position vector glm::mat4 mvp=projection*view*voxel->getModelMatrix(); glUniformMatrix4fv(shaderAttribute_mvp, 1, GL_FALSE, glm::value_ptr(mvp)); glBindBuffer(GL_ARRAY_BUFFER, voxel->modelVerticesBufferObject); glVertexAttribPointer(shaderAttribute_position, // attribute 3, // number of elements per vertex, here (x,y) GL_FLOAT, // the type of each element GL_FALSE, // take our values as-is sizeof(struct voxelData), // coord every (sizeof) elements 0 // offset of first element ); glBindBuffer(GL_ARRAY_BUFFER, 
voxel->modelVerticesBufferObject); glVertexAttribPointer(shaderAttribute_color, // attribute 3, // number of colour elements per vertex, here (x,y) GL_FLOAT, // the type of each element GL_FALSE, // take our values as-is sizeof(struct voxelData), // coord every (sizeof) elements (GLvoid *)(offsetof(struct voxelData, color3D)) // offset of colour data ); //draw the model by going through its elements array glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, voxel->modelFacesBufferObject); int bufferSize; glGetBufferParameteriv(GL_ELEMENT_ARRAY_BUFFER, GL_BUFFER_SIZE, &bufferSize); glDrawElements(GL_TRIANGLES, bufferSize/sizeof(GLushort), GL_UNSIGNED_SHORT, 0); //close up the attribute in program, no more need glDisableVertexAttribArray(shaderAttribute_position); glDisableVertexAttribArray(shaderAttribute_color); but on screen all I get is the clear color :$ I generate my model matrix using: modelMatrix=glm::translate(glm::mat4(1.0), position); which in debug turns out to be for the position of (0,0,0): |1, 0, 0, 0| |0, 1, 0, 0| |0, 0, 1, 0| |0, 0, 0, 1| Sorry for such a question, I know it is annoying to look at someone's code, but I promise I have tried to debug around and figure it out as much as I can, and can't come to a solution Help a noob please? EDIT: Full source here, if anyone wants
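
    One inconsistency that is visible in the excerpt (not necessarily the only issue) is that the index array is declared as GLubyte while glDrawElements is told GL_UNSIGNED_SHORT and the count is computed with sizeof(GLushort). A sketch with the element type kept consistent, assuming the same buffer objects and GL context setup as above (only the first face is spelled out here):

        // Index array, count computation and glDrawElements type all agree on GLubyte.
        const GLubyte indices[] = { 0, 1, 2, 2, 3, 0 /* ...remaining faces as in the post... */ };

        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, modelFacesBufferObject);
        glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);

        // when drawing:
        GLint bufferSize = 0;
        glGetBufferParameteriv(GL_ELEMENT_ARRAY_BUFFER, GL_BUFFER_SIZE, &bufferSize);
        glDrawElements(GL_TRIANGLES,
                       bufferSize / sizeof(GLubyte),  // count of GLubyte indices
                       GL_UNSIGNED_BYTE,              // matches the array's element type
                       0);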

  • Can't remove JPanel from JFrame while adding new class into it

    - by A.K.
    Basically, I have my Frame class, which instantiates all the properties for the JFrame, and draws a JLabel with an image (my title screen). Then I made a separate JPanel with a start button on it, and made a mouse listener that will allow me to remove these objects while adding in a new Board() class (Which paints the main game). *Note: The JLabel is SEPARATE from the JPanel, but it still gets moved to the side by it. Problem: Whenever I click the button though, it only shows a little square of what I presume is my board class trying to run. Code below for the Frame Class: package OurPackage; //Made By A.K. 5/24/12 //Contains Frame. import java.awt.BorderLayout; import java.awt.Color; import java.awt.Container; import java.awt.Dimension; import java.awt.Graphics; import java.awt.Graphics2D; import java.awt.GridBagLayout; import java.awt.GridLayout; import java.awt.Image; import java.awt.Rectangle; import java.awt.event.ActionEvent; import java.awt.event.ActionListener; import java.awt.event.KeyEvent; import java.awt.event.MouseAdapter; import java.awt.event.MouseEvent; import java.awt.event.MouseListener; import javax.swing.*; import javax.swing.plaf.basic.BasicOptionPaneUI.ButtonActionListener; public class Frame implements MouseListener { public static boolean StartGame = false; ImageIcon img = new ImageIcon(getClass().getResource("/Images/ActionJackTitle.png")); ImageIcon StartImg = new ImageIcon(getClass().getResource("/Images/JackStart.png")); public Image Title; JLabel TitleL = new JLabel(img); public JPanel panel = new JPanel(); JButton StartB = new JButton(StartImg); JFrame frm = new JFrame("Action-Packed Jack"); public Frame() { TitleL.setPreferredSize(new Dimension(1200, 420)); frm.add(TitleL); frm.setLayout(new GridBagLayout()); frm.add(panel); panel.setSize(new Dimension(220, 45)); panel.setLayout(new GridBagLayout ()); panel.add(StartB); StartB.addMouseListener(this); StartB.setPreferredSize(new Dimension(220, 45)); frm.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE); frm.setSize(1200, 420); frm.setVisible(true); frm.setResizable(false); frm.setLocationRelativeTo(null); } public static void main(String[] args) { new Frame(); } public void mouseClicked(MouseEvent e) { StartB.setContentAreaFilled(false); panel.remove(StartB); frm.remove(panel); frm.remove(TitleL); //frm.setLayout(null); frm.add(new Board()); //Add Game "Tiles" Or Content. x = 1200 frm.validate(); System.out.println("Hit!"); } @Override public void mouseEntered(MouseEvent arg0) { // TODO Auto-generated method stub } @Override public void mouseExited(MouseEvent arg0) { // TODO Auto-generated method stub } @Override public void mousePressed(MouseEvent arg0) { // TODO Auto-generated method stub } @Override public void mouseReleased(MouseEvent arg0) { // TODO Auto-generated method stub } }

  • nvidia driver problems after upgrading to 3.2.0-26 on Ubuntu 12.04 64bit

    - by Lev Levitsky
    After installing latest updates I can't set screen resolution higher than 1024x768; every time after the boot I get a message Could not apply the stored configuration for the monitors (Note: removing ~/.config/monitors.xml stopped the message, but not the problem) I can boot with 3.2.0-25 and the graphics look normal. Here's what I have in /var/log/apt/term.log (excerpt): Setting up linux-image-3.2.0-26-generic (3.2.0-26.41) ... Running depmod. update-initramfs: deferring update (hook will be called later) Examining /etc/kernel/postinst.d. run-parts: executing /etc/kernel/postinst.d/dkms 3.2.0-26-generic /boot/vmlinuz-3.2.0-26-generic Error! Problems with depmod detected. Automatically uninstalling this module. DKMS: Install Failed (depmod problems). Module rolled back to built state. run-parts: executing /etc/kernel/postinst.d/initramfs-tools 3.2.0-26-generic /boot/vmlinuz-3.2.0-26-generic update-initramfs: Generating /boot/initrd.img-3.2.0-26-generic run-parts: executing /etc/kernel/postinst.d/pm-utils 3.2.0-26-generic /boot/vmlinuz-3.2.0-26-generic run-parts: executing /etc/kernel/postinst.d/update-notifier 3.2.0-26-generic /boot/vmlinuz-3.2.0-26-generic run-parts: executing /etc/kernel/postinst.d/zz-update-grub 3.2.0-26-generic /boot/vmlinuz-3.2.0-26-generic Generating grub.cfg ... Found linux image: /boot/vmlinuz-3.2.0-26-generic Found initrd image: /boot/initrd.img-3.2.0-26-generic Found linux image: /boot/vmlinuz-3.2.0-25-generic Found initrd image: /boot/initrd.img-3.2.0-25-generic Found linux image: /boot/vmlinuz-3.2.0-24-generic Found initrd image: /boot/initrd.img-3.2.0-24-generic Found linux image: /boot/vmlinuz-3.2.0-23-generic Found initrd image: /boot/initrd.img-3.2.0-23-generic Found linux image: /boot/vmlinuz-3.0.0-17-generic Found initrd image: /boot/initrd.img-3.0.0-17-generic Found memtest86+ image: /boot/memtest86+.bin I went to "additional drivers" and saw some updates available there, but an attempt to install them failed, leaving the following in /var/log/jockey.log (long log, pasted here). 
The full log won't fit in the question, so I'm showing $ fgrep 'ERROR' /var/log/jockey.log 2012-06-30 17:29:57,897 WARNING: modinfo for module vmxnet failed: ERROR: modinfo: could not find module vmxnet 2012-06-30 17:29:57,937 WARNING: modinfo for module wl failed: ERROR: modinfo: could not find module wl 2012-06-30 17:29:58,072 WARNING: modinfo for module nvidia_96 failed: ERROR: modinfo: could not find module nvidia_96 2012-06-30 17:29:58,240 WARNING: modinfo for module nvidia_current failed: ERROR: modinfo: could not find module nvidia_current 2012-06-30 17:29:58,293 WARNING: modinfo for module nvidia_current_updates failed: ERROR: modinfo: could not find module nvidia_current_updates 2012-06-30 17:29:58,351 WARNING: modinfo for module nvidia_173_updates failed: ERROR: modinfo: could not find module nvidia_173_updates 2012-06-30 17:29:58,385 WARNING: modinfo for module nvidia_173 failed: ERROR: modinfo: could not find module nvidia_173 2012-06-30 17:29:58,420 WARNING: modinfo for module nvidia_96_updates failed: ERROR: modinfo: could not find module nvidia_96_updates 2012-06-30 17:29:58,455 WARNING: modinfo for module ath_pci failed: ERROR: modinfo: could not find module ath_pci 2012-06-30 17:29:58,478 WARNING: modinfo for module fglrx_updates failed: ERROR: modinfo: could not find module fglrx_updates 2012-06-30 17:29:58,531 WARNING: modinfo for module fglrx failed: ERROR: modinfo: could not find module fglrx 2012-06-30 17:29:58,588 WARNING: modinfo for module omapdrm_pvr failed: ERROR: modinfo: could not find module omapdrm_pvr 2012-06-30 17:29:59,537 WARNING: modinfo for module nvidia_current failed: ERROR: modinfo: could not find module nvidia_current 2012-06-30 17:29:59,613 WARNING: modinfo for module nvidia_173_updates failed: ERROR: modinfo: could not find module nvidia_173_updates 2012-06-30 17:29:59,686 WARNING: modinfo for module nvidia_173 failed: ERROR: modinfo: could not find module nvidia_173 2012-06-30 17:29:59,764 WARNING: modinfo for module nvidia_current_updates failed: ERROR: modinfo: could not find module nvidia_current_updates 2012-06-30 17:30:29,544 WARNING: modinfo for module nvidia_current_updates failed: ERROR: modinfo: could not find module nvidia_current_updates 2012-06-30 17:30:29,545 ERROR: XorgDriverHandler.enable(): package or module not installed, aborting I'm not sure if it's a bug, as the first log shows some errors. What can I try?

  • Cannot establish maximum resolution on ASUS PB278Q

    - by dentuzhik
    I've recently bought brand new ASUS PB278Q monitor. When trying to connect to my laptop, everything works great, except that I can't get the native resolution of my monitor (2560x1440) working. The automatic is 1920x1080. My graphic card is Nvidia GeForce 320m. Here's output from lspci for it: ~$ lspci | grep VGA 02:00.0 VGA compatible controller: NVIDIA Corporation GT216M [GeForce GT 320M] (rev a2) and also xrandr: ~$ xrandr Screen 0: minimum 8 x 8, current 3286 x 1437, maximum 8192 x 8192 VGA-0 disconnected (normal left inverted right x axis y axis) LVDS-0 connected primary 1366x768+0+669 (normal left inverted right x axis y axis) 344mm x 193mm 1366x768 60.0*+ HDMI-0 connected 1920x1080+1366+0 (normal left inverted right x axis y axis) 600mm x 340mm 1920x1080 60.0*+ 59.9 50.0 30.0 25.0 24.0 60.0 50.0 1680x1050 60.0 1440x900 59.9 1280x1024 75.0 60.0 1280x960 60.0 1280x800 59.8 1280x720 60.0 59.9 50.0 1152x864 75.0 1024x768 75.0 70.1 60.0 800x600 75.0 72.2 60.3 56.2 720x576 50.0 720x480 59.9 640x480 75.0 59.9 59.9 480x576 50.0 480x480 59.9 I have proprietary drivers installed on my machine, here's the info about the monitor from nvidia-settings (Actually I don't have enough reputation to post images, so here's the text): Chip Location: Internal Signal: TDMS Connection link: Single Native resolution: 2560x1440 Refresh rate: 60.00 Hz The monitor is connected to laptop via HDMI cable, and honestly I have no idea what version it is, and what version is my HDMI output of my graphics card. I tried to find how I can figure it out on the web, but had no luck. Also my video card has only VGA and HDMI outs so I can't test neither DVI-D cable nor DisplayPort. So apparently, there's some problem over there. At least I want to know exactly what's going on. I've tried to see if it a linux-specific problem, but windows also gave me the same resolution by default. What I've already tried: Connect through VGA (stupid one, of course it gave me 1920x1080). Checked two HDMI cables (not sure if they're the same or not, as mentioned above). Played around with xrandr and adding custom modes. Didn't help. Surfed for the info a lot on the web, but couldn't get appropriate results. 
Actually xrandr gives me the following: ~$ cvt 2560 1440 60 # 2560x1440 59.96 Hz (CVT 3.69M9) hsync: 89.52 kHz; pclk: 312.25 MHz Modeline "2560x1440_60.00" 312.25 2560 2752 3024 3488 1440 1443 1448 1493 -hsync +vsync ~$ xrandr --newmode "2560x1440_60.00" 312.25 2560 2752 3024 3488 1440 1443 1448 1493 -hsync +vsync ~$ xrandr Screen 0: minimum 8 x 8, current 3286 x 1437, maximum 8192 x 8192 VGA-0 disconnected (normal left inverted right x axis y axis) LVDS-0 connected 1366x768+0+669 (normal left inverted right x axis y axis) 344mm x 193mm 1366x768 60.0*+ HDMI-0 connected primary 1920x1080+1366+0 (normal left inverted right x axis y axis) 600mm x 340mm 1920x1080 60.0*+ 59.9 50.0 30.0 25.0 24.0 60.0 50.0 1680x1050 60.0 1440x900 59.9 1280x1024 75.0 60.0 1280x960 60.0 1280x800 59.8 1280x720 60.0 59.9 50.0 1152x864 75.0 1024x768 75.0 70.1 60.0 800x600 75.0 72.2 60.3 56.2 720x576 50.0 720x480 59.9 640x480 75.0 59.9 59.9 480x576 50.0 480x480 59.9 2560x1440_60.00 (0x34f) 312.2MHz h: width 2560 start 2752 end 3024 total 3488 skew 0 clock 89.5KHz v: height 1440 start 1443 end 1448 total 1493 clock 60.0Hz ~$ xrandr --addmode HDMI-0 2560x1440_60.00 X Error of failed request: BadMatch (invalid parameter attributes) Major opcode of failed request: 140 (RANDR) Minor opcode of failed request: 18 (RRAddOutputMode) Serial number of failed request: 29 Current serial number in output stream: 30 What I intend to do next: Try another HDMI cable? Try HDMI to DVI-D cable? Try HDMI to DisplayPort cable? Another type of adapters? VGA to DVI-D? Buy another laptop with another graphic card. Damn. My ideas pretty much end here. Any ideas? Any explanations why it isn't working are appreciated.

  • How do I capture a WinForm window to a bitmap without the caret

    - by Steve Dunn
    I've got window on a WinForm that I want to get the bitmap representation of. For this, I use the following code (where codeEditor is the control I want a bitmap representation of): public Bitmap GetBitmap( ) { IntPtr srcDC = NativeMethods.GetDC( codeEditor.Handle ) ; var bitmap = new Bitmap( codeEditor.Width, codeEditor.Height ) ; Graphics graphics = Graphics.FromImage( bitmap ) ; var deviceContext = graphics.GetHdc( ) ; bool blitted = NativeMethods.BitBlt( deviceContext, 0, 0, bitmap.Width, bitmap.Height, srcDC, 0, 0, 0x00CC0020 /*SRCCOPY*/ ) ; if ( !blitted ) { throw new InvalidOperationException( @"The bitmap could not be generated." ) ; } int result = NativeMethods.ReleaseDC( codeEditor.Handle, srcDC ) ; if ( result == 0 ) { throw new InvalidOperationException( @"Cannot release bitmap resources." ) ; } graphics.ReleaseHdc( deviceContext ) ; graphics.Dispose( ) ; The trouble is, this captures the caret if it's flashing in the window at the time of capture. I tried calling the Win32 method HideCaret before capturing, but it didn't seem to have any effect.
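
    HideCaret only takes effect when it is called with the window that actually owns the caret, and in a composite editor control that is often a focused child window rather than the container whose handle is being blitted. A hedged sketch of the idea in plain Win32/C++ (the helper name is invented; in the C# code this would translate into extra NativeMethods declarations for GetFocus and ShowCaret), assuming the capture runs on the UI thread:

        #include <windows.h>

        // Hide the caret on the window that owns it (the focused window of this
        // thread), blit, then show it again. Error handling omitted.
        void CaptureWithoutCaret(HWND editor, HDC destDC, int width, int height)
        {
            HWND focused = GetFocus();       // the caret lives in the focused window
            if (focused) HideCaret(focused);

            HDC srcDC = GetDC(editor);
            BitBlt(destDC, 0, 0, width, height, srcDC, 0, 0, SRCCOPY);
            ReleaseDC(editor, srcDC);

            if (focused) ShowCaret(focused);
        }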

  • Ubuntu 9.10 screen flickering problem on Thinkpad R31 with Intel 82830 graphics

    - by PA
    I am trying to revitalize an old Thinkpad R31 that has Intel 82830 graphics and only 256 MB of RAM. I have tried a Xubuntu 9.10 Live CD. After booting, the screen blinks so much that it is practically unreadable. I have searched for updated IBM Thinkpad drivers, but I only found drivers for Windows on the Thinkpad support web site. EDIT: I have changed the title and the description. It is not a driver problem; it seems to be a problem with the kernel. See my own answer below.

  • GeForce 9600GT support for dual monitor setup

    - by Theo
    I have an NVIDIA GeForce 9600 GT (Point of View, link) on an Asus M2V-MX motherboard. The card has one DVI, one HDMI and one S-Video output. The graphics card does not seem to support dual monitors: the driver control panel for the graphics card does not detect a second monitor attached via S-Video (which leads me to believe that it doesn't support dual monitors across DVI and the other connections). Would it be possible to use a VGA/DVI splitter to attach two monitors to the single DVI output on the graphics card? Would this allow a dual-monitor setup, or only mirroring? How do I know if the graphics card supports this? With this particular motherboard, would it be possible to use the onboard video for another monitor?

  • Selection between two laptops for casual gaming [closed]

    - by Prabhpreet
    I have selected two laptops that meet my budget. Here are the differences: Laptop #1: 4 GB RAM, Intel Core i5 2450M 2nd Gen processor w/ 2.5 GHz clock speed, NVIDIA GeForce GT 520MX DDR3 1 GB dedicated graphics, 750 GB SATA II hard disk, USB 2.0 ports, 6 hrs battery life. Laptop #2: 4 GB RAM, Intel Core i5 3210M 3rd Gen processor w/ 2.5 GHz clock speed, integrated Intel HD Graphics 4000, 500 GB SATA hard disk, USB 3.0 ports, 3 hrs battery life (this concerns me). First and foremost, does a dedicated graphics card matter for a casual gamer like me? Secondly, does the generation of the processors make a difference despite the same clock speed? Thirdly, do USB 3.0 ports make a difference? And lastly, which laptop is more future-proof? Please help me out. Thanks!

  • Drupal + Zen - not proper layout/no graphics

    - by Luma
    Hello, I installed Drupal (first using Fantastico, then starting over from scratch) and Drupal works great, except that when I set up the Zen theme (following the instructions on the website and in readme-first.txt) the theme does not show up properly. It has no graphics except for the tiny Drupal logo way at the bottom; the rest is text. I have uploaded the theme to the /public_html/sites/all/themes folder, and it does show up under Themes in the Administer section, so I selected enable and default. Permissions on this folder are 755. Any ideas? I checked the Drupal forums and someone is having the same problem, but no one replied (this was in April), so I figured I would have a better shot on this awesome website. Thanks, Luma
