Search Results

Search found 5806 results on 233 pages for 'graphics'.

Page 107 of 233

  • Unknown monitor in Lenovo laptop

    - by kumar
    I have a set of two Lenovo laptops on which Fedora Core 13 is installed. On one machine, the monitor is detected properly, such that it is possible to connect another monitor. But on the other laptop, the monitor is shown as an unknown monitor. I tried to fix it by reinstalling xorg-x11-drv-intel.i686, but the problem remains the same (unknown monitor), and it is not possible to connect another monitor with this setting. Laptop model: Lenovo G460. Graphics card: Intel Graphics Media Accelerator HD. Thanks!

    Read the article

  • How can I use 2 monitors plus the laptop display with my Dell E6420 w/ Nvidia NVS 4200M?

    - by KallDrexx
    I have just hooked up a second external monitor to my Dell E6420 laptop with an Nvidia NVS 4200M graphics card, running Windows 8 64-bit. However, the computer won't let me have both monitors and the laptop display active at the same time. I installed the latest Nvidia graphics drivers (310.70), but they claim that my GPU can only support up to 2 displays. Nvidia's website implies differently (as do various other laptops around the office). Both monitors are connected via DVI to my Dell docking station, which has multiple DVI ports. Both monitors are working correctly; I just can't get all 3 displays working together. Attempting to download the driver from Dell fails, as their driver installer is apparently broken. Any ideas?

    Read the article

  • Force anti-aliasing in a Direct3D game?

    - by James McLaughlin
    Some old games look really jagged nowadays on large displays without any anti-aliasing, but don't have any built-in option to enable it. On a PC with an NVIDIA graphics card, it's possible to force anti-aliasing in the NVIDIA control panel, which can really improve this. But I'm playing the game in Parallels on a Mac, and although the Mac has an NVIDIA graphics card, Windows sees Parallels' emulated card, so obviously there's no NVIDIA control panel. Is there some generic way I can force anti-aliasing for a Direct3D game without using the NVIDIA control panel?

    Read the article

  • java.lang.OutOfMemoryError: bitmap size exceeds VM budget

    - by Angel
    Hi, I am trying to change the layout of my application from portrait to landscape and vice versa, but if I do it frequently, or sometimes just more than once, my application crashes. Below is the error log. Please suggest what can be done.

        01-06 09:52:27.787: ERROR/dalvikvm-heap(17473): 1550532-byte external allocation too large for this process.
        01-06 09:52:27.787: ERROR/dalvikvm(17473): Out of memory: Heap Size=6471KB, Allocated=4075KB, Bitmap Size=9564KB
        01-06 09:52:27.787: ERROR/(17473): VM won't let us allocate 1550532 bytes
        01-06 09:52:27.798: DEBUG/skia(17473): --- decoder->decode returned false
        01-06 09:52:27.798: DEBUG/AndroidRuntime(17473): Shutting down VM
        01-06 09:52:27.798: WARN/dalvikvm(17473): threadid=3: thread exiting with uncaught exception (group=0x4001e390)
        01-06 09:52:27.807: ERROR/AndroidRuntime(17473): Uncaught handler: thread main exiting due to uncaught exception
        01-06 09:52:27.857: ERROR/AndroidRuntime(17473): java.lang.RuntimeException: Unable to start activity ComponentInfo{}: android.view.InflateException: Binary XML file line #2: Error inflating class
            at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2596)
            at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2621)
            at android.app.ActivityThread.handleRelaunchActivity(ActivityThread.java:3812)
            at android.app.ActivityThread.access$2300(ActivityThread.java:126)
            at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1936)
            at android.os.Handler.dispatchMessage(Handler.java:99)
            at android.os.Looper.loop(Looper.java:123)
            at android.app.ActivityThread.main(ActivityThread.java:4595)
            at java.lang.reflect.Method.invokeNative(Native Method)
            at java.lang.reflect.Method.invoke(Method.java:521)
            at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:860)
            at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:618)
            at dalvik.system.NativeStart.main(Native Method)
        Caused by: android.view.InflateException: Binary XML file line #2: Error inflating class
            at android.view.LayoutInflater.createView(LayoutInflater.java:513)
            at com.android.internal.policy.impl.PhoneLayoutInflater.onCreateView(PhoneLayoutInflater.java:56)
            at android.view.LayoutInflater.createViewFromTag(LayoutInflater.java:563)
            at android.view.LayoutInflater.inflate(LayoutInflater.java:385)
            at android.view.LayoutInflater.inflate(LayoutInflater.java:320)
            at android.view.LayoutInflater.inflate(LayoutInflater.java:276)
            at com.android.internal.policy.impl.PhoneWindow.setContentView(PhoneWindow.java:207)
            at android.app.Activity.setContentView(Activity.java:1629)
            at onCreate(Game.java:98)
            at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1047)
            at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2544)
            ... 12 more
        Caused by: java.lang.reflect.InvocationTargetException
            at android.widget.LinearLayout.<init>(LinearLayout.java:92)
            at java.lang.reflect.Constructor.constructNative(Native Method)
            at java.lang.reflect.Constructor.newInstance(Constructor.java:446)
            at android.view.LayoutInflater.createView(LayoutInflater.java:500)
            ... 22 more
        Caused by: java.lang.OutOfMemoryError: bitmap size exceeds VM budget
            at android.graphics.BitmapFactory.nativeDecodeAsset(Native Method)
            at android.graphics.BitmapFactory.decodeStream(BitmapFactory.java:464)
            at android.graphics.BitmapFactory.decodeResourceStream(BitmapFactory.java:340)
            at android.graphics.drawable.Drawable.createFromResourceStream(Drawable.java:697)
            at android.content.res.Resources.loadDrawable(Resources.java:1705)
            at android.content.res.TypedArray.getDrawable(TypedArray.java:548)
            at android.view.View.<init>(View.java:1850)
            at android.view.View.<init>(View.java:1799)
            at android.view.ViewGroup.<init>(ViewGroup.java:296)
            ... 26 more
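
    A common mitigation for this class of crash on pre-Honeycomb devices (where bitmap pixels live in an external allocation outside the Dalvik heap) is to decode bitmaps downsampled to the size actually displayed, so each orientation change does not re-decode full-size images. A minimal sketch, assuming a drawable resource and target dimensions of your choosing:

        import android.content.res.Resources;
        import android.graphics.Bitmap;
        import android.graphics.BitmapFactory;

        public final class BitmapUtil {
            // Decode a resource at roughly the requested size instead of full size,
            // keeping repeated orientation changes within the external bitmap budget.
            public static Bitmap decodeSampled(Resources res, int resId,
                                               int reqWidth, int reqHeight) {
                BitmapFactory.Options opts = new BitmapFactory.Options();
                opts.inJustDecodeBounds = true;   // first pass: read dimensions only
                BitmapFactory.decodeResource(res, resId, opts);

                int sample = 1;                   // power-of-two subsampling factor
                while (opts.outWidth / (sample * 2) >= reqWidth
                        && opts.outHeight / (sample * 2) >= reqHeight) {
                    sample *= 2;
                }
                opts.inJustDecodeBounds = false;
                opts.inSampleSize = sample;       // second pass: decode the pixels
                return BitmapFactory.decodeResource(res, resId, opts);
            }
        }

    Calling recycle() on bitmaps you own in onDestroy() also helps, since the relaunched activity otherwise competes with the old instance's not-yet-collected pixel buffers.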

    Read the article

  • Android: Programmatically Add UI Elements to a View

    - by Shivan Raptor
    My view is written as follows:

        package com.mycompany;

        import java.util.ArrayList;
        import java.util.concurrent.TimeUnit;

        import android.content.Context;
        import android.graphics.Canvas;
        import android.graphics.Color;
        import android.graphics.Paint;
        import android.graphics.Point;
        import android.hardware.Sensor;
        import android.hardware.SensorEvent;
        import android.hardware.SensorEventListener;
        import android.hardware.SensorManager;
        import android.util.AttributeSet;
        import android.view.View;
        import android.widget.*;

        public class GameEngineView extends View implements SensorEventListener {
            GameLoop gameloop;
            String txt_acc;
            float accY;
            ArrayList<Point> bugPath;
            private SensorManager sensorManager;

            private class GameLoop extends Thread {
                private volatile boolean running = true;

                public void run() {
                    while (running) {
                        try {
                            TimeUnit.MILLISECONDS.sleep(1);
                            postInvalidate();
                            pause();
                        } catch (InterruptedException ex) {
                            running = false;
                        }
                    }
                }

                public void pause() { running = false; }

                public void start() {
                    running = true;
                    run();
                }

                public void safeStop() {
                    running = false;
                    interrupt();
                }
            }

            public void unload() { gameloop.safeStop(); }

            public GameEngineView(Context context, AttributeSet attrs, int defStyle) {
                super(context, attrs, defStyle);
                init(context);
            }

            public GameEngineView(Context context, AttributeSet attrs) {
                super(context, attrs);
                init(context);
            }

            public GameEngineView(Context context) {
                super(context);
                init(context);
            }

            private void init(Context context) {
                txt_acc = "";
                // Adding SENSOR
                sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
                // add listener. The listener will be this class
                sensorManager.registerListener(this,
                        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                        SensorManager.SENSOR_DELAY_NORMAL);

                // Adding UI Elements : How ?
                Button btn_camera = new Button(context);
                btn_camera.setLayoutParams(new LinearLayout.LayoutParams(
                        LinearLayout.LayoutParams.FILL_PARENT,
                        LinearLayout.LayoutParams.FILL_PARENT));
                btn_camera.setClickable(true);
                btn_camera.setOnClickListener(new OnClickListener() {
                    @Override
                    public void onClick(View v) {
                        System.out.println("clicked the camera.");
                    }
                });

                gameloop = new GameLoop();
                gameloop.run();
            }

            @Override
            protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
                //super.onMeasure(widthMeasureSpec, heightMeasureSpec);
                System.out.println("Width " + widthMeasureSpec);
                setMeasuredDimension(widthMeasureSpec, heightMeasureSpec);
            }

            @Override
            protected void onDraw(Canvas canvas) {
                // super.onDraw(canvas);
                Paint p = new Paint();
                p.setColor(Color.WHITE);
                p.setStyle(Paint.Style.FILL);
                p.setAntiAlias(true);
                p.setTextSize(30);
                canvas.drawText("|[ " + txt_acc + " ]|", 50, 500, p);
                gameloop.start();
            }

            public void onAccuracyChanged(Sensor sensor, int accuracy) { }

            public void onSensorChanged(SensorEvent event) {
                if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
                    //float x = event.values[0];
                    accY = event.values[1];
                    //float z = event.values[2];
                    txt_acc = "" + accY;
                }
            }
        }

    I would like to add a Button to the scene, but I don't know how. Can anybody shed some light on this?

    UPDATE: Here is my Activity:

        public class MyActivity extends Activity {
            private GameEngineView gameEngine;

            @Override
            public void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                // add Game Engine
                gameEngine = new GameEngineView(this);
                setContentView(gameEngine);
                gameEngine.requestFocus();
            }
        }
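
    A custom View cannot host child widgets on its own; the usual pattern is to let the Activity wrap the View in a ViewGroup and add the widgets there. A minimal sketch under that assumption (button text and placement are illustrative):

        import android.app.Activity;
        import android.os.Bundle;
        import android.view.Gravity;
        import android.widget.Button;
        import android.widget.FrameLayout;

        public class MyActivity extends Activity {
            @Override
            public void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);

                // The game view draws itself; the FrameLayout layers widgets on top.
                GameEngineView gameEngine = new GameEngineView(this);
                FrameLayout root = new FrameLayout(this);
                root.addView(gameEngine);

                Button cameraButton = new Button(this);
                cameraButton.setText("Camera");
                root.addView(cameraButton, new FrameLayout.LayoutParams(
                        FrameLayout.LayoutParams.WRAP_CONTENT,
                        FrameLayout.LayoutParams.WRAP_CONTENT,
                        Gravity.BOTTOM | Gravity.RIGHT));

                setContentView(root);  // the composition becomes the content view
                gameEngine.requestFocus();
            }
        }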

    Read the article

  • Problem with Intel 4 Series chipset and CentOS dealing with dual head

    - by Antoine
    I have a Fujitsu Lifebook S7220, and I have been trying for a while to configure it to use a dual head setup with CentOS 5.4 x86_64. Every time I try, the X server crashes... I have an Intel Mobile 4 Series chipset (GMA 4500MHD, if I recall correctly). When I do an lspci -v I get this:

        00:02.0 VGA compatible controller: Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller (rev 07) (prog-if 00 [VGA controller])
                Subsystem: Fujitsu Limited. Unknown device 1451
                Flags: bus master, fast devsel, latency 0, IRQ 177
                Memory at f2000000 (64-bit, non-prefetchable) [size=4M]
                Memory at d0000000 (64-bit, prefetchable) [size=256M]
                I/O ports at 1800 [size=8]
                Capabilities: [90] Message Signalled Interrupts: 64bit- Queue=0/0 Enable-
                Capabilities: [d0] Power Management version 3

        00:02.1 Display controller: Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller (rev 07)
                Subsystem: Fujitsu Limited. Unknown device 1451
                Flags: bus master, fast devsel, latency 0
                Memory at f2400000 (64-bit, non-prefetchable) [size=1M]
                Capabilities: [d0] Power Management version 3

    My question is: has anyone else run into this problem, and how did you fix it? Thank you for your answer!

    Read the article

  • HDMI connection does not support HDCP

    - by mroggi
    Hi. My problem: I get an error message when playing Blu-ray movies stating that the HDCP encryption could not be established.

    My setup:

      - A new projector (Epson EMP-TW700) with an HDCP-compliant HDMI port
      - A PC with a brand-new graphics adapter (Sapphire HD 4350 512MB DDR2) supporting HDCP
      - The connection is made with a DVI cable (it's installed in my wall) and a DVI-to-HDMI adapter at the projector end
      - Latest drivers and software

    My questions:

      1. What can I do to establish the HDCP connection?
      2. Would it help to use the HDMI output of the graphics adapter instead of the DVI output (could it be that the HDCP chip is only supported on HDMI)?
      3. Any other ideas?

    I am very thankful for any hint.

    Read the article

  • What are my X client options for MS Windows?

    - by Nick Bolton
    I need to connect to a headless X server (running on Ubuntu) from my MS Windows 7 computer over a 100 Mbit network. I could use VNC (or any other remote viewer), but I imagine the 3D graphics performance would be lousy. I used to have the machine hooked up to a monitor, but that's broken now and I can't afford a new one. A friend advised that I could use an X client, and that 3D graphics won't suffer too much over 100 Mbit. Cygwin seems to be an option, but I was wondering if there were any more lightweight options.
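
    For reference, the typical Cygwin/X workflow (a sketch; the host name and test program are placeholders) runs the X server on the Windows side and tunnels X11 over SSH:

        # In a Cygwin terminal on the Windows 7 machine:
        startxwin &                          # start the Cygwin/X server
        export DISPLAY=:0.0                  # point clients at it
        ssh -Y user@ubuntu-host glxgears     # -Y forwards X11 back over SSH

    One caveat worth knowing: GLX over X11 forwarding generally means indirect rendering, so heavy 3D can still disappoint; lighter alternatives to full Cygwin include standalone Windows X servers such as Xming.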

    Read the article

  • Is there any way I can use two monitors in the console in Linux?

    - by Alex
    I have recently become the proud owner of two monitors in my workspace. (Ok not owner, but you know what I mean) and I'd like to use both of them at once. Problem is, I much much prefer to use a Linux Server console over a desktop environment. The graphics card on the machine is a GTX295 (don't ask why, it's a long story.) so I essentially have two graphics cards. Each has a DVI output. Is there any way I can get the console to stretch across two screens? Or will I have to install a desktop Ubuntu for this to work?

    Read the article

  • Motherboard issue: 3-beep BIOS (memory error) despite new RAM

    - by Glenn
    I have an Intel DG43RK motherboard, bought new and sealed, and have tried two different brands and speeds of RAM; with both I get the 3-beep BIOS code indicating a memory error, which also occurs without RAM installed (as it should). The memory tried is:

      - 1x4GB Kingston HyperX DDR3 1333 (new and sealed)
      - 2x4GB Team Elite DDR3 1066 (new and sealed)

    I have tried multiple configurations and seating layouts and still no luck. I also have a GT 520 graphics card installed, as I dislike integrated graphics in most cases and had it at hand (also new and sealed). The only used parts are the CPU, which worked in my previous tower and was taken directly from that PC into the new setup, and the CPU fan, which will be replaced with a new one in the foreseeable future once this is resolved. I've run out of ideas myself and any help is appreciated.

    Read the article

  • Resolution of monitor is not supported by motherboard

    - by Sandesh
    I have a desktop with a Pentium 4 on an Intel 945 chipset board, dual-booting Windows 7 and Ubuntu 10.10 (no graphics card). Recently I purchased a Dell IN2020M 20" monitor with a native resolution of 1600x900, but my display allows a maximum of 1024x768. Because of this, when I play any video in full-screen mode it doesn't play smoothly and frames refresh jerkily. I have tried updating my VGA driver, but it doesn't help much. Is there any way to solve this problem?

      1. If I want to replace the monitor, what maximum resolution should I buy?
      2. If I want to upgrade my desktop (graphics card/motherboard), what is the minimum configuration to support the current system?

    Thanks in advance.

    Read the article

  • How do you reset the range of available ports that libvirt autoport can use?

    - by bcmcfc
    Libvirt is using its autoport setting to automatically allocate ports within a range starting at 5900. Example excerpt from an XML configuration for a VM:

        <graphics type='spice' port='6000' autoport='yes' listen='127.0.0.1' keymap='en-gb'>
          <listen type='address' address='127.0.0.1'/>
        </graphics>

    Currently, there are free ports at various points within the range 5900 to 5999. However, newly booted VMs are picking up ports from 6000 onwards. I need it to reuse the available ports in the 59xx range. Is this possible? If so, how do I do it? The problem arose because VMs are being accessed via WebSockets, and libvirt tried to use 6000, which is a reserved port for X11. A solution that explains how to blacklist ports from being picked up by autoport would also be sufficient.
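
    For QEMU/KVM guests, the range autoport draws from is governed by libvirt's qemu driver configuration; a sketch of the relevant knobs in /etc/libvirt/qemu.conf (the values here are illustrative, and libvirtd must be restarted to pick them up):

        # /etc/libvirt/qemu.conf
        remote_display_port_min = 5900   # lowest port autoport may assign
        remote_display_port_max = 5999   # cap allocations below 6000 (X11)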

    Read the article

  • Change MacOS X guest screen resolution for VirtualBox

    - by Pymoo
    I have tried all the alternatives and resources I found on the internet to change the screen resolution in my MacOS X guest, without success. I have the latest VirtualBox version (4.1.22) and MacOS X 10.6.3 Snow Leopard running in a VM guest. Some solutions that don't work for me are:

    Tuning virtual machine settings: adding the corresponding ExtraDataItem entries in the .vbox file, or running these two commands:

        vboxmanage setextradata "MAC OS X" "CustomVideoMode1" "1360x768x32"
        vboxmanage setextradata "MAC OS X" "GUI/CustomVideoMode1" "1360x768x32"

    Editing the guest OS boot configuration: modifying /Library/Preferences/SystemConfiguration/com.apple.boot.plist with these lines:

        <key>Kernel Flags</key>
        <string>"Graphics Mode"="1360x768x32"</string>
        <key>Graphics Mode</key>
        <string>1360x768x32</string>

    Any other suggestion, or something I have missed? Thanks in advance.
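
    One avenue the list doesn't cover: Mac OS X guests boot through VirtualBox's EFI, which ignores the BIOS-style CustomVideoMode entries, and the EFI video mode has its own extradata key. A sketch from memory of the 4.x-era mechanism (note that 1360x768 is not among the predefined GOP modes, so the nearest supported mode has to do, with a matching Graphics Mode entry in com.apple.boot.plist):

        VBoxManage setextradata "MAC OS X" VBoxInternal2/EfiGopMode 4
        # modes 0-5 select 640x480, 800x600, 1024x768, 1280x1024, 1440x900, 1920x1200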

    Read the article

  • How to prevent dual-booted OSes from damaging each other?

    - by user1252434
    For better compatibility and performance in games, I'm thinking about installing Windows in addition to Linux. I have security concerns about this, though. Note: "Windows" in the remaining text includes not only the OS but also any software running on it, regardless of whether it comes included or is additionally installed, and whether it is started intentionally or unintentionally (virus, malware). Is there an easy way to achieve the following requirements:

      - Windows MUST NOT be able to kill my Linux partition or my data disk: neither single files (virus infection) nor overwriting the whole disk.
      - Windows MUST NOT be able to read the data disk (extra protection against spyware).
      - Linux may or may not have access to the Windows partition.
      - Both Linux and Windows should have full access to the graphics card. This rules out desktop VM solutions for gaming; I want the manufacturer's Windows graphics card driver.

    Regarding Windows being unable to destroy my Linux install: this is not just the usual paranoia; it has happened to me in the past. So I don't accept "no ext4 driver" as an argument. Once bitten, twice shy. And even if destruction targeted at specific (Linux) files is nearly impossible, there should be no way to shred the whole partition. I may accept the risk of malware breaking out of a barrier (e.g. a VM) around the whole Windows box, though. Currently I have a system disk (SSD) and a data disk (HDD), both SATA. I expect I'll have to add another disk; if I don't, even better. My CPU is an Intel Core i5, with VT-x and VT-d available, though untested. Ideas I've had so far:

      - Deactivate or hide the other HDs until reboot. Is this possible at a low level? Can the boot loader (GRUB) do this for me?
      - A thin VM layer: load Windows in a VM that provides access to almost all hardware, except the HDs. Is there any ready-made software solution for this? Preferably free. As I said, the main problem seems to be providing full access to the graphics card.
      - A hardware switch to cut power to the disks. Commercial products are expensive, and there are lots of warnings against cheap home-built solutions. Preferably all three hard disks with one switch (one push). Mobile racks: won't the wear of daily swapping be a problem?

    Read the article

  • Windows freezes, showing a random color or pattern

    - by Manu
    I have a PC with Windows 7, and I'm experiencing random freezes with increasing frequency. After some time (from 30 minutes to a few hours) the screen shows a random color or pattern (vertical lines) and nothing works anymore; I need to reboot manually. I've checked the event log, but nothing is shown except for the unexpected reboot. I've tested the RAM with Memtest86+ 4.20 and the graphics card with FurMark 1.10.3; no errors were found. I've updated the graphics drivers and opened the case to remove dust, but the issue is still there. Also, the problem doesn't seem to arise when I'm playing games fullscreen, but rather when I'm surfing the web, using iTunes, or coding. My hardware is as follows: Intel Core i5 750 CPU, ATI Radeon HD 5670 graphics card, ASUSTeK P7P55D motherboard, two 2GB Kingston DDR3 RAM sticks, 2 SATA hard drives, and a Netgear dongle for WiFi.

    Read the article

  • Is a memory upgrade a viable option to fix performance issues? [closed]

    - by ratchet freak
    I'm currently seeing my PC get bogged down by Firefox 11.0 alone, with only one hundred tabs open, resulting in memory use of over 530 MB, a VM size of over 800 MB, and an insane number of page faults (easily reaching 100 million over the course of the day). The PF delta during normal operation easily reaches 7k, with peaks to 15k and sometimes over 20k. This leads to a (real) deterioration in response time when switching, opening, and closing tabs, opening menus, typing, etc. My question is: am I right in assuming that plugging in more RAM (either adding 2x1GB or replacing the existing RAM with 2x2GB or 4x1GB) will solve this problem? My specs:

      - Windows XP Home Edition SP3 (32-bit)
      - Intel Core Duo 2.4 GHz, 4MB unified cache
      - 2x512MB RAM, 800MHz DDR2 (dual channel)
      - 320GB HDD
      - Intel G33 (X3100) onboard graphics (no graphics card, but a PCI Express x16 slot is available)

    Read the article

  • Fedora 16 Running Hot

    - by sdasdadas
    Since switching from Windows 7 to Fedora 16, my laptop has been running incredibly hot (by the air exhaust). The laptop is an Asus K73S. Running 'sensors', I receive:

        acpitz-virtual-0:   75.0 celsius
        nouveau-pci-0100:   66.0 celsius
        asus-isa-0000:      75.0 celsius

    The only CPU hog is Firefox, at 30-40% on average. My GPU information (from lspci) is: Intel Corporation Xeon E3-1200/2nd Generation Core Processor Family Integrated Graphics Controller (rev 09). Running lspci | grep -i VGA returns:

        00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: nVidia Corporation GF106 [GeForce GT 555M SDDR3] (rev a1)

    I don't notice a huge difference running without the battery, but it does seem a little cooler. Thanks!

    Read the article

  • Why am I having trouble loading Ubuntu alongside Windows as an application?

    - by STEVE PEAVEY
    I have two good CD ISO files. Both load OK, but when I boot into Ubuntu the screen is fragmented by dozens of white lines. The program works but is useless. I'm running Windows XP SP3 on a D201GLY motherboard, with a Celeron 220 CPU at 1.02 GHz and 512 MB RAM. What could be my problem? CPU? Not enough RAM? Or maybe even the graphics card? To be clearer: I am trying to load either Ubuntu 8.04 or 9.04 inside Windows as an application from known-good CDs, using the Wubi installer included on the CDs. Graphics: SiS Mirage (SiS 662) with a 32 MB video processor.

    Read the article

  • mapping rect in small image to larger image (in order to do a copyPixels operation)

    - by skinnyTOD
    Hi all - this is (I think) a relatively simple math question, but I've spent a day banging my head against it and have only the dents and no solution... I'm coding in ActionScript 3. The functionality is:

      1. A large image is loaded at runtime. Its bitmapData is stored and a smaller version is created to display in the available screen area (I may end up just scaling the large image since it is in memory anyway).
      2. The user can create a rectangle hotspot on the smaller image (the functionality will be more complex later: multiple rects with transparency, for example a donut shape with a hole, etc.).
      3. When the user clicks on the hotspot, the rect of the hotspot is mapped to the larger image and a new bitmap "callout" is created using the larger bitmap data. The reason for this is so the "callout" will be better quality than just scaling up the area of the hotspot.

    The image below shows where I am at so far; the blue rect is the clicked hotspot. In the upper left is the "callout", copied from the larger image. I have the aspect ratio right but I am not mapping to the larger image correctly. Ugly code below... Sorry this post is so long, I just figured I ought to provide as much info as possible. Thanks for any tips!

    Trace of my data values:

        source BitmapData 1152 864
        scaled to rect 800 600
        scaled BitmapData 800 600
        selection BitmapData 58 56
        scaled selection 83 80
        ratio 1.44
        before (x=544, y=237, w=58, h=56)
        (x=544, y=237, w=225.04, h=217.28)

    Image here: http://i795.photobucket.com/albums/yy237/skinnyTOD/exampleST.jpg

        public function onExpandCallout(event:MouseEvent):void {
            if (maskBitmapData.getPixel32(event.localX, event.localY) != 0) {
                var maskClone:BitmapData = maskBitmapData.clone();
                // amount to scale callout - this will vary / can be changed by user
                var scale:Number = 150; // scale percentage
                var normalizedScale:Number = scale /= 100;
                var w:Number = maskBitmapData.width * normalizedScale;
                var h:Number = maskBitmapData.height * normalizedScale;
                var ratio:Number = (sourceBD.width / targetRect.width);

                // create bmpd of the scaled size to copy source into
                var scaledBitmapData:BitmapData = new BitmapData(
                    maskBitmapData.width * ratio, maskBitmapData.height * ratio,
                    true, 0xFFFFFFFF);

                trace("source BitmapData " + sourceBD.width, sourceBD.height);
                trace("scaled to rect " + targetRect.width, targetRect.height);
                trace("scaled BitmapData", bkgnImageSprite.width, bkgnImageSprite.height);
                trace("selection BitmapData", maskBitmapData.width, maskBitmapData.height);
                trace("scaled selection", scaledBitmapData.width, scaledBitmapData.height);
                trace("ratio", ratio);

                var scaledBitmap:Bitmap = new Bitmap(scaledBitmapData);
                var scaleW:Number = sourceBD.width / scaledBitmapData.width;
                var scaleH:Number = sourceBD.height / scaledBitmapData.height;
                var scaleMatrix:Matrix = new Matrix();
                scaleMatrix.scale(ratio, ratio);

                var sRect:Rectangle = maskSprite.getBounds(bkgnImageSprite);
                var sR:Rectangle = sRect.clone();
                var ss:Sprite = new Sprite();
                ss.graphics.lineStyle(8, 0x0000FF);
                //ss.graphics.beginFill(0x000000, 1);
                ss.graphics.drawRect(sRect.x, sRect.y, sRect.width, sRect.height);
                //ss.graphics.endFill();
                this.addChild(ss);

                trace("before " + sRect);
                w = uint(sRect.width * scaleW);
                h = uint(sRect.height * scaleH);
                sRect.inflate(maskBitmapData.width * ratio, maskBitmapData.height * ratio);
                sRect.offset(maskBitmapData.width * ratio, maskBitmapData.height * ratio);
                trace(sRect);

                scaledBitmapData.copyPixels(sourceBD, sRect, new Point());
                addChild(scaledBitmap);
                scaledBitmap.x = offsetPt.x;
                scaledBitmap.y = offsetPt.y;
            }
        }
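
    For what it's worth, the proportional mapping the post is chasing usually needs nothing more than scaling every component of the selection rect by the same ratio; a sketch against the names used above (the inflate/offset calls are what throw the rect off):

        // Map the hotspot rect from display coordinates into source-bitmap
        // coordinates: scale the origin as well as the size by `ratio`.
        var srcRect:Rectangle = new Rectangle(
            sR.x * ratio,
            sR.y * ratio,
            sR.width * ratio,
            sR.height * ratio);
        scaledBitmapData.copyPixels(sourceBD, srcRect, new Point());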

    Read the article

  • Can't delete folder in Windows 7

    - by user18526
    I'm trying to delete a folder in Windows 7 and get a perplexing error message: "Could not find this item: This is no longer located in G:\Graphics. Verify the item's location and try again." I can see the folder; I can find it. I just can't delete it. I also get a second error message (sometimes) when I click on the folder: "G:\Graphics 2009-11-17 refers to a location that is unavailable... this information might have been moved to a different location." I'm using Windows 7; this folder is on an external hard drive. I've emptied the folder (there were items in it) and I've scanned that external hard drive for errors. Trying to rename the folder yields the same enigmatic error message. Is there a way to delete this folder?

    Read the article

  • My computer will not turn on

    - by user269120
    I recently built a gaming PC. It was working fine until the following events: I installed a graphics card driver update; then, in Windows, it said that the Windows Experience Index needed to be refreshed, so I started the test, and somewhere in the middle of the test my PC just switched off. No shutting down, it just stopped. Now it won't turn back on. I have checked that it's plugged in and that the switch on the PSU is down; I tried a different power cable and I checked all the connections. When I press the power button nothing happens: no fans, no lights, no POST beep. Computer parts:

      - Motherboard: Gigabyte GA-78LMT-USB3
      - CPU: AMD FX-6350 @ 3.9 GHz
      - RAM: 2x4GB Crucial Ballistix Sport
      - Power supply: Tesla 750W PSU
      - Graphics card: XFX Radeon 7870 DD
      - Case: CiT Vantage R gaming case
      - Hard drive: 2TB Western Digital Caviar Green

    Please help me; this computer is only a week old since I built it. All answers are appreciated :)

    Read the article

  • SDL2 sprite batching and texture atlases

    - by jms
    I have been programming a 2D game in C++, using the SDL2 graphics API for rendering. My game concept currently features effects that could result in even tens of thousands of sprites being drawn to the screen simultaneously. I'd like to know what can be done to increase rendering efficiency if the need arises, preferably using the SDL2 API only. I previously took a quick look at OpenGL-based 2D rendering, and noticed that SDL2 lacks a command like

        int SDL_RenderCopyMulti(SDL_Renderer* renderer, SDL_Texture* texture,
                                const SDL_Rect* srcrects, SDL_Rect* dstrects, int count)

    which would permit SDL to benefit from two common techniques for efficient 2D graphics:

      - Texture batching: sorting sprites by the texture used, and then simultaneously rendering as many sprites that use the same texture as possible, changing only the source area on the texture and the destination area on the render target between sprites. This allows the whole operation to be encapsulated in a single GPU command, reducing the overhead drastically compared to multiple distinct calls.
      - Texture atlases: instead of creating one texture for each frame of each animation of each sprite, combining multiple animations and even multiple sprites into a single large texture. This lessens the impact of changing the current texture when switching between sprites, as the correct texture is often already in use from the previous draw call. Furthermore, the GPU is optimized for handling large textures, in contrast to the many tiny textures typically used for sprites.

    My question: would SDL2 still get somewhat faster from rudimentary sprite sorting, or from combining multiple images into one texture, thanks to automatic video driver optimizations? If I encounter performance issues related to 2D rendering in the future, will I be forced to switch to OpenGL for lower-level control over the GPU?

    Edit: Are there any plans to include such functionality in the near future?
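
    Even without a batch call, the sorting half of the idea can be tested against the stock API; a minimal sketch (the sprite struct is illustrative, not part of SDL):

        #include <SDL.h>
        #include <stdlib.h>

        typedef struct {
            SDL_Texture *texture;  /* atlas page this sprite lives on */
            SDL_Rect src;          /* sub-rectangle within that texture */
            SDL_Rect dst;          /* destination on the render target */
        } Sprite;

        /* Order sprites by texture pointer so consecutive draws share state. */
        static int by_texture(const void *a, const void *b) {
            const Sprite *sa = a, *sb = b;
            return (sa->texture > sb->texture) - (sa->texture < sb->texture);
        }

        void render_sprites(SDL_Renderer *renderer, Sprite *sprites, size_t count) {
            qsort(sprites, count, sizeof(Sprite), by_texture);
            for (size_t i = 0; i < count; i++) {
                /* Reusing the previous call's texture gives the backend its
                   best chance to batch the draws internally. */
                SDL_RenderCopy(renderer, sprites[i].texture,
                               &sprites[i].src, &sprites[i].dst);
            }
        }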

    Read the article

  • Is CUDA, cuBLAS or cuBLAS-XT the right place to start with for machine learning?

    - by Stefan R. Falk
    I am not sure if this is the right forum to post this question, but it surely is no question for Stack Overflow. I am working on my bachelor thesis, and for it I am implementing a so-called Echo State Network, which basically is an artificial neural network that has a large reservoir of randomly initialized neurons and just a few input and output neurons... but I think we can skip that. The thing is, there is a Python library called Theano which I am using for this implementation. It encapsulates the CUDA API and offers a quite "comfortable" way to access the power of an NVIDIA graphics card. CUDA ships with a library called cuBLAS (Basic Linear Algebra Subprograms) for linear algebra operations, and since CUDA 6.0 there is also cuBLAS-XT, an extension which allows calculations to be run on multiple graphics cards. My question at this point is whether it would make sense to start using cuBLAS and/or cuBLAS-XT right now, since the API is quite complex, or rather to wait for libraries that will build on top of them (as Theano does on basic CUDA)? If you think this is the wrong place for this question, please tell me which one is; thank you.
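
    For a sense of what "quite complex" means in practice, a minimal cuBLAS (v2 API) single-precision matrix multiply looks roughly like this; matrices are column-major and error checking is omitted for brevity:

        #include <cuda_runtime.h>
        #include <cublas_v2.h>

        int main(void) {
            const int n = 2;
            float A[4] = {1, 2, 3, 4};   /* column-major 2x2 */
            float B[4] = {5, 6, 7, 8};
            float C[4] = {0};
            const float alpha = 1.0f, beta = 0.0f;

            /* Move operands to device memory. */
            float *dA, *dB, *dC;
            cudaMalloc((void **)&dA, sizeof A);
            cudaMalloc((void **)&dB, sizeof B);
            cudaMalloc((void **)&dC, sizeof C);
            cudaMemcpy(dA, A, sizeof A, cudaMemcpyHostToDevice);
            cudaMemcpy(dB, B, sizeof B, cudaMemcpyHostToDevice);

            cublasHandle_t handle;
            cublasCreate(&handle);
            /* C = alpha * A * B + beta * C */
            cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n,
                        &alpha, dA, n, dB, n, &beta, dC, n);
            cudaMemcpy(C, dC, sizeof C, cudaMemcpyDeviceToHost);

            cublasDestroy(handle);
            cudaFree(dA); cudaFree(dB); cudaFree(dC);
            return 0;
        }

    Whether to adopt it now or wait mostly trades this boilerplate against Theano's abstractions; cuBLAS-XT adds multi-GPU tiling on top of essentially the same call shapes.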

    Read the article
