Search Results

Search found 18794 results on 752 pages for 'graphics design'.

  • Laptop with Intel Graphics: external VGA monitor only gets signal on boot (no "hot plugging")

    - by iveand
    I am able to get an external VGA monitor (or projector) to work if I start my laptop with it connected. However, if I start the laptop with it disconnected, there is no signal on the external display. The Displays screen shows the external monitor and thinks that it is active, but there is no signal being sent to it. This has been a persistent problem since 10.04 (I am now on 12.04, each upgrade hoping something is improved).

    I should note that even when it works (starting with the display connected), Displays still says the monitor is "unknown" (but it sends the signal). For the correct resolution to display, I have had to add a few xrandr lines for my monitor to my .xprofile file; otherwise resolution is limited to the default 1024x768. So resolution issues can be worked around, but the main issue is that the external display doesn't get a signal unless the machine is started with it connected.

    I have tried:

        - adding i915.modeset=1 to grub (also i965.modeset=1, since someone posted that this helped even though lshw shows i915)
        - adding the following PPA and doing a dist-upgrade: sudo add-apt-repository ppa:xorg-edgers/ppa

    Here are the details. Laptop: Toshiba Tecra M10. lspci listing for video:

        00:02.0 VGA compatible controller [0300]: Intel Corporation Mobile 4 Series Chipset Integrated Graphics Controller [8086:2a42] (rev 07)

    sudo lshw -C video listing:

        *-display:0
             description: VGA compatible controller
             product: Mobile 4 Series Chipset Integrated Graphics Controller
             vendor: Intel Corporation
             physical id: 2
             bus info: pci@0000:00:02.0
             version: 07
             width: 64 bits
             clock: 33MHz
             capabilities: msi pm vga_controller bus_master cap_list rom
             configuration: driver=i915 latency=0
             resources: irq:46 memory:ff400000-ff7fffff memory:e0000000-efffffff ioport:cff8(size=8)
        *-display:1 UNCLAIMED
             description: Display controller
             product: Mobile 4 Series Chipset Integrated Graphics Controller
             vendor: Intel Corporation
             physical id: 2.1
             bus info: pci@0000:00:02.1
             version: 07
             width: 64 bits
             clock: 33MHz
             capabilities: pm bus_master cap_list
             configuration: latency=0
             resources: memory:ffc00000-ffcfffff

    "System Info" shows my graphics as the following: Mobile Intel® GM45 Express Chipset x86/MMX/SSE2

  • Graphics performance of 945GME

    - by l0b0
    Edit: Since setting Appearance - Visual Effects up to a stunning "Normal", I now get ~35 FPS in glxgears right after login, with nothing else running :(

    I'm getting terrible graphics performance in NeverWinter Nights (native, with SoU+HotU+CEP2) on my Eee PC 1005HAB. Even with all graphics settings (including the "advanced" ones) at minimum I get about 2-10 FPS, depending on the scene. Firefox is really sluggish as well: changing tabs often takes a second, scrolling is laggy, and typing this I notice the delay between pressing keys and seeing the text on screen. The rest of the OS is running OK, although general performance seems to be even worse than my old Eee PC 900. glxgears gives about 60 FPS, which is apparently as it should be (synchronized with the monitor refresh rate).

    Bugs like Launchpad #252094 and the instructions for "Reverting the Jaunty Xorg intel driver to 2.4" are old enough that I'm afraid following the instructions would render the system unusable. Are there any tips for improving graphics performance on this system that are still relevant for 10.10?

        $ uname -a
        Linux l0b0eee 2.6.35-28-generic #49-Ubuntu SMP Tue Mar 1 14:40:58 UTC 2011 i686 GNU/Linux
        $ lspci -nn | grep VGA
        00:02.0 VGA compatible controller [0300]: Intel Corporation Mobile 945GME Express Integrated Graphics Controller [8086:27ae] (rev 03)
        $ glxinfo
        name of display: :0.0
        display: :0  screen: 0
        direct rendering: Yes
        server glx vendor string: SGI
        server glx version string: 1.4
        ...

  • 3D Vector "End Point" Calculation for procedural Vector Graphics

    - by FrostFlame64
    Alright, so I need some help with some vector math. I'm developing some game engines that use procedural fractal generation for some graphics, such as Lindenmayer systems for generating trees and plants. L-systems are drawn using turtle graphics, which is a form of vector graphics. I first created a system to draw in 2D, which works perfectly fine, but now I want to make a 3D equivalent and I've run into an issue.

    For my 2D version, I created a method for quickly determining the "end point" of a vector-like movement. Given a starting point (X, Y), a direction (between 0 and 360 degrees), and a distance, the end point is calculated by these formulas:

        newX = startX + distance * Sin((PI * direction) / 180)
        newY = startY + distance * Cos((PI * direction) / 180)

    Now I need a similar equivalent for performing this calculation in 3D, but I haven't been able to Google anything that shows how to do it. I'm flexible about what form the required input takes (Vector3, quaternion, etc.).

    To summarize: given a starting position in 3D space (X, Y, Z), a direction in 3D space (Vector3, quaternion, etc.), and a distance, I need to find the "end point" in 3D space. Thank you for your time and help.
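
    A minimal sketch of one way to do this, in C# with System.Numerics; the names and the angle convention are assumptions, chosen so the angle version reduces to the 2D formulas above when the pitch is zero:

        using System;
        using System.Numerics;

        static class TurtleMath
        {
            // If the direction is already a vector, the end point is just
            // start + distance * normalized direction.
            public static Vector3 EndPoint(Vector3 start, Vector3 direction, float distance)
                => start + distance * Vector3.Normalize(direction);

            // If the direction is given as angles (degrees): heading rotates in
            // the X/Y plane exactly like the 2D formulas, pitch tilts toward Z.
            public static Vector3 EndPoint(Vector3 start, float headingDeg, float pitchDeg, float distance)
            {
                double h = Math.PI * headingDeg / 180.0;
                double p = Math.PI * pitchDeg / 180.0;
                var dir = new Vector3(
                    (float)(Math.Sin(h) * Math.Cos(p)),
                    (float)(Math.Cos(h) * Math.Cos(p)),
                    (float)Math.Sin(p));
                return start + distance * dir;
            }
        }

    With pitchDeg = 0 the direction vector is (sin h, cos h, 0), matching the 2D case; a quaternion form would instead rotate a unit "forward" vector and feed it to the first overload.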

  • Is it possible to install a discrete mobility card on a laptop without a graphics card already?

    - by JXPheonix
    To be specific, can I:

        1. Install a discrete graphics card in a laptop that does not come with one?
        2. Teach Windows 8 to use the discrete card over the integrated graphics?

    To me this is the only downside of my laptop, and I would not mind getting any graphics card as long as it is better than the integrated one (an Intel 3000). The laptop, if you need more information, is an Asus U46E-BAL6.

  • Edubuntu boots in low graphics mode with an Intel HD Graphics system

    - by user63957
    I have an Intel HD graphics card in my laptop. It was working fine for the first few days with the new version of Edubuntu. Now when it starts, just before it gets to the part asking for the login password [I think the OP means lightdm], it sends me to a low graphics mode. Things I've tried: Ctrl+Alt+F1; updating, and installing fglrx from the terminal. All my work is stored there. Please, if anyone knows how to fix this, tell me. Thanks.

  • Purple line on left side of screen when I use graphics card's HDMI port

    - by fab
    My graphics card is an NVIDIA GeForce GTX 660 2GB. When I plug HDMI into the motherboard it works fine. When I plug it into the graphics card (with a second monitor too), it shows a purple vertical line on the left side. It adds 2 pixels to the width and I can't adjust it with my monitor. It doesn't come up when I print screen. I tried changing the driver to the binary one (at the top), but that made it not show up at all. What do I do? Are some graphics cards just not compatible?

  • Ubuntu 11.10 doesn't detect Intel integrated graphics (i7-2670QM CPU)

    - by Telmo Marques
    The laptop I'm using is an MSI GT683DX-847PT, which comes with an NVIDIA GeForce GTX570M discrete GPU and an Intel Core i7-2670QM CPU. According to Intel's description of the Core i7-2670QM, it has an HD Graphics 3000 integrated GPU. The problem is that the Intel integrated GPU doesn't show up in either lspci or lshw; only the NVIDIA GPU appears. Here is the output of both commands:

        sudo lspci: http://pastebin.com/raw.php?i=9AZg8bJy
        sudo lshw:  http://pastebin.com/raw.php?i=6cAMFQsY

    I was counting on having two GPUs, to run CUDA programs on the discrete NVIDIA GPU while X was handled by the integrated Intel GPU, to prevent kernel execution timeouts. Why doesn't the Intel HD Graphics 3000 GPU show up? Are there any tests I could run to verify the presence of an integrated GPU?

  • OpenGL behaviour depending on the graphics card?

    - by Dan
    This is something that has never happened to me before. I have OpenGL code that uses GLSL shaders to texture a 3D model. The code involves a lot of GPU texture processing, blending, etc. I wanted to check how the performance of my code improves using a faster graphics card (both the new and old cards are NVIDIA, always using the NVIDIA development drivers). But now I have found that once I run the code using the new graphics card, it behaves completely differently (the final render looks wrong), probably because some blending effect is not performed correctly. I haven't really looked into what has changed, but I am guessing that some OpenGL states are set differently by default. Is this possible? Have you ever found different OpenGL/GLSL behaviour using different graphics cards? Any "fast" solution? (So far I've thought of plugging the old one back in, pushing all OpenGL default states, and comparing with the ones I initially get using the new card.)
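
    For the comparison idea in the last sentence, a small sketch of dumping a few commonly divergent states, assuming a C#/OpenTK binding for concreteness; diffing this output from both cards may locate the blending difference (the list of capabilities is illustrative, not exhaustive):

        using System;
        using OpenTK.Graphics.OpenGL;

        static class GlStateDump
        {
            // Dump a handful of states that often differ between drivers;
            // run once on each card, right before the suspect draw call.
            public static void Dump()
            {
                var caps = new[]
                {
                    EnableCap.Blend, EnableCap.DepthTest, EnableCap.CullFace,
                    EnableCap.Dither, EnableCap.Multisample, EnableCap.ScissorTest,
                };
                foreach (var cap in caps)
                    Console.WriteLine($"{cap}: {GL.IsEnabled(cap)}");

                // Blend function is a frequent culprit for "wrong blending".
                Console.WriteLine($"BlendSrc: {GL.GetInteger(GetPName.BlendSrc)}");
                Console.WriteLine($"BlendDst: {GL.GetInteger(GetPName.BlendDst)}");
            }
        }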

  • Which graphics library should I be using?

    - by DaveDev
    I have been developing and maintaining a WPF application, for which I've recently been tasked with adding a 3D representation of some of the data. I'm new to graphics programming in every kind of way, so I'm curious whether I should stick with the 3D graphics capabilities built into WPF or investigate other solutions, like OpenTK or SharpGL. My objective is to represent the data so that it will eventually appear similar to a graph of nodes connected by lines. I need to rotate the image around each axis, and each node will be a 3D model of the device it represents. So far I've been able to experiment with the tutorial outlined here: Windows Presentation Foundation (WPF) 3D Tutorial. It was helpful as an introduction, but I can see that there are other ways to implement 3D graphics solutions, and I wonder if they are more suitable for my needs, or should I stick with the built-in WPF solution? What are the pros and cons of each?
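
    For scale, a minimal sketch of what the built-in WPF route can look like in code-behind, assuming System.Windows.Media.Media3D; it puts one triangle (a stand-in for a node model) into a Viewport3D with a camera and a light:

        using System.Windows.Controls;
        using System.Windows.Media;
        using System.Windows.Media.Media3D;

        static class Wpf3dSketch
        {
            public static Viewport3D BuildViewport()
            {
                // One triangle as a placeholder for a node's 3D model.
                var mesh = new MeshGeometry3D();
                mesh.Positions.Add(new Point3D(0, 0, 0));
                mesh.Positions.Add(new Point3D(1, 0, 0));
                mesh.Positions.Add(new Point3D(0, 1, 0));
                mesh.TriangleIndices = new Int32Collection { 0, 1, 2 };

                var group = new Model3DGroup();
                group.Children.Add(new GeometryModel3D(mesh, new DiffuseMaterial(Brushes.SteelBlue)));
                group.Children.Add(new DirectionalLight(Colors.White, new Vector3D(0, 0, -1)));

                return new Viewport3D
                {
                    Camera = new PerspectiveCamera(new Point3D(0, 0, 5),   // position
                                                   new Vector3D(0, 0, -1), // look direction
                                                   new Vector3D(0, 1, 0),  // up
                                                   45),                    // field of view
                    Children = { new ModelVisual3D { Content = group } },
                };
            }
        }

    Rotation around each axis can then hang off group.Transform (e.g. a RotateTransform3D wrapping an AxisAngleRotation3D); OpenTK or SharpGL would trade this retained object model for immediate OpenGL calls.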

  • Object-oriented wrapper around a DLL

    - by Tom Davies
    So, I'm writing a C# managed wrapper around a native DLL. The DLL contains several hundred functions. In most cases, the first argument to each function is an opaque handle to a type internal to the DLL. So an obvious starting point for defining classes in the wrapper would be to define a class corresponding to each of these opaque types, with each instance holding and managing the opaque handle (passed to its constructor). Things are a little awkward when dealing with callbacks from the DLL. Naturally, the callback handlers in my wrapper have to be static, but the callback arguments invariably contain an opaque handle. In order to get from the static callback back to an object instance, I've created a static dictionary in each class, associating handles with class instances. In the constructor of each class an entry is put into the dictionary, and this entry is then removed in the destructor. When I receive a callback, I can consult the dictionary to retrieve the class instance corresponding to the opaque handle. Are there any obvious flaws in this? Something that seems to be a problem is that the existence of the static dictionary means that the garbage collector will not act on my class instances that are otherwise unreachable. As they are never garbage collected, they never get removed from the dictionary, so the dictionary grows. It seems I might have to manually dispose of my objects, which is something I would absolutely like to avoid. Can anyone suggest a good design that allows me to avoid having to do this?
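
    One common shape for the registry that avoids exactly this pinning problem, sketched with hypothetical names and IntPtr handles: store weak references, so the dictionary can resolve callbacks without keeping otherwise-unreachable wrappers alive:

        using System;
        using System.Collections.Generic;

        // Hypothetical base class for wrappers around opaque native handles.
        abstract class NativeWrapper
        {
            // Weak references: the registry can find a live wrapper for a
            // handle, but does not itself keep the wrapper alive.
            static readonly Dictionary<IntPtr, WeakReference<NativeWrapper>> registry
                = new Dictionary<IntPtr, WeakReference<NativeWrapper>>();

            protected NativeWrapper(IntPtr handle)
            {
                Handle = handle;
                lock (registry) registry[handle] = new WeakReference<NativeWrapper>(this);
            }

            public IntPtr Handle { get; }

            // Called from the static native callback to recover the instance.
            protected static NativeWrapper FromHandle(IntPtr handle)
            {
                lock (registry)
                {
                    if (registry.TryGetValue(handle, out var weak) &&
                        weak.TryGetTarget(out var wrapper))
                        return wrapper;
                    registry.Remove(handle); // stale entry: wrapper was collected
                    return null;
                }
            }
        }

    The flip side is that something must still keep the wrapper, and any delegate passed to native code, alive for as long as callbacks can arrive, e.g. a GCHandle released when the native handle is destroyed.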

  • How do we keep dependent data structures up to date?

    - by Geo
    Suppose you have a parse tree, an abstract syntax tree, and a control flow graph, each one logically derived from the one before. In principle it is easy to construct each graph given the parse tree, but how can we manage the complexity of updating the graphs when the parse tree is modified? We know exactly how the tree has been modified, but how can the change be propagated to the other trees in a way that doesn't become difficult to manage? Naturally the dependent graph can be updated by simply reconstructing it from scratch every time the first graph changes, but then there would be no way of knowing the details of the changes in the dependent graph.

    I currently have four ways to attempt to solve this problem, but each one has difficulties:

    1. Nodes of the dependent tree each observe the relevant nodes of the original tree, updating themselves and the observer lists of original tree nodes as necessary. The conceptual complexity of this can become daunting.

    2. Each node of the original tree has a list of the dependent tree nodes that specifically depend upon it, and when the node changes it sets a flag on the dependent nodes to mark them as dirty, including the parents of the dependent nodes all the way down to the root. After each change we run an algorithm that is much like the algorithm for constructing the dependent graph from scratch, but it skips over any clean node and reconstructs each dirty node, keeping track of whether the reconstructed node is actually different from the dirty node. This can also get tricky.

    3. We can represent the logical connection between the original graph and the dependent graph as a data structure, like a list of constraints, perhaps designed using a declarative language. When the original graph changes we need only scan the list to discover which constraints are violated and how the dependent tree needs to change to correct the violation, all encoded as data.

    4. We can reconstruct the dependent graph from scratch as though there were no existing dependent graph, and then compare the existing graph and the new graph to discover how it has changed. I'm sure this is the easiest way because I know there are algorithms available for detecting differences, but they are all quite computationally expensive and in principle it seems unnecessary, so I'm deliberately avoiding this option.

    What is the right way to deal with these sorts of problems? Surely there must be a design pattern that makes this whole thing almost easy. It would be nice to have a good solution for every problem of this general description. Does this class of problem have a name?
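
    A minimal sketch of option 2 (dirty flags), with hypothetical node types; the reconstruction step itself is elided since it depends entirely on the graphs involved:

        using System.Collections.Generic;

        // Hypothetical node in the original (source) tree.
        class SourceNode
        {
            public List<DependentNode> Dependents { get; } = new List<DependentNode>();

            // Call whenever this node is modified.
            public void NotifyChanged()
            {
                foreach (var dep in Dependents)
                    dep.MarkDirty();
            }
        }

        // Hypothetical node in a derived structure (AST, CFG, ...).
        class DependentNode
        {
            public DependentNode Parent;
            public List<DependentNode> Children { get; } = new List<DependentNode>();
            public bool Dirty { get; private set; }

            // Mark this node and every ancestor dirty, stopping early once an
            // already-dirty ancestor is found, so the rebuild pass can reach
            // the damaged region without scanning clean subtrees.
            public void MarkDirty()
            {
                for (var n = this; n != null && !n.Dirty; n = n.Parent)
                    n.Dirty = true;
            }

            // Rebuild pass: a dirty child always has dirty ancestors, so
            // descending only into dirty nodes is safe.
            public void RebuildIfDirty()
            {
                if (!Dirty) return;
                foreach (var child in Children)
                    child.RebuildIfDirty();
                // ... reconstruct this node from its sources, compare with the
                // old version, and record whether it actually changed ...
                Dirty = false;
            }
        }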

  • HP Pavilion dv5 boots with low brightness and graphics card not recognized

    - by cesar
    My problem is with Ubuntu 11.10 on my HP Pavilion dv5 notebook with Intel(R) HD graphics. When I start Ubuntu my screen has no brightness. I can increase it with my control buttons, but when I restart the brightness is gone again. Ubuntu also doesn't recognize my graphics card (Intel(R) HD graphics). Please, I need your help, because I like Ubuntu and would like to keep it on my laptop (HP dv5 2045la, 3GB RAM, 500GB disk). PS: I installed the MESA repository and now it recognizes my card, but the brightness problem remains.

  • Can't boot Ubuntu 12.10 graphics problem

    - by Frantumn
    I can't boot since installing Ubuntu 12.10. When I try to run Ubuntu, my computer never gets to the Ubuntu screen with the loading dots. I tried to run in recovery mode with safe graphics (failsafeX). When I do this, a message pops up saying "the system is running in low-graphics mode". If I click OK, I am asked what I would like to do and am given four options. I tried running low graphics for one session, and then a message appears with a progress bar saying to stand by for one minute while the display restarts. The progress bar never moves, and if I click OK the whole process just restarts. I don't know what to do from here; I can't get into the OS. I'm not sure whether the problem is related to compatibility with my laptop monitor or my graphics card (an NVIDIA 360M). I had to install using a safe graphics mode. To learn about how I installed, see this link, which also has information on my computer hardware: Can't install Ubuntu since 10.10

    UPDATE: I was able to get into a desktop environment by installing nvidia-current; however, it is messy. I have a screen and I am able to see my desktop, but there is no Unity bar and none of the keyboard controls work. I can right-click and create a folder on the desktop, and then see inside that folder in a traditional browser window. There is still no top menu or Unity bar. When I boot normally I don't get into the desktop environment, and I get this message in tty: "GPU lockup switching to software FBCON".

    UPDATE: I've played around with tips from the comments. I've been able to consistently get into a safe-mode desktop environment using the xorg and nouveau drivers. I've tried switching between the 5 different options in the Additional Drivers tab in Software Sources. The NVIDIA (proprietary, tested) driver gets beyond the GPU lockup on a normal boot and actually gets into a desktop. The issue is that there is then no Unity bar or top menu bar, and the resolution is very low. I've tried switching to the (proprietary, tested) driver and then reinstalling Unity and ubuntu-desktop, but that didn't work either.

  • Hybrid graphics functionality won't work with my Asus UL30V anymore

    - by futuress
    The problem is that I am no longer able to boot in compatibility mode just to turn on my NVIDIA graphics and install the driver, because no login screen appears while Ubuntu is loading. In Ubuntu 11.10 I was able to activate the 'nvidia graphics only' option this way:

        1) Change the BIOS to 'compatibility mode', which turns off the Intel card.
        2) Install the NVIDIA proprietary driver using Ubuntu's driver finder (Additional Drivers), then reboot.

    I was not interested in using only the Intel graphics, for the sake of battery life. Now I have both cards running and they drain my battery dramatically, and the main problem with this configuration is that no OpenGL is available, so I can't play any games any more. At this point I have a stopgap: I uninstalled the NVIDIA drivers and installed Bumblebee, and now the Intel card is recognized. I would prefer to run just the NVIDIA card, as in Ubuntu 11.10, but for now this is better than nothing. Does anybody else have the same problem?

  • Which Graphics/Geometry abstraction to choose?

    - by Robz
    I've been thinking about the design for a browser app on the HTML5 canvas that simulates a 2D robot zooming around, sensing the world around it. I decided to do this from scratch just for fun. I need shapes, like polygons, circles, and lines in order to model the robot and the world it lives in. These shapes need to be drawn with different appearance attributes, like border/fill style/width/color. I also need to have geometry functions to detect intersections and containment for the robot's sensors and so that the robot doesn't go inside stuff. One idea for functions is to have two totally separate libraries, one to implement graphics (like drawShape(context, shape)) and one for geometry operations (like shapeIntersectsShape(shape1, shape2)). Or, in a more object-oriented approach, the shape objects themselves could implement methods to do their own graphics (shape.draw(context)) and geometry operations (shape1.intersects(shape2)). Then there is the data itself: whether the data to draw a shape and the data to do geometric operations on that shape should be encapsulated within the same object, or be separate structures (where one would contain the other, or both be contained inside another structure). How do existing applications that do graphics/geometry stuff deal with this? Is there one model that is best, or is each good for certain applications? Should the fact that I'm using Javascript instead of a more classical language change how I approach the design?
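
    A sketch of the object-oriented split, in C# for concreteness rather than the asker's JavaScript: geometry operations live on the shape, while drawing stays a separate concern that only reads the shape's data, so sensor code never depends on rendering:

        using System;

        // Geometry data and operations together on the shape...
        interface IShape
        {
            bool Intersects(IShape other);
            bool Contains(double x, double y);
        }

        sealed class Circle : IShape
        {
            public double X, Y, Radius;

            public bool Contains(double x, double y)
            {
                double dx = x - X, dy = y - Y;
                return dx * dx + dy * dy <= Radius * Radius;
            }

            public bool Intersects(IShape other)
            {
                // Double dispatch or type checks go here; circles only, for brevity.
                if (other is Circle c)
                {
                    double dx = c.X - X, dy = c.Y - Y, r = Radius + c.Radius;
                    return dx * dx + dy * dy <= r * r;
                }
                throw new NotImplementedException();
            }
        }

        // ...while appearance attributes and drawing live apart from geometry.
        sealed class ShapeRenderer
        {
            public void Draw(Circle shape, string fillStyle /* , canvas context, ... */)
            {
                // rendering calls would go here
            }
        }

    The same split works in JavaScript; the essential decision is that the geometry type carries no appearance state, so either a free-function library or shape methods can consume it.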

  • Ubuntu 13.10, kernel 3.11 blank screen issue with hybrid graphics

    - by Lagerbaer
    On my HP Envy, which has both an Intel on-chip graphics card and an NVIDIA GeForce:

        *-display UNCLAIMED
             description: 3D controller
             product: GK208M [GeForce GT 740M]
             vendor: NVIDIA Corporation
             physical id: 0
             bus info: pci@0000:01:00.0
             version: a1
             width: 64 bits
             clock: 33MHz
             capabilities: cap_list
             configuration: latency=0
             resources: memory:d2000000-d2ffffff memory:a0000000-afffffff memory:b0000000-b1ffffff ioport:5000(size=128) memory:b2000000-b207ffff
        *-display
             description: VGA compatible controller
             product: 4th Gen Core Processor Integrated Graphics Controller
             vendor: Intel Corporation
             physical id: 2
             bus info: pci@0000:00:02.0
             version: 06
             width: 64 bits
             clock: 33MHz
             capabilities: vga_controller bus_master cap_list rom
             configuration: driver=i915 latency=0
             resources: irq:46 memory:d3000000-d33fffff memory:c0000000-cfffffff ioport:6000(size=64)

    I have trouble with all newer kernels. I basically had to install 12.04 LTS and use its 3.5 kernel family to get the system to boot. The 3.8 from 12.10 or the newest 3.11 from Ubuntu 13.10 leave me with a black screen upon boot. On one occasion I did hear the "log in" sound, but the screen did not display anything. I have purged all NVIDIA drivers, so I guess it should just use the Intel drivers, but apparently this is all messed up with newer kernel versions. This is different from the other "nvidia boots into blank screen" bug in that I don't rely solely on an NVIDIA card. Surely the Intel on-chip card should be supported and leave me with something other than a blank screen? Again, it only works with kernel version 3.5.0-41-generic, not with the 3.11.0-12 one that ships with Ubuntu 13.10.

    When I go into the GRUB menu and change the boot options from 'quiet splash' to 'nomodeset', I am able to boot the system, but then I don't get any graphics, and trying 'sudo service lightdm start' doesn't succeed (I get 100% CPU for apport, but this doesn't do anything either, so I kill it). Help, I'm all out of ideas.

    EDIT: Let me add that I'm using the EFI boot system and have a dual-boot installation with Windows 8.

  • Using onboard and PCI-E graphics cards at the same time

    - by Endle
    Hello wonderful people. I know there are several other posts with similar questions, and I also know how to use Google. I have read up on posts discussing Bumblebee, CrossFire, ATI Catalyst and many other interesting topics. I would just like someone to give me advice on how to use the onboard and PCI-E graphics at the same time. I know the computer is capable of doing this: it works in Windows. I can use the VGA and DVI onboard ports and the HDMI port of the add-on card all at the same time. Works great in Windows 7. In Ubuntu, it seems only one or the other will work. I can use any combination of two displays on either adapter: VGA and HDMI, HDMI and DVI, and so forth. I have started experimenting with xorg.conf files, but have not been able to get any of them to work. Here is my last attempt at writing an xorg.conf file:

        Section "ServerLayout"
            Identifier     "X.org Configured"
            Screen      0  "Screen0" 0 0
            Screen      1  "Screen1" LeftOf "Screen0"
            Screen      2  "Screen2" LeftOf "Screen1"
            InputDevice    "Mouse0" "CorePointer"
            InputDevice    "Keyboard0" "CoreKeyboard"
        EndSection

        Section "Device"
            Identifier "Onboard Video"
            Driver     "radeon"
            BusID      "PCI:01:05.0"
        EndSection

        Section "Device"
            Identifier "Graphics Card"
            Driver     "radeon"
            BusID      "PCI:02:00.0"
        EndSection

        Section "Monitor"
            Identifier "CRT2"
            Option     "VendorName" "ViewSonic"
            Option     "ModelName" "Generic Autodetecting Monitor"
            Option     "DPMS" "true"
        EndSection

        Section "Monitor"
            Identifier "DVI1"
            VendorName "ACR"
            ModelName  "P224W"
            Option     "DPMS"
        EndSection

        Section "Monitor"
            Identifier "DVI2"
            Option     "VendorName" "Acer"
            Option     "ModelName" "Generic Autodetecting Monitor"
            Option     "DPMS" "true"
        EndSection

        Section "Screen"
            Identifier   "Screen0"
            Device       "Onboard Video"
            Monitor      "CRT2"
            DefaultDepth 24
            SubSection "Display"
                Depth  24
                Modes  "1280x1024"
            EndSubSection
        EndSection

        Section "Screen"
            Identifier   "Screen1"
            Device       "Graphics Card"
            Monitor      "DVI1"
            DefaultDepth 24
            SubSection "Display"
                Depth  24
                Modes  "1920x1080"
            EndSubSection

  • Less graphics power all of a sudden (Intel HD 3000)

    - by queueoverflow
    I have an Intel Sandy Bridge i5 with HD 3000 graphics. I used to be able to play Urban Terror and Nexuiz comfortably at 85 and 60 frames per second until mid/end of October 2012, the former even on a full-HD display. Now I get around 30 to 45 FPS on the smaller laptop screen and around 20 to 30 on the external monitor. Did something happen to Kubuntu 12.04 so that it has less graphics performance than previously?

    Update: I looked at the system monitor and could not detect anything being at its maximum. The four CPU cores were pretty much bored, and the 8 GB of RAM were filled with maybe 2 GB. I also ran intel_cpu_top and did not notice anything at its limit. See the output.

    Update, after kernel bisecting: I did a kernel bisect and tried 3.2.0-23, 3.2.0-27, 3.2.0-29 and 3.2.0-30, and all had full graphics power. Interestingly, I then had full power when I just booted back into the regular 3.2.0-32 kernel. This does not make sense to me …

  • Can't get my graphics driver (GMA 3150) to work

    - by bracus
    I've been searching like crazy trying to find a fix for this; it's the only thing that's not completely working in my setup. I see posts where people say it should be working, but it just isn't. I have a Gateway LT2802u netbook and I installed 11.10 on it 2 days ago. Everything works except for accelerated graphics. At first I couldn't watch a simple Flash video at all, but somehow I got that to work. The last problems I have: I can't watch HD videos, my screen resolution won't go higher than 1024x600, and under my graphics driver it says "Unknown". After doing as much research as possible, I've come to the conclusion that the problem is the GMA 3150 graphics driver. There is a bunch of talk about it all over the interwebs, but nothing lately. I've tried the fixes that some people have used, but when I try to get the package it's often no longer there or available, if that makes sense. I'm loving everything Ubuntu has to offer, but it'll really bite if I can't use it any more because of this problem. Does anybody have any ideas? You'd really be helping a lot.

  • How to avoid having very large objects with Domain Driven Design

    - by Pablojim
    We are following Domain-Driven Design for the implementation of a large website. However, by putting the behaviour on the domain objects we are ending up with some very large classes. For example, on our WebsiteUser object we have many, many methods, e.g. dealing with passwords, order history, refunds, customer segmentation. All of these methods are directly related to the user. Many of them delegate internally to other child objects, but this still results in some very large classes. I'm keen to avoid exposing lots of child objects, e.g. user.getOrderHistory().getLatestOrder(). What other strategies can be used to avoid this problem?
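
    One commonly suggested shape, sketched here with hypothetical names: keep WebsiteUser a thin aggregate root whose public methods delegate to internal components, so child objects never leak into calling code:

        using System;

        // Hypothetical internal component: owned by the aggregate, not exposed.
        class OrderHistory
        {
            public Order LatestOrder() { /* look up most recent order */ return null; }
            public void RecordRefund(Order order, decimal amount) { /* ... */ }
        }

        class Order { }

        // The aggregate root keeps a narrow public surface and delegates
        // internally, so callers write user.LatestOrder() rather than
        // user.getOrderHistory().getLatestOrder().
        class WebsiteUser
        {
            private readonly OrderHistory orderHistory = new OrderHistory();

            public Order LatestOrder() => orderHistory.LatestOrder();

            public void Refund(Order order, decimal amount) =>
                orderHistory.RecordRefund(order, amount);

            // Password handling could likewise move behind a value object
            // (e.g. a PasswordHash type) instead of living as loose methods.
        }

    Concerns that span users (customer segmentation, say) are often pulled out into domain services instead, which shrinks the aggregate further without exposing its internals.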

  • Subclassing to avoid line length

    - by Super User
    The standard line length for code is 80 characters per line. This is accepted and followed by most programmers. I am working on a state machine for a character, and it is necessary for me to follow this too. I have four classes that pass this limit. I could split each class in two and thereby avoid the line-length limit:

        class Stand
        class Walk
        class Punch
        class Crouch

    The new classes would be StandLeft, StandRight and so on; Stand, Walk, Punch and Crouch would then be abstract classes. The question is whether there is a limit to the depth of the hierarchy tree, or whether this depends on the case.
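
    For illustration only, a hypothetical sketch of the two options being weighed, since the original code isn't shown: the proposed direction subclasses versus keeping the direction as data, which avoids doubling the class count for every new state:

        // Proposed: one subclass per facing direction.
        abstract class Stand { /* shared standing behaviour */ }
        sealed class StandLeft : Stand { }
        sealed class StandRight : Stand { }

        // Alternative: keep the four state classes and parameterize by
        // direction, so the hierarchy stays one level deep.
        enum Facing { Left, Right }

        abstract class CharacterState
        {
            protected CharacterState(Facing facing) { Facing = facing; }
            public Facing Facing { get; }
        }

        sealed class Walk : CharacterState
        {
            public Walk(Facing facing) : base(facing) { }
        }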

  • Why should ViewModel route actions to Controller when using the MVCVM pattern?

    - by Lea Hayes
    When reading examples across the Internet (including the MSDN reference) I have found that code examples are all doing the following type of thing:

        public class FooViewModel : BaseViewModel
        {
            public FooViewModel(FooController controller)
            {
                Controller = controller;
            }

            protected FooController Controller { get; private set; }

            public void PerformSuperAction()
            {
                // This just routes action to controller...
                Controller.SuperAction();
            }

            ...
        }

    and then for the view:

        public class FooView : BaseView
        {
            ...

            private void OnSuperButtonClicked()
            {
                ViewModel.PerformSuperAction();
            }
        }

    Why do we not just do the following?

        public class FooView : BaseView
        {
            ...

            private void OnSuperButtonClicked()
            {
                ViewModel.Controller.SuperAction();
                // or, even just use a shortcut property:
                Controller.SuperAction();
            }
        }

  • How do I handle priority and propagation in an event system?

    - by Peeter
    Let's say I have a simple event system with the following syntax:

        object = new Object();
        object.bind("my_trigger", function() { print "hello"; });
        object.bind("my_trigger", function() { print "hello2"; });
        object.trigger("my_trigger");

    How could I make sure "hello2" is printed out first? (I do not want my code to depend on the order in which the events are bound.) On top of that, how would I prevent my events from propagating, e.g. stop every other event handler from being executed?
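
    A sketch of one way to get both behaviours, in C# for concreteness (all names are made up): handlers carry an explicit priority, dispatch sorts by it, and any handler can halt propagation by returning false:

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class EventBus
        {
            // Each binding pairs a priority with a handler; the handler
            // returns false to stop propagation to later handlers.
            private readonly Dictionary<string, List<(int Priority, Func<bool> Handler)>> bindings
                = new Dictionary<string, List<(int Priority, Func<bool> Handler)>>();

            // Higher priority runs first; binding order no longer matters.
            public void Bind(string name, int priority, Func<bool> handler)
            {
                if (!bindings.TryGetValue(name, out var list))
                    bindings[name] = list = new List<(int Priority, Func<bool> Handler)>();
                list.Add((priority, handler));
            }

            public void Trigger(string name)
            {
                if (!bindings.TryGetValue(name, out var list)) return;
                foreach (var (_, handler) in list.OrderByDescending(b => b.Priority))
                    if (!handler())  // false = stop propagation
                        break;
            }
        }

        class Program
        {
            static void Main()
            {
                var obj = new EventBus();
                obj.Bind("my_trigger", priority: 0, () => { Console.WriteLine("hello"); return true; });
                obj.Bind("my_trigger", priority: 1, () => { Console.WriteLine("hello2"); return true; });
                obj.Trigger("my_trigger"); // prints hello2, then hello
            }
        }

    Since LINQ's ordering is stable, handlers with equal priority still run in binding order, which gives a predictable fallback.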

  • DDD: service contains two repositories

    - by tikhop
    Is it correct for one service to contain two repositories, and would that be an application service or a domain service? Suppose I have a Passenger object that should contain a Passport (government ID) object. I get Passengers from a PassengerRepository. PassengerRepository creates a request to the server and obtains the data (JSON), then parses the received data and stores it inside the repository. I am confused because I want to store Passport as an entity and put it in a PassportRepository, but all the information about the passport is contained in the JSON I received above. I guess that I should create a PassengerService that includes the PassengerRepository and the PassportRepository, with several methods like removePassport, addPassport, getAllPassengers and so on.

    UPDATE: So I guess that the better way is to represent Passport as a VO and store all passports inside the Passenger aggregate. However, there is another question: where should I put the methods (the ones that call the server API) for managing a passenger's passports? I think the better place is within the Passenger aggregate.
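
    A sketch of the design described in the update, with hypothetical names: Passport as an immutable value object inside the Passenger aggregate, and a single application service that reaches the server through one repository:

        using System.Collections.Generic;

        // Value object: no identity of its own, immutable, compared by value.
        sealed record Passport(string Number, string Country);

        // Aggregate root: owns its passports; nothing else edits them directly.
        class Passenger
        {
            private readonly List<Passport> passports = new();
            public IReadOnlyList<Passport> Passports => passports;

            public void AddPassport(Passport passport) => passports.Add(passport);
            public void RemovePassport(Passport passport) => passports.Remove(passport);
        }

        // One repository suffices once Passport lives inside the aggregate;
        // the server/JSON traffic stays hidden behind it.
        interface IPassengerRepository
        {
            Passenger Get(string passengerId);
            void Save(Passenger passenger);
        }

        // Application service: orchestrates load, domain change, save.
        class PassengerService
        {
            private readonly IPassengerRepository repository;
            public PassengerService(IPassengerRepository repository) => this.repository = repository;

            public void AddPassport(string passengerId, Passport passport)
            {
                var passenger = repository.Get(passengerId);
                passenger.AddPassport(passport);
                repository.Save(passenger);
            }
        }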

  • Windows 8, NVIDIA graphics recognition fails

    - by Roy Grubb
    I just installed Windows 8 Pro OEM 64-bit (clean install) and it won't properly recognize my graphics adapter. When I installed Win8, it automatically installed the BasicDisplay.sys driver dated 6/21/2006, version 6.2.9200.16384 (win8_rtm.120725-1247).

        Mobo:     MSI G41M-P33 Combo
        CPU:      Intel Core Duo 6600
        Graphics: NVIDIA GeForce 9400 GT
        OS:       Windows 8 Pro 64-bit OEM

    The graphics adapter worked fine in Windows XP. The PC is a generic box, bought locally, and its mobo failed recently, so I replaced it with the G41M. Microsoft wouldn't let me re-activate Windows XP with a different mobo, so I installed Win8, which appears to work except as described next.

    Win8 only partially recognizes the graphics adapter and won't allow NVIDIA's latest driver installer to see that it's an NVIDIA card. As a result, OpenGL doesn't work, and this is needed by the software I most use. Other than that the graphics look OK. When I say 'partially recognizes', I mean that via the Control Panel I can see the adapter described as NVIDIA, but the driver remains stuck at Microsoft Basic Display Adapter no matter what I try, including "Update driver..." in the adapter properties.

    Display > Screen Resolution > Advanced Settings > Adapter shows:

        Adapter Type: Microsoft Basic Display Adapter
        Chip Type: NVIDIA
        DAC Type: NVIDIA Corporation
        BIOS Information: G27 Board - p381n17 (don't know what this means ... no mention of the 9400 GT)
        Total Available Graphics Memory: 256 MB
        Dedicated Video Memory: 0 MB (in fact the adapter has 512 MB of on-board video memory)
        System Video Memory: 0 MB
        Shared System Memory: 256 MB

    And Control Panel > Device Manager > Display adapters just shows Microsoft Basic Display Adapter: no other graphics adapter, and no unknown device or yellow question mark.

    What I have tried so far:

    1. Cleared CMOS and reset. Updated the BIOS and all mobo drivers as follows: first I used Driver Reviver to see if any driver updates were required. It found some, but I didn't use it to get the drivers. Then I switched to MSI's own mobo driver utility, Live Update 5. This also showed the board needed several updates, so I used it to fetch the new drivers. After that it showed everything was up to date, and I checked with Driver Reviver again, which also reported no drivers needed updating. Rebooted.

    2. Went to the NVIDIA site to get the latest graphics adapter driver. Their auto-detect ("Option 2: Automatically find drivers for my NVIDIA products") said: "The NVIDIA Smart Scan was unable to evaluate your system hardware. Please use Option 1 to manually find drivers for your NVIDIA products." So I downloaded 310.70-desktop-win8-win7-winvista-64bit-international-whql.exe, which lists the 9400 GT under supported products, but when I run it, it says: "NVIDIA Installer cannot continue. This graphics driver could not find compatible graphics hardware."

    3. Connected the display to the on-board Intel graphics (G41 Intel Express), removed the NVIDIA card, rebooted, and changed to internal graphics in CMOS. Again it installs the MS Basic Display Adapter and can't properly run my software that needs OpenGL (which runs on other machines with Intel Express graphics, on WinXP and 7).

    4. Shut down and pulled out the power cord. Held the start button to discharge all capacitors. Removed and re-inserted the NVIDIA adapter in the PCI-E slot and made sure it was properly seated. Connected the monitor to the card, screwed plug to socket. Reconnected the power cord. Started and checked in the BIOS that the Primary Graphics Adapter was set to PCI-E. Started Windows.

    5. Uninstalled the MS Basic Display Adapter in Device Manager. The screen blanks briefly, then reappears. No graphics adapter entry was then visible in Device Manager. Restarted the PC: the MS Basic Display Adapter is visible again in Device Manager.

    6. Clicked View > Show hidden devices in Device Manager. No other graphics adapter appears, no unknown devices. Rebooted.

    7. Tried Scan for hardware changes. None detected.

    8. Tried right-click on MS Basic Display Adapter > Properties > Driver > Update Driver... > Search automatically. It replied that it had determined the driver was up to date.

    9. Checked that there were no graphics-driver-related entries in Programs and Features that I could delete (there were none). Searched for any other drivers with nvidia in their name and deleted them, keeping just the 306.97 installer exe file. Did a Windows Update.

    10. Ran GPU-Z, which shows (main items):

        Microsoft Basic Display Adapter
        GPU: G72
        BIOS: 5.72.22.76.88
        Device ID: 10DE - 01D5
        Memory: DDR2, Bus Width: 32 Bit, Memory Size: 64 MB
        Driver Version: nvlddmkm 6.2.9200.16384 (ForceWare 0.00) / Win8 64
        NVIDIA SLI: Unknown

    In the drop-down at the foot of GPU-Z, "Microsoft Basic Display Adapter" is the only option.

    If I swap hard disks in that machine to one with an Ubuntu 10.4 installation (originally installed on the same PC), lspci shows the VGA compatible controller as "NVIDIA Corporation Device 01d5 (rev a1) (prog-if 00 [VGA controller])" with "kernel driver in use: nvidia".

    I'm out of ideas for new things to try and would be really grateful for suggestions. Thanks!
