Search Results

Search found 3661 results on 147 pages for 'overload resolution'.

Page 21/147 | < Previous Page | 17 18 19 20 21 22 23 24 25 26 27 28  | Next Page >

  • Providing unique layout for specific resolution in Android 2.3.3

    - by user1384991
    I need to use specific XML layouts based on screen resolution, not size. So the first design is used for resolutions below 480x800, and the second for resolutions of 480x800 and above. How is it done? Update. A possible solution:

      Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
      final int height = display.getHeight();
      final int width = display.getWidth();
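
    The runtime selection the update is pointing towards could be completed along these lines. This is only a sketch: the layout resource names (R.layout.main_small, R.layout.main_large) and the pixel-count cut-off are assumptions, not part of the original post, and Display.getWidth()/getHeight() are used because they are what API 10 (2.3.3) offers.

      import android.app.Activity;
      import android.content.Context;
      import android.os.Bundle;
      import android.view.Display;
      import android.view.WindowManager;

      public class MainActivity extends Activity {
          @Override
          protected void onCreate(Bundle savedInstanceState) {
              super.onCreate(savedInstanceState);
              Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE))
                      .getDefaultDisplay();
              final int width = display.getWidth();
              final int height = display.getHeight();
              // Anything reporting fewer pixels than 480x800 gets the "small" design;
              // everything else gets the full design. The threshold is an assumption.
              if (width * height < 480 * 800) {
                  setContentView(R.layout.main_small);
              } else {
                  setContentView(R.layout.main_large);
              }
          }
      }

    The usual alternative, resource qualifiers such as res/layout-large/, selects by size bucket rather than exact resolution, which is what the question is trying to avoid.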

    Read the article

  • How to override part of an overload function in JavaScript

    - by Guan Yuxin
    I create a class with a function like this:

      var Obj = function(){ this.children = []; this.parent = null; } // a base class
      Obj.prototype.index = function(child){
          // the index of the current obj
          if(arguments.length == 0){
              return this.parent ? this.parent.index(this) : 0;
          }
          // the index of a child that matches a specific obj [to be overridden]
          return -1;
      }

    Basically it is just an overloaded function composed of index() and index(child). Then I create a sub class, SubObj or whatever, that inherits from Obj (SubObj.prototype.prototype = Obj;). Now it's time to override the index(child) function; however, index() is also in the function and I don't want to overwrite it too. One solution is to write it like this:

      var Obj = function(){ this.children = []; this.parent = null; } // a base class
      Obj.prototype.index = function(child){
          // the index of the current obj
          if(arguments.length == 0){
              return this.parent ? this.parent.index(this) : 0;
          }
          // the index of a child that matches a specific obj [to be overridden]
          return this._index(child);
      }
      Obj.prototype._index = function(child){ return -1; }
      SubObj.prototype._index = function(child){ /* overriding */ }

    But this will easily mislead other coders, as _index(child) should be both private (it should not be used except by the index() function) and public (it is an overload of index(), which is public). Do you guys have a better idea?
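
    For comparison, the split the question is reaching for is essentially the template-method pattern. A sketch in Java (class and member names here are hypothetical, not taken from the post) makes the intent explicit, because access modifiers can say what the _index naming convention only hints at: the no-argument lookup stays fixed and public, while the per-child lookup is a protected hook that subclasses override.

      import java.util.ArrayList;
      import java.util.List;

      abstract class Node {
          protected Node parent;

          // Public and fixed: the index of this node within its parent.
          public final int index() {
              return parent != null ? parent.indexOf(this) : 0;
          }

          // Overridable hook: the index of a given child inside this node.
          // Protected, so outside callers never see it, but subclasses can replace it.
          protected int indexOf(Node child) {
              return -1;
          }
      }

      class ListNode extends Node {
          private final List<Node> children = new ArrayList<>();

          @Override
          protected int indexOf(Node child) {
              return children.indexOf(child);
          }
      }

    In JavaScript there is no enforced "protected", so the underscore prefix plus a short comment on _index is the conventional way to signal the same contract.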

    Read the article

  • C++ STL 101: Overload function causes build error

    - by Sidjal
    Trivial code that works if I do not overload myfunc:

      void myfunc(int i) { std::cout << "calling myfunc with arg " << i << std::endl; }
      void myfunc(std::string s) { std::cout << "calling myfunc with arg " << s << std::endl; }

      void testalgos() {
          std::vector<int> v;
          v.push_back(1);
          v.push_back(2);
          std::vector<std::string> s;
          s.push_back("one");
          s.push_back("two");
          std::for_each( v.begin(), v.end(), myfunc);
          std::for_each( s.begin(), s.end(), myfunc);
          return;
      }

      int _tmain(int argc, _TCHAR* argv[]) {
          std::cout << "Hello World" << std::endl;
          testalgos();
          return 0;
      }

    The following build errors repeat for both for_each calls:

      error C2914: 'std::for_each' : cannot deduce template argument as function argument is ambiguous
      error C2784: '_Fn1 std::for_each(_InIt,_InIt,_Fn1)' : could not deduce template argument for '_InIt' from 'std::_Vector_iterator<_Ty,_Alloc>'

    It does work if I do not overload myfunc. Can someone explain what is happening here? TIA

    Read the article

  • Ubuntu 12.04.1 Radeon 9550 stuck with 640x480, works in Geexbox

    - by Betty
    I am a completely new user trying to set up Ubuntu on an old desktop. It has an AGP Radeon 9550 graphics card. I am running Ubuntu from a USB drive with persistence, as the PC currently has no hard drive. I seem to be stuck in 640x480 mode: the desktop itself is larger, but the monitor display is stuck on 640x480, and in Settings > Displays only the 640x480 option is available. What I have found out so far:

    - The proprietary ATI drivers no longer support my card.
    - If 3D isn't an issue (it's not), the open source driver should be fine. This should be installed by default, so in theory I am using it already.
    - xserver-xconf/pci/*.ids doesn't show any entries for the card's PCI id.
    - Hardware/Additional Drivers shows no proprietary drivers.

    I tried booting into the current version of Geexbox from a USB stick and this set the resolution correctly by default, so I know it can be done, but I have no idea how. How can I tell what driver the card is using, and how can I get the higher resolutions back?

    Read the article

  • Black screen after grub kernel selection menu

    - by skip
    I have an Acer eMachines e727 with Intel GMA 4500M integrated graphics (drivers updated to latest). I installed Ubuntu 12.04 using Wubi. All is well until I select the kernel (first one on the list): my display goes black. I searched for solutions and found one on the Ubuntu forum which partially helped. Following that sticky post, I pressed "e" at the kernel listing. I changed the $Linux... default line to "quiet splash nomodeset" and was able to get to the login screen and desktop. I edited grub to make the nomodeset permanent (and also removed the vt command as recommended). I followed through with changing grub to match the graphics as recommended in the article using the grub cli (using info from vbeinfo). I updated grub with the recommended settings but still get the black screen after selecting the kernel. Only nomodeset works to get me to the login screen and desktop. Once I get to the desktop, my display resolution shows as 1024x768 but it actually looks like 800x680. What do I need to do to get past these issues? Thanks!

    Read the article

  • Getting an OBB out of another OBB?

    - by Milo
    I'm working on collision resolution for my game. I just need a good way to get an object out of another object if it gets stuck. In this case a car. Here is a typical scenario. The red car is in the green object. How do I correctly get it out so the car can slide along the edge of the object as it should. I tried:

      if(buildings.size() > 0) {
          Entity e = buildings.get(0);
          Vector2D vel = new Vector2D();
          vel.x = vehicle.getVelocity().x;
          vel.y = vehicle.getVelocity().y;
          vel.normalize();
          while(vehicle.getRect().overlaps(e.getRect())) {
              vehicle.setCenter(vehicle.getCenterX() - vel.x * 0.1f, vehicle.getCenterY() - vel.y * 0.1f);
          }
          colided = true;
      }

    But that does not work too well. Is there some sort of vector I could calculate to use as the vector to move the car away from the object? Thanks
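
    The vector being asked for is usually called the minimum translation vector (MTV): the shortest push that separates the two shapes, which also gives the edge to slide along. A hedged sketch for the axis-aligned case follows (type and method names are hypothetical); for true OBBs the same idea applies per SAT axis, keeping the axis with the smallest overlap.

      final class CollisionUtil {
          // Returns {dx, dy}: the smallest translation that moves rectangle A out of rectangle B.
          static float[] minimumTranslation(float aMinX, float aMinY, float aMaxX, float aMaxY,
                                            float bMinX, float bMinY, float bMaxX, float bMaxY) {
              float pushLeft  = aMaxX - bMinX;   // move A this far in -X to clear B
              float pushRight = bMaxX - aMinX;   // move A this far in +X to clear B
              float pushUp    = aMaxY - bMinY;   // move A this far in -Y to clear B
              float pushDown  = bMaxY - aMinY;   // move A this far in +Y to clear B

              float x = (pushLeft < pushRight) ? -pushLeft : pushRight;
              float y = (pushUp < pushDown) ? -pushUp : pushDown;

              // Resolve only along the shallower axis so the car can slide along the other one.
              return (Math.abs(x) < Math.abs(y)) ? new float[] { x, 0f } : new float[] { 0f, y };
          }
      }

    Applying the returned offset to the vehicle's centre once per frame replaces the while-loop nudge, and the non-zero component doubles as the collision normal for the sliding response.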

    Read the article

  • 2D Rectangle Collision Response with Multiple Rectangles

    - by Justin Skiles
    Similar to: Collision rectangle response

    I have a level made up of tiles where the edges of the level are made up of collidable rectangles. The player's collision box is represented by a rectangle as well. The player can move in 8 directions. The player's velocity is equal in X and Y directions and constant. Each update, I am checking the player's collision against all tiles that are a certain distance away. When the player collides with a rectangle, I am finding the intersection depth and resolving along the most shallow axis followed by the other axis. This resolution happens for both axes simultaneously. See below for two examples of situations where I am having trouble.

    Moving up-left against the left wall

    In the scenario below, the player is colliding with two tiles. The tile intersection depth is equal on both axes for the top tile and more shallow in the X axis for the middle tile. Because the player is moving up the wall, the player should slide in an upward direction along the wall. This works properly as long as the rectangle with the more shallow depth is evaluated first. If the equal intersection depth rectangle is evaluated first, there is a chance the player becomes stuck.

    Moving up-left against the top wall

    Here is an identical scenario with the exception that the collision is with the top wall. The same problem occurs at the corners when intersection depth is equal for both axes.

    I guess my overall question is: How can I ensure that collision response occurs on tiles that have non-equal intersection depth before tiles that have equal intersection depth, in order to get around the weirdness that occurs at these corners? Sean's answer in the linked question was good, but his solution required having different velocity components in a certain direction. My situation has equal velocities, so there's no good way to tell which direction to resolve at corners. I hope I have made my explanation clear.
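
    One approach that side-steps the ordering question, sketched below under assumed types (a simple Rect with float x, y, w, h), is to resolve a single tile per pass, always taking the tile with the largest overlap area first, and to recompute the overlaps after every push. The equal-depth corner tiles then tend to stop overlapping once the genuinely blocking tile has been handled, so they never get a chance to cause the snag. This is a sketch of a common heuristic, not the only possible fix.

      import java.util.List;

      final class TileResolver {
          static final class Rect {
              float x, y, w, h;
              Rect(float x, float y, float w, float h) { this.x = x; this.y = y; this.w = w; this.h = h; }
          }

          static void resolve(Rect player, List<Rect> tiles) {
              for (int pass = 0; pass < tiles.size(); pass++) {
                  Rect best = null;
                  float bestArea = 0f;
                  for (Rect t : tiles) {
                      float ox = Math.min(player.x + player.w, t.x + t.w) - Math.max(player.x, t.x);
                      float oy = Math.min(player.y + player.h, t.y + t.h) - Math.max(player.y, t.y);
                      if (ox > 0 && oy > 0 && ox * oy > bestArea) {
                          bestArea = ox * oy;
                          best = t;
                      }
                  }
                  if (best == null) {
                      return; // nothing overlaps any more
                  }
                  float ox = Math.min(player.x + player.w, best.x + best.w) - Math.max(player.x, best.x);
                  float oy = Math.min(player.y + player.h, best.y + best.h) - Math.max(player.y, best.y);
                  // Push out along the shallower axis only; direction follows which side the player sits on.
                  if (ox < oy) {
                      player.x += (player.x < best.x) ? -ox : ox;
                  } else {
                      player.y += (player.y < best.y) ? -oy : oy;
                  }
              }
          }
      }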

    Read the article

  • Confusion with floats converted into ints during collision detection

    - by TheBroodian
    So in designing a 2D platformer, I decided that I should be using a Vector2 to track the world location of my world objects to retain some sub-pixel precision for slow-moving objects and other such subtle nuances, yet representing their bodies with Rectangles, because as far as collision detection and resolution is concerned, I don't need sub-pixel precision. I thought that the following line of thought would work smoothly...

      Vector2 wrldLocation;
      Point WorldLocation;
      Rectangle collisionRectangle;

      public void Update(GameTime gameTime)
      {
          Vector2 moveAmount = velocity * (float)gameTime.ElapsedGameTime.TotalSeconds;
          wrldLocation += moveAmount;
          WorldLocation = new Point((int)wrldLocation.X, (int)wrldLocation.Y);
          collisionRectangle = new Rectangle(WorldLocation.X, WorldLocation.Y, genericWidth, genericHeight);
      }

    ...and I guess in theory it sort of works, until I try to use it in conjunction with my collision detection, which works by using Rectangle.Offset() to project where collisionRectangle would supposedly end up after applying moveAmount to it, and if a collision is found, finding the intersection and subtracting the difference between the two intersecting sides from the given moveAmount, which would theoretically give a corrected moveAmount to apply to the object's world location that would prevent it from passing through walls and such. The issue here is that Rectangle.Offset() only accepts ints, and so I'm not really receiving an accurate adjustment to moveAmount for a Vector2. If I leave out wrldLocation from my previous example, and just use WorldLocation to keep track of my object's location, everything works smoothly, but then obviously if my object is being given velocities less than 1 pixel per update, then the velocity value may as well be 0, which I feel further down the line I may regret. Does anybody have any suggestions about how I might go about resolving this?
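
    One way out, sketched here in Java with java.awt.Rectangle standing in for XNA's Rectangle (all names are illustrative, not from the post), is to keep the float position as the single source of truth and only round it into an integer rectangle at the moment a test is needed, so the projection is built from position-plus-float-move rather than from an int-only Offset() call.

      import java.awt.Rectangle;
      import java.util.List;

      class Body {
          float worldX, worldY;        // authoritative, sub-pixel position
          final int width, height;

          Body(float x, float y, int w, int h) { worldX = x; worldY = y; width = w; height = h; }

          // The rectangle this body would occupy after applying the float move,
          // rounded only at the last moment.
          Rectangle projected(float moveX, float moveY) {
              return new Rectangle(Math.round(worldX + moveX), Math.round(worldY + moveY), width, height);
          }

          // Apply the move, clipping each axis in float space if its projection hits a wall.
          void move(float moveX, float moveY, List<Rectangle> walls) {
              for (Rectangle wall : walls) {
                  if (projected(moveX, 0f).intersects(wall)) {
                      moveX = 0f;   // a fuller resolver would clip to the exact remaining gap
                  }
                  if (projected(moveX, moveY).intersects(wall)) {
                      moveY = 0f;
                  }
              }
              worldX += moveX;
              worldY += moveY;
          }
      }

    Velocities below one pixel per update still accumulate in worldX/worldY, which is exactly the sub-pixel behaviour the question wants to keep.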

    Read the article

  • Restart and/or graphics problem in Ubuntu 12.04

    - by kara
    I have been using 12.04 for a couple of months now, with very few problems. The other day I restarted my computer, and though I think it rebooted, the screen stayed black. I could not even get a visual from a live CD. Finally, I was able to get it to load, but the resolution has been completely off. The computer thinks I have a laptop screen, when I actually have a ViewSonic VP2330wb, and it detects only two resolutions. And still, I have a problem with rebooting. If the screen locks after I leave it for a while, I can't get a visual back, and then when I force a shutdown, it takes 3 tries for me to get a grub screen. Then I have to boot in recovery mode, and then finally in normal mode, but the screen is still always off. This is my video card:

      description: VGA compatible controller
      product: 2nd Generation Core Processor Family Integrated Graphics Controller
      vendor: Intel Corporation
      physical id: 2
      bus info: pci@0000:00:02.0
      version: 09
      width: 64 bits
      clock: 33MHz
      capabilities: msi pm vga_controller bus_master cap_list
      configuration: latency=0
      resources: memory:fe000000-fe3fffff memory:d0000000-dfffffff ioport:f000(size=64)

    I am a new Ubuntu user, and am at my wits' end. Any help would be greatly appreciated.

    Read the article

  • How do 2D physics engines solve the problem of resolving collisions along tiled walls/floors in non-grid-based worlds?

    - by ssb
    I've been working on implementing my SAT algorithm which has been coming along well, but I've found that I'm at a wall when it comes to its actual use. There are plenty of questions regarding this issue on this site, but most of them either have no clear, good answer or have a solution based on checking grid positions. To restate the problem that I and many others are having, if you have a tiled surface, like a wall or a floor, consisting of several smaller component rectangles, and you traverse along them with another rectangle with force being applied into that structure, there are cases where the object gets caught on a false collision on an edge that faces the inside of the shape. I have spent a lot of time thinking about how I could possibly solve this without having to resort to a grid-based system, and I realized that physics engines do this properly. What I want to know is how they do this. What do physics engines do beyond basic SAT that allows this kind of proper collision resolution in complex environments? I've been looking through the source code to Box2D trying to find out how they do it but it's not quite as easy as looking at a Collision() method. I think I'm not good enough at physics to know what they're doing mathematically and not good enough at programming to know what they're doing programmatically. This is what I aim to fix.
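
    For what it's worth, Box2D's own answer to the internal-edge problem is the chain shape, whose "ghost vertices" let a contact know about the neighbouring segment so collisions with hidden internal edges are filtered out. A simpler workaround that stays within plain SAT, sketched below under assumed types, is to merge runs of adjacent, same-height rectangles into single longer rectangles before testing, so the internal edges that cause the snagging no longer exist at all.

      import java.util.ArrayList;
      import java.util.Comparator;
      import java.util.List;

      final class RectMerger {
          static final class R {
              float x, y, w, h;
              R(float x, float y, float w, float h) { this.x = x; this.y = y; this.w = w; this.h = h; }
          }

          // Assumes tiles sit on exact coordinates, as they do when laid out from tile data.
          static List<R> mergeHorizontalRuns(List<R> tiles) {
              List<R> sorted = new ArrayList<>(tiles);
              sorted.sort(Comparator.<R>comparingDouble(r -> r.y).thenComparingDouble(r -> r.x));
              List<R> merged = new ArrayList<>();
              for (R r : sorted) {
                  R last = merged.isEmpty() ? null : merged.get(merged.size() - 1);
                  // Extend the previous rectangle if this one butts up against its right edge.
                  if (last != null && last.y == r.y && last.h == r.h && last.x + last.w == r.x) {
                      last.w += r.w;
                  } else {
                      merged.add(new R(r.x, r.y, r.w, r.h));
                  }
              }
              return merged;
          }
      }

    The same pass can be run again on the result, with the axes swapped, to merge vertical runs as well.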

    Read the article

  • Problems with dual monitor & resolutions, only in 14.04

    - by theLadder
    I installed Ubuntu 14.04 but I am having weird problems with my dual monitors and the resolutions. I also tried Xubuntu 14.04 and was having the same problem. I have one 32 inch LG TV with 1920x1080 and one monitor with 1280x1024 resolution. When I first start, my 32 inch gets 1360x768. If I then try to change to 1920x1080, everything looks fine and the prompt asking me if I want to keep the settings comes up and starts the countdown, but after 2 seconds my computer freezes, and after a few more seconds it reboots itself. However, if I disable my smaller monitor first, I can change to 1920x1080 on my 32 inch without problems, but if I then activate the second monitor the same problem happens again. In Xubuntu 14.04 I can change the refresh rate; if I run the 32 inch at 30Hz or 50Hz everything works, but I would like to be able to run it at 60Hz. I'm currently running Xubuntu 13.10 without this problem. My graphics card is an ATI Radeon HD 4850. What is causing this problem: graphics drivers? Kernel? Xorg? And how do I solve it?

    Read the article

  • Monitor detects wrong display mode

    - by user292449
    I am running into an issue with 14.04. I have two monitors. The first one is a 19 inch LG that runs off DVI at 1440x900; it seems to function just fine. The second is a 23 inch LG that should run at 1920x1200. It has been plugged in with both an HDMI to HDMI cable and a DVI to HDMI cable. It seems to be stuck in "I am displaying for a DVD player in 1080p mode" or some such. I had this issue with Windows a long, long time ago and eventually it just went away. I can set the screen display to 1920x1200 with the generic X drivers, but I am a gamer and would like to use the Nvidia drivers since they deliver better performance. When I switch to the Nvidia driver I can set the resolution to 1920x1200, but the screen seems to be shifted up and to the left with a black border down and to the right. If I switch back to the default X driver after this, the screen remains stuck in the up-and-to-the-left mode. Any help would be wonderful.

    Read the article

  • LG W3000H-BN monitor cannot go above 1280x800

    - by Jo Profit
    I noticed that there are many people complaining about this issue with the W3000H, but I have yet to find a solution that works for me. I am using Windows 7 Professional and an nVidia Quadro NVS 240 video card with a 4-monitor splitter cable. The cable from the monitor and the splitter are rated DVI-D Dual Link, and the video card itself is rated for 2560x1600. I have installed the latest drivers for the video card and just grabbed the .inf, .icm and .cat files from the LG website and manually installed the monitor drivers. Does anyone have problems with the same setup? I have 3 other monitors (2 at 1920x1080 and 1 at 1280x1024). I really would like to be able to display the full resolution, or else the large screen is useless (I triple checked that the monitor itself supports this resolution). So monitor, cable, splitter and card supposedly support 2560x1600, and drivers are up to date, but I cannot select that resolution in the "Screen Resolution" menu, nor through the nVidia control panel. Please save me from madness :)

    Read the article

  • Monitor reset itself and now I can't set the resolution/settings back to how they were before

    - by verve
    I've had my LG 24" widescreen monitor since 2009. Two weeks ago I noticed the monitor had turned itself off (it had never done this before), so I switched it back on to find all the settings like gamma, resolution etc. were different; it looked like it had been reset. Everyone in the house swears they never unplugged it and plugged it back in. When I opened a webpage the fonts and zoom on the pages were different, and my desktop was strange too; the fonts of the icons were different, etc. The screen seems blurry and when I watch movies the faces look distorted, so I thought I would first try to figure out the resolution it used to be. But when I go under "Adjust screen resolution" none of the options work and there is no recommended resolution marked; all the options stretch out the screen and look terrible, so right now I have it set to the least distorted one. Then, since the resolution wasn't working, I set the other manual settings (done by physical buttons on the monitor) back to how they used to be (luckily, I had written these down). The monitor looks better but the resolution makes it a strain to use. I thought maybe some Windows update caused this crap, so I tried a System Restore: it didn't work. What went wrong? A few questions:

    1) What was the likely cause of the monitor shutting itself off and screwing up the settings I have been using since the day I bought the monitor?
    2) Why have the fonts changed everywhere, unless this is a HDD/video card problem?
    3) How do I find the perfect resolution it used to be? The monitor wants me to set it to 1920x1080 but that isn't one of the options, although I don't remember what resolution I used before. I use the 16:9 setting while I try the available resolution options but nothing looks good! How do I find what it used to be?

    Manual available in PDF under Support: http://www.lg.com/ca_en/computer-products/monitors/LG-lcd-monitor-W2442PA-BF.jsp

    Win 7. IE 9.

    Read the article

  • Remote Desktop Zooming

    - by codeulike
    Using Remote Desktop from a device with a hi-res screen (say, a Surface Pro) is decidedly tricky, as everything displays at 1:1 scale and so looks tiny. If the machine you are remoting into runs Server 2008 R2 or later, you can change the dpi zooming setting (see here). But for older hosts, that doesn't work. Using normal Remote Desktop, you can connect with a lower resolution, say 1280x768, and turn on smart-sizing. However, smart-sizing can scale down (to display a huge desktop in a small area) but does not seem to scale up (to display a small desktop in a big area). Using the Windows 8 Remote Desktop App, you can zoom, but you cannot set the default resolution of the host. What I want is a lower resolution on the host, scaled up to fit my screen. So both of those are close to what I want, but don't quite work. So the question is: does the Remote Desktop App allow the screen resolution to be set somehow? Is there some other Remote Desktop client that can handle zooming better?

    Read the article

  • Serving images from another hostname vs Apache overload for the rewrites

    - by luison
    We are trying to further improve the speed of some sites with older HTML, partly in order to obtain better SEO results. We have now applied some minify measures, combined HTML, CSS, etc. We use a small virtualized infrastructure and we've always wanted to use a light + standard HTTP server configuration, so that the first one can serve images and static content while the other one handles PHP, rewrites, etc. We can easily do that now with a VM using the same files and vhost conf (bind mounts) as Apache, but with hardly any modules loaded. This means the light httpd will have a smaller footprint, which would allow us to serve more and quicker, have more MinSpareServers running, etc. So, as browsers also benefit from loading static content from different hostnames, we've thought about building a rewrite rule on our main server (main.com) to "redirect" all images and CSS (*.jpg, *.gif, *.css, etc.) to the same files at, say, cdn.main.com, thus allowing the browser to open more connections. The question is, assuming we already have a very complex rewrite ruleset (we manually manipulate many old URLs for SEO), will it be worth it? I mean, will the additional load on main's Apache of having to redirect main.com/image.jpg (I understand we'll have to do a 301) to cdn.main.com/image.jpg, plus cdn.main.com then having to serve it, be larger than the gain we would be achieving in the browser? Could the excess of 301s for all the images on a page be penalized by Google? How do large companies work this out? Does the original code already include images linked from the CDN with absolute paths?

    EDIT: Just to clarify, our concern is not so much server performance or bandwidth. We could obviously employ an external CDN server, but we have plenty of CPU and bandwidth. Our concern is how to have "old" sites with plenty of semi-static HTML content benefit from splitting connections for images and static content via Apache, without having to change the HTML to absolute paths (i.e. image.jpg to cdn.main.com/image.jpg happening on the server, not in the code).

    Read the article

  • Windows 7 Stopped Using hosts file for DNS Resolution

    - by AJ
    I am running Windows 7 Home Premium 64-bit. Starting today, I noticed that DNS resolution is not reading my %SYSTEMROOT%\System32\drivers\etc\hosts file. I say this because I added two new entries to the file and when I run 'nslookup' on the command line, they don't resolve. Further, just trying to resolve 'localhost' results in my primary DNS server being queried. I've read several threads that suggest that the file might have been corrupted and to move it aside and create a new one. I've done that, and no improvement. Is there some sort of registry key that controls the sequence of resources used for DNS resolution (similar to nsswitch.conf on UNIX)? What else could be causing this? Thanks in advance.

    Read the article

  • How can I get 2560 * 1600 on Win 7? MacBook Pro 17 + Dell 30 inch 3008 WFP + Fusion 3

    - by Tarek Demiati
    I run VMware Fusion 3 on a MacBook Pro 17 hooked up to a Dell 30 inch screen. I CAN manage to get a resolution of 2560 * 1600 in Mac OS X (on the MacBook Pro), but can't in Win 7 (on the exact same MacBook Pro). The highest resolution I can get in MS Win 7 is 2048 * 1536 (and I want to be able to set it to 2560 * 1600 to fully enjoy the real estate of my 30 inch screen!). I have searched the KB and found an article which mentioned that I should add the following lines to the vmx file (which I did). The lines are at the very bottom of the vmx file:

      svga.maxWidth = "2560"
      svga.maxHeight = "1600"
      svga.vramSize = "16384000"

    KB Article: http://kb.vmware.com/selfservice/microsites/search.do?cmd=displayKC&docType=kc&externalId=1003&sliceId=1&docTypeID=DT_KB_1_1&dialogID=63746028&stateId=0%200%2066741566

    I did the manipulation described in the KB article above and rebooted several times, but I still can't get the correct resolution to show in Windows.

    Read the article

  • ubuntu 10.04 console resolution

    - by Ove
    I have installed Ubuntu 10.04 on my HP Pavilion dv6000 After I installed it, the text in the console (when I press ALT+F1) was small and the console had a good resolution (I think the same as my LCD, 1280x800). Also, at boot, the "Ubuntu" logo was small and centered in the middle of the screen. That was good. After that, I installed the nVidia driver via the "System-Administration-Hardware drivers" screen. After the driver was installed, the text in the console was larger and more pixelated, and also the "Ubuntu" logo was much larger and looked uglier because it was pixelated. Can anyone help me change the resolution in the console and boot screen back to what it was before I installed the nVidia driver?

    Read the article

  • Dock displays low-resolution icons

    - by squircle
    Recently, I've noticed that the dock has been starting to display low-resolution icons in place of the former high-resolution icons for common apps like Stickies, Word, iTunes and Preview. Looking at the .icns file within each program, all copies of the icon are present within the file (high and low resolutions), but the dock refuses to display them, leaving some programs looking like this: Restarting doesn't stop this behaviour, nor does a killall Dock, nor removing the icon and replacing it in the dock. In Finder, the icons display normally. Does anybody know what may be causing this issue? Thanks!

    Read the article

  • Ubuntu 10.10: getting appropriate monitor resolution for LCD HDTV

    - by lurscher
    I'm running the Ubuntu 10.10 x86_64 version with an Nvidia 9800 GT, and installed the 270.41.06 Nvidia drivers following this guide. I have an LG 42LH30FR LCD TV connected via the DVI link (RGB PC input). I'm able to get 1024x768 resolution without overscan (I can get 1080i = 1366x768, but there is a lot of hidden screen space to the right and I don't know what to do about it). I want to get full HD. I can get full HD (1080p = 1920x1080) on Windows XP 64-bit with a custom resolution created with the Nvidia Control Panel. From reading over xorg.conf configurations it seems I need to add a certain modeline to the monitor configuration, but I don't know where to get the appropriate options for this task. Any suggestions?

    Read the article

  • Monitor "forgot" its native resolution overnight

    - by Ben
    I have:

    - Hanns G HW173A monitor
    - NVIDIA GTX 460 graphics card
    - Windows XP SP3

    It was working. Suddenly, overnight, I can no longer use my monitor's native resolution of 1440x900. It's not listed, and if I force it, it only shows part of the screen. The only resolution that seems to fit the whole screen is now 1024x768. I have tried reinstalling the NVIDIA drivers as well as disabling and enabling the monitor drivers (for Windows XP the monitor uses the Microsoft "default monitor" drivers; this is what Hanns G says on their website too). Any ideas? Thanks very much.

    Read the article

< Previous Page | 17 18 19 20 21 22 23 24 25 26 27 28  | Next Page >