Search Results

Search found 19958 results on 799 pages for 'bit fiddling'.

  • What electronic user-story-mapping tools can you recommend?

    - by azheglov
    Agile software development relies heavily on a work item type called user stories. For example, you have a backlog full of user stories and you can select a few of them to work on during the next sprint. But where and how do you find user stories to put into the backlog? There is a popular technique for doing that called story mapping. Jeff Patton invented it and here is the definitive guide on how to do it. The question is, what electronic tools are out there that support Patton's story-mapping technique? I've done a bit of research, found Pivotal and Rally plug-ins (but I'm not a customer of either) and I'm currently experimenting with SilverStories. What other tools are out there? What have you used? What do you (not) recommend? Why? UPDATE: Some people who wrote comments seem to lean towards an answer that applying this technique is simply impossible with an electronic tool and we should just accept that. Can't someone write it up as an answer?

    Read the article

  • DNS A vs NS record

    - by Tiddo
    I'm trying to understand DNS a bit better, but I still don't completely get A and NS records. As far as I understand, the A record tells which IP address belongs to a (sub)domain; so far that is clear to me. But the NS record, as I understand it, tells which nameserver belongs to a (sub)domain, and that nameserver should in turn tell which IP address belongs to the (sub)domain. Yet that is already specified in the A record in the same DNS file. So can someone explain what NS records and nameservers exactly do, because I have probably misunderstood something. edit: If I understand correctly, an NS record tells you where to find the DNS server holding the A record for a certain domain, and the A record tells you which IP address belongs to the domain. But what is the use of putting an A and an NS record in the same DNS file? If there is already an A record for a certain domain, why do you need to point to another DNS server, which would probably give you the same information?
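
    For illustration, a minimal BIND-style zone file sketch (example.com and the nameserver names are hypothetical) showing how NS and A records commonly sit together: the NS records delegate authority for the zone, while the A records map names, including the nameservers themselves, to IP addresses.

      $ORIGIN example.com.
      $TTL 3600
      @      IN  SOA  ns1.example.com. hostmaster.example.com. (2024010101 7200 900 1209600 3600)
      @      IN  NS   ns1.example.com.   ; who is authoritative for this zone
      @      IN  NS   ns2.example.com.
      ns1    IN  A    192.0.2.1          ; glue: address of the nameserver itself
      ns2    IN  A    192.0.2.2
      @      IN  A    192.0.2.10         ; address of example.com itself
      www    IN  A    192.0.2.10         ; address of www.example.com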

    Read the article

  • Program to download .torrent file from a Magnet URI?

    - by robmathers
    I have a torrent app running on my home server that has an outdated but pretty useful web interface. The one problem is that it doesn't take magnet links, only .torrent files. I'd like to continue using this and not have to bother with finding a different program. Given that magnet links are essentially a pointer used to fetch the torrent metadata from the swarm, I'm hoping there's a program out there that will take a magnet link and spit out a .torrent only. I know I could put it into µTorrent and grab the file from the app's directory, but that's a bit roundabout, and I'd like something that will do it semi-unattended. Preferably for OS X, but Linux (or a web app) would work too.
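
    One command-line approach that may fit (assuming aria2 is acceptable on OS X or Linux; the magnet URI below is a placeholder) is to let aria2c fetch only the metadata and save it as a .torrent:

      aria2c --bt-metadata-only=true --bt-save-metadata=true \
        "magnet:?xt=urn:btih:INFOHASHPLACEHOLDER"

    The saved file should land in the download directory, named after the info hash; worth verifying against the aria2 documentation for the installed version.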

    Read the article

  • Why does my SQL Server use AWE memory? and why is this not visible in RAMMap?

    - by Marnix Klooster
    We have a Windows Server 2008 R2 (64-bit) 8GB server where, according to Sysinternals RAMMap, 2GB of memory is allocated using AWE. As far as I understand, this means that these pages stay in physical memory and are never paged out, which causes other apps to be pushed out of physical memory. In RAMMap, on the Physical Pages tab, the Process column is empty for all of the AWE pages. We run SQL Server on that box, but (through SQL Server Management Studio, under Server Properties - Memory, under Server memory options) it says it is configured not to use AWE. However, when stopping SQL Server, the AWE pages are suddenly gone, so it really is the culprit. So I have three questions: Why does RAMMap not know/show that a SQL Server process is responsible for that AWE memory? Why does SQL Server Management Studio say that AWE memory is not used? How do we configure SQL Server to really not use AWE memory?
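
    As a side note, on 64-bit SQL Server the AWE API is what backs "locked pages in memory", so one quick check (a sketch, not a full diagnosis) is to ask the engine how much memory it has locked; a non-zero value here would be consistent with the AWE pages RAMMap reports:

      SELECT physical_memory_in_use_kb,
             locked_page_allocations_kb   -- pages locked via the AWE API
      FROM   sys.dm_os_process_memory;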

    Read the article

  • I can't install Ubuntu on my Dell Inspiron 15R at all

    - by Kieran Rimmer
    I'm attempting to install Ubuntu 12.04 LTS, 64-bit, onto a Dell Inspiron 15R laptop. I've shrunk down one of the Windows partitions and even used GParted to format the vacant space as ext4. However, the install disk simply does not present any options when it comes to the partitioning step. What I get is a non-responsive blank table. As well as the above, I've changed the BIOS settings so that USB emulation is disabled (as per Can't install on Dell Inspiron 15R), and changed the SATA Operation setting to all three possible options. Anyway, the install CD will bring up the trial version of Ubuntu, and if I open a terminal and type sudo fdisk -l, I get this:

    Disk /dev/sda: 1000.2 GB, 1000204886016 bytes
    255 heads, 63 sectors/track, 121601 cylinders, total 1953525168 sectors
    Units = sectors of 1 * 512 = 512 bytes
    Sector size (logical/physical): 512 bytes / 4096 bytes
    I/O size (minimum/optimal): 4096 bytes / 4096 bytes
    Disk identifier: 0xb4fd9215

       Device Boot      Start         End      Blocks   Id  System
    /dev/sda1              63       80324       40131   de  Dell Utility
    Partition 1 does not start on physical sector boundary.
    /dev/sda2   *       81920    29044735    14481408    7  HPFS/NTFS/exFAT
    /dev/sda3        29044736  1005142015   488048640    7  HPFS/NTFS/exFAT
    /dev/sda4      1005154920  1953520064   474182572+  83  Linux

    Disk /dev/sdb: 32.0 GB, 32017047552 bytes
    255 heads, 63 sectors/track, 3892 cylinders, total 62533296 sectors
    Units = sectors of 1 * 512 = 512 bytes
    Sector size (logical/physical): 512 bytes / 512 bytes
    I/O size (minimum/optimal): 512 bytes / 512 bytes
    Disk identifier: 0xb4fd923d

       Device Boot      Start         End      Blocks   Id  System
    /dev/sdb1            2048    16775167     8386560   84  OS/2 hidden C: drive

    If I type sudo parted -l, I get:

    Model: ATA WDC WD10JPVT-75A (scsi)
    Disk /dev/sda: 1000GB
    Sector size (logical/physical): 512B/4096B
    Partition Table: msdos

    Number  Start   End     Size    Type     File system  Flags
     1      32.3kB  41.1MB  41.1MB  primary  fat16        diag
     2      41.9MB  14.9GB  14.8GB  primary  ntfs         boot
     3      14.9GB  515GB   500GB   primary  ntfs
     4      515GB   1000GB  486GB   primary  ext4

    Model: ATA SAMSUNG SSD PM83 (scsi)
    Disk /dev/sdb: 32.0GB
    Sector size (logical/physical): 512B/512B
    Partition Table: msdos

    Number  Start   End     Size    Type     File system  Flags
     1      1049kB  8589MB  8588MB  primary

    Warning: Unable to open /dev/sr0 read-write (Read-only file system). /dev/sr0 has been opened read-only.
    Error: Can't have a partition outside the disk!

    I've also tried Kubuntu 12.04 and Linux Mint install disks, with the same problem. I'm completely lost here. Cheers, Kieran

    Read the article

  • How do I decrypt WPA2 encrypted packets using Wireshark?

    - by Rox
    I am trying to decrypt my WLAN data with Wireshark. I have already read and tried everything on this page, but without any success (well, I tried the example dump on that page and succeeded, but I fail with my own packets). I caught the four-way handshake from another client connecting to the network. My network info is as follows: SSID: test, passphrase: mypass. The above info would give this pre-shared key: 58af7d7ce2e11faeab2278a5ef45de4944385f319b52a5b2d82389faedd3f9bf. In Wireshark's Preferences, under IEEE 802.11, I have set this line as Key 1: wpa-psk:58af7d7ce2e11faeab2278a5ef45de4944385f319b52a5b2d82389faedd3f9bf. I have tried the different options of "Ignore the protection bit", but none of them works. What could I have missed?
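
    For reference, a minimal Python sketch of how a WPA2 pre-shared key is derived from the SSID and passphrase (PBKDF2-HMAC-SHA1, 4096 iterations, 32 bytes); comparing its output against the key pasted into Wireshark is a quick way to rule out a derivation mistake:

      import hashlib

      def wpa2_psk(passphrase: str, ssid: str) -> str:
          # WPA2 PSK = PBKDF2-HMAC-SHA1(passphrase, ssid, 4096 iterations, 32 bytes)
          return hashlib.pbkdf2_hmac("sha1", passphrase.encode(), ssid.encode(), 4096, 32).hex()

      # compare against the key entered as wpa-psk in Wireshark
      print(wpa2_psk("mypass", "test"))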

    Read the article

  • How many of you *really* surf around without JavaScript enabled? [closed]

    - by Stephen
    I've decided to rephrase the question. After some deliberation on Meta, I've realized that my question needs to be a bit more focused. The question: should we (web developers) continue to spend effort progressively enhancing our web applications with JavaScript, ensuring that features gracefully degrade and thereby ensuring accessibility? Or should we spend that time focused on new features or other areas of development? The subtext of that question would be: how many of our customers/clients/users use our websites or applications with JavaScript disabled? Do you have any projects with requirements that specifically demand JavaScript functionality (almost all of mine do), and do those requirements also demand graceful degradation? For the sake of asking this question, I pulled up programmers.stackexchange.com without JavaScript enabled, and I was greeted with this message: "Programmers - Stack Exchange works best with JavaScript enabled". It was difficult to log in, although the site generally seemed to work okay. (I wasn't able to vote up any questions.) I think this is a satisfactory approach to development. Imagine the effort involved in making all of the site's features work with plain old HTML and server-side logic. OTOH, I wonder how many users have been alienated by this approach. We've all been trained (at least the good developers among us) to use progressive enhancement and to ensure our web applications' dynamic features degrade gracefully. Is this progressive enhancement just pissing into the wind, or do some of our customers actually use certain web services without JavaScript enabled? I mean, like really, not figuratively or presumptuously.

    Read the article

  • how to optimize virtual box shared folders

    - by Nrew
    This is really pissing me off. No matter how much memory I put into the guest OS (Windows XP), it still hangs for about 365 days before you can access the file you want from the shared folder. What do I do to make things faster? Because after it hangs and doesn't respond for 365 days, it will do it again for another 250 days. I've even set the shared folder to permanent. This is a fairly decent machine: 2.50GHz processor (x64 architecture, but I have only 2GB of memory, so my host OS is just 32-bit Windows 7), and the HDD has plenty of space left: 156GB free of 250GB.

    Read the article

  • Are these components compatible and are there any significant bottlenecks?

    - by Tom Gullen
    I'm trying to buy a new PC, for software dev and a bit of gaming. I already have a 500GB HDD and a PCI sound card I want to use as well. Is all this stuff compatible, will it all work together, and are there any significant bottlenecks?
    Case, Mobo and PSU: "Primo Motion" AMD 880G DDR3 Ready Barebones (Socket AM3) http://www.overclockers.co.uk/showproduct.php?prodid=FS-268-OK&groupid=43&catid=1817&subcat=
    SSD: 64GB Crucial RealSSD C300 64GB 2.5" SATA 6Gb/s Solid State Hard Drive http://www.overclockers.co.uk/showproduct.php?prodid=HD-007-CR&groupid=1657&catid=1660&subcat=1668
    CPU: AMD Phenom II X6 Six Core 1090T Black Edition 3.20GHz (Socket AM3) http://www.overclockers.co.uk/showproduct.php?prodid=CP-266-AM&groupid=701&catid=6&subcat=1944
    RAM: 8GB Corsair Vengeance 8GB (2x4GB) DDR3 PC3-15000C9 1866MHz Dual Channel Kit http://www.overclockers.co.uk/showproduct.php?prodid=MY-292-CS&groupid=701&catid=8&subcat=1387
    Graphics: XFX ATI Radeon HD 5770 1024MB GDDR5 PCI-Express Graphics Card http://www.overclockers.co.uk/showproduct.php?prodid=GX-149-XF

    Read the article

  • Set up ad hoc wireless connection between Windows Vista and Mac OS X

    - by Skarab
    I have the following problem: Windows Vista does not connect to an ad hoc wireless network created on my MacBook. I have tried to create a secured (with a 40-bit key) and an unsecured network, but Windows Vista still has problems connecting. Windows Vista informs me, after 5 minutes of attempts, that setting up the connection with my ad hoc network took too much time. My question: do I need to configure some settings on Vista to connect it to my MacBook? Maybe it is a problem with DHCP? Edited: I have tried the other way: http://superuser.com/questions/202890/set-up-an-adhoc-network-in-windows-vista-to-connect-to-and-share-the-internet-con

    Read the article

  • Will adding top level directories with similar structure to existing directories change the SEO of my site?

    - by Russell Sims
    I've been pointed this way for SEO-related questions, and this one has had me pondering for a little while now. I'm recreating a site's structure. The website's content is generated through several feeds, and unless I want to place each and every one of the 10,000-odd venues into its own category manually, I can't avoid categorising each item by its address. The current structure looks like this: Homepage > region > county > city/town > venue page, and the URL looks like domain/region/county/city/venue. I'm relatively happy to use this structure as it's not too convoluted. However, we also promote deals, and we group the venues into their respective franchises, so that leads to URLs such as domain/groups and domain/deals. My question is: how would the directory structure look with these new additions? Would I have a URL that looks like domain/deals/region/county/city/venue or domain/group/region/county/city/venue, and just put a 301 or a canonical link tag on the page to prevent the duplicate pages competing with each other? Am I just worrying about it needlessly? Perhaps I should link straight from domain/deals to the venue page URL domain/region/county/city/venue; this bothers me a bit, though, as the deals and groups will not be in the breadcrumbs.
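
    For illustration, if a deal page does end up duplicating a venue page, the canonical hint is a single tag in the duplicate page's head pointing at the preferred URL (the paths here are placeholders following the structure above):

      <!-- on domain/deals/region/county/city/venue -->
      <link rel="canonical" href="http://domain/region/county/city/venue/" />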

    Read the article

  • Cloning a Windows 7 installation from an MBR to a GPT drive and making it bootable

    - by Nelluk
    I've seen threads on similar topics, such as this one, but the answers never seem to explain how to make the clone bootable. I have Windows 7 64-bit on a PC, installed on a 2TB MBR volume. The motherboard is UEFI compatible. I just installed a secondary internal 3TB drive which will be partitioned as GPT. Is there a relatively easy way to clone my installation over to the new drive and have that drive be bootable? I have used EaseUS Partition Master to clone the C volume to the D volume, but that would not boot, and I assume the issue is that one is MBR and one is GPT. Is there a process to do this?
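
    For what it's worth, the usual missing piece when a cloned GPT disk won't boot is the UEFI side: a FAT32 EFI System Partition on the new disk plus boot files written with bcdboot. A rough sketch, with drive letters as placeholders and assuming the firmware is actually booting in UEFI mode:

      rem S: = the EFI System Partition on the new GPT disk, C: = the cloned Windows volume
      rem the /f UEFI switch exists only in newer bcdboot versions; plain Windows 7 may need it omitted
      bcdboot C:\Windows /s S: /f UEFI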

    Read the article

  • How Can I Know Whether I Am a Good Programmer?

    - by Kristopher Johnson
    Like most people, I think of myself as being a bit above average in my field. I get paid well, I've gotten promotions, and I've never had a real problem getting good references or getting a job. But I've been around enough to notice that many of the worst programmers I've worked with thought they were some of the best. Bad programmers who are surrounded by other bad programmers seem to be the most self-deluded. I'm certainly not perfect. I do make mistakes. I do miss deadlines. But I think I make about the same number of bonehead moves that "other good programmers" do. The problem is that I define "other good programmers" to mean "people who are like me." So, I wonder, is there any way a programmer can make some sort of reasonable self-evaluation? How do we know whether we are good or bad at our jobs? Or, if terms like good and bad are too ill-defined, how can programmers honestly identify their own strengths and weaknesses, so that they can take advantage of the former and work to improve the latter?

    Read the article

  • Optimising Database Mirroring over WAN

    - by blakmk
    I recently got asked by our network guys about bottlenecks in the WAN link used for mirroring to our DR site. They asked me to turn off encryption for database mirroring so that the Riverbed software they were using could optimise the packets sent over the WAN. I was a bit sceptical at first about the security risks, but it seems the Riverbed software has its own form of obfuscation, making the packets difficult to read. After reading an article by Rusanu I realised that it could be done with minimal downtime, potentially reducing network traffic by 5-10% on its own. After turning off encryption I was pleasantly surprised to see that overall network traffic for mirroring dropped by a whopping 75%!
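
    For context, mirroring encryption is a property of the database mirroring endpoint, so the change amounts to altering the endpoint on each partner. A sketch ("Mirroring" is a placeholder for the actual endpoint name, and both partners need compatible encryption settings):

      -- find the endpoint name and current setting
      SELECT name, encryption_algorithm_desc FROM sys.database_mirroring_endpoints;

      -- allow unencrypted traffic on this endpoint
      ALTER ENDPOINT Mirroring
          FOR DATABASE_MIRRORING (ENCRYPTION = DISABLED);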

    Read the article

  • Enabling AES 256 GCM on Windows Server 2012 R2

    - by Feanaro
    I'd like to enable the use of AES 256 GCM encryption instead of AES 256 CBC. We already have ECC certificates based on ECDSA, so that prerequisite has been fulfilled. The certificate has a SHA-256 signature and uses a 256-bit ECC key. The cipher suite I'd like to use: TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384_P384. This is our cipher suite order: TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384_P384, TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384_P521, TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384_P384, TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384_P521, TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA_P256, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA_P256, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384_P256. Still, when I check the website it says we use TLS 1.2, ECDHE_ECDSA for key exchange, AES_256_CBC for encryption and SHA1 for the message digest. I suspect it uses this suite for some reason: TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA_P256. When I remove that cipher suite the site has a protocol mismatch and won't load over HTTPS anymore. Does anyone know how to enable the cipher suite? Did I forget to set something in the registry, or do I need to do something else to enable that specific suite? Thanks in advance!
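
    For reference, a common way to control the cipher suite order on Server 2012 R2 is the SSL Cipher Suite Order policy, which group policy stores as a single comma-separated REG_SZ value. A PowerShell sketch (the suite list below is a truncated placeholder, and the registry path should be double-checked against your environment before relying on it):

      # policy-backed SSL cipher suite order (the same value the "SSL Cipher Suite Order" GPO edits)
      $key = 'HKLM:\SOFTWARE\Policies\Microsoft\Cryptography\Configuration\SSL\00010002'
      New-Item -Path $key -Force | Out-Null
      Set-ItemProperty -Path $key -Name 'Functions' `
          -Value 'TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384_P384,TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384_P384'
      # a reboot is typically needed before schannel picks up the new order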

    Read the article

  • Welcome to the Oracle FedApps blog

    - by jeffrey.waterman
    Congratulations, you have stumbled upon Oracle’s newest blog: The Federal Applications Blog. Periodically I plan to provide some insight into how Oracle’s application solutions are being applied, or how they can be applied, within the Federal Government. If you are a user of, or just interested in, Oracle’s applications in the Federal space and have questions/topics you would like to see addressed in this blog, please post a comment. So bear with me as I take a bit of time to refine the content, look and feel of this blog. http://www.oracle.com/us/industries/public-sector/038044.htm http://www.oracle.com/us/industries/public-sector/038046.htm -- JMW

    Read the article

  • Partial upgrade on 12.04, how to stop nagging after locking to a working NVIDIA & xorg

    - by alsk
    How do I stop the update manager from offering updates and upgrades that would potentially break my working 2D and 3D graphics? I finally got 12.04 working as it should, with the nvidia-173 drivers, by downgrading xorg and locking the versions: a 32-bit system on an Athlon64, with an (Albatron) NVIDIA GeForce FX5700XT, locked (pinned) to xorg 1:7.6-7ubuntu7, xserver-xorg-core 2:11.1-0obuntu10.07 and nvidia-173 173.14.35-0ubuntu0.2. The annoying thing left is that every time updates are checked, I get a warning about a partial upgrade and the ambiguous options "partial upgrade" and "close". Ambiguous in the sense that if I click close, I get the option to update a few packages, which has been OK, while "partial upgrade" would like to update my kernel to 3.2, alter xorg, remove nvidia-173 and update mesa, among other things. This is not what I call appropriate after locking xorg and the NVIDIA drivers to working versions. One may say that according to package-management logic it is correct, but to me as a user it makes little sense. The last Ubuntu that worked for me without a big mess was 10.10, hence I will not put 12.10 on my "production" system until I can be sure it will not trash the system again. P.S. Is there a recommended way to keep the NVIDIA GeForce FX working with 3D on Ubuntu in the future?
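
    As background, on 12.04 a version lock of this kind is usually expressed either as a pin in /etc/apt/preferences or as a hold, and the partial-upgrade prompt is the package manager noticing that held packages block the full upgrade set. A minimal sketch of the hold approach (package names taken from the question, so adjust to what is actually installed):

      sudo apt-mark hold xserver-xorg-core nvidia-173
      apt-mark showhold
      # equivalent on systems where apt-mark lacks hold:
      # echo "xserver-xorg-core hold" | sudo dpkg --set-selections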

    Read the article

  • IoT? Time for Enterprise Architecture

    - by OTN ArchBeat
    Of course you've been listening to the latest OTN ArchBeat Podcast on the challenges and opportunities in the Internet of Things. If so, you'll also be interested in ZDNet blogger Joe McKendrick's recent post, Will the 'Internet of Things' make CIOs' jobs harder?. In that post McKendrick offers this important bit of advice that will certainly have architects saying "I told you so": Enterprises need to develop architectural approaches to the management of data. Meaning the development of repeatable processes to source, ingest, transform and store information. For years, IT managers simply bought more hardware and addressed data with one-off integration projects. Now it's time for enterprise architecture. IoT is an important new phase in the evolution of enterprise IT. Challenging? You bet! But meeting any such challenge requires big, broad thinking and planning. In that context Enterprise Architecture has always been important. But as IoT gains traction and speed, enterprise architecture should be top of mind for all concerned.

    Read the article

  • Emulator PCSX Reloaded - Fullscreen not working on Unity

    - by Leonardo Montenegro
    I have an older PS1 console with a couple of games I bought some years ago. On my PC, I'm using PCSX Reloaded, the best PS1 emulator for Linux I've found so far. But I'm having a little issue on Ubuntu 12.04 Precise. I'm using Unity 3D and trying to run some of my original PS1 games on PCSX Reloaded. Everything works nicely, except for fullscreen. I toggle fullscreen and specify the maximum resolution for my monitor, but in fullscreen mode both the left and top Unity bars don't get hidden. I tried changing between other graphics modes like Gnome Classic and Gnome Classic without effects. On both, PCSX shows the bars in fullscreen mode, so it isn't a Unity-specific issue but an emulator problem. It's a bit annoying to play games this way, so basically I'm running games in windowed mode for now. I'm using the default OpenGL graphics plugin in this emulator. I tried changing to the X11 graphics plugin and fullscreen worked, but graphics with the X11 plugin aren't as good as with the OpenGL one. Does anyone know a way to get fullscreen working on PCSX using the OpenGL plugin? Or maybe another graphics plugin with OpenGL support?

    Read the article

  • How to balance a non-symmetric "extension" based game?

    - by Klaim
    Most strategy games have fixed units and possible behaviours. However, think of a game like Magic: The Gathering: each card is a set of rules. Regularly, new sets of card types are created. I remember that the first editions of the game were said to be prohibited in official tournaments because the cards were often too powerful. Later extensions of the game provided more subtle effects/rules on cards, and they managed to balance the game apparently effectively, even though there are thousands of different possible cards. I'm working on a strategy game that is a bit in the same position: all units are provided by extensions, and the game is meant to be extended for some years, at least. The variety of unit effects is very large, even with some basic design limitations set to be sure it's manageable. Each player chooses a set of units to play with (defining their global strategy) before playing (like choosing a themed deck of Magic cards). As it's a strategy game (you can think of Magic as a strategy game too, from some points of view), it's essentially skirmish-based, so the game has to be fair even if the players don't choose the same units before starting to play. So, how do you proceed to balance this type of non-symmetric (strategy) game when you know it will always be extended? For the moment I'm trying to apply these rules, but I'm not sure they're right because I don't have enough design experience to know: each unit should provide one unique effect; each unit should have an opposite unit whose effect cancels it out; some limitations based on the gameplay; get a lot of beta tests before each extension release. Does it look like I'm in the most complex case?

    Read the article

  • Virtual PC - Display Issue with Ubuntu Server

    - by Christopher R
    Hi Everyone, I just did a clean install of Ubuntu Server 9.04 in Virtual PC on the Windows 7 RC, and it seems to be having a bit of an issue with the virtual machine's display adapter. I've tried setting a VGA flag in the GRUB configuration to no avail. This is a guess, but I think it has something to do with the color console mode that gets enabled by default at boot time. The system starts booting just fine (i.e. the console looks "normal" when I'm asked to enter an LVM passphrase, etc.), but then the display goes wonky after a few seconds and I end up with this. Typing commands in bash works just fine: it's not like the system is frozen or anything, I just can't see anything that I type. The console looks exactly the way it does in the image below.
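
    For reference, on a 9.04-era system the VGA flag referred to here is a framebuffer mode appended to the kernel line in GRUB legacy's menu.lst; a sketch of what that looks like (the kernel version, root device and UUID are placeholders, and vga=791 means 1024x768 at 16-bit colour):

      title  Ubuntu Server 9.04
      root   (hd0,0)
      kernel /vmlinuz-2.6.28-11-server root=UUID=PLACEHOLDER ro quiet vga=791
      initrd /initrd.img-2.6.28-11-server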

    Read the article

  • Demystifying "chunked level of detail"

    - by Caius Eugene
    Just recently I've been trying to make sense of implementing a chunked level-of-detail system in Unity. I'm going to be generating four mesh planes, each with a height map, but I guess that isn't too important at the moment. I have a lot of questions after reading up on this technique; I hope this isn't too much to ask all in one go, but I would be extremely grateful for someone to help me make sense of it.
    1: I can't understand at which point in the chunked LOD pipeline the mesh gets split into chunks. Is this during the initial mesh generation, or is there a separate algorithm which does this?
    2: I understand that a quadtree data structure is used to store the chunked LOD data. I think I'm missing the point a bit, but is the quadtree storing vertex and triangle data for each subdivision level?
    3a: How is the camera distance usually calculated? When reading up on quadtrees, axis-aligned bounding boxes are mentioned a lot. In this case would each chunk have a collision bounding box to detect that the camera or player is nearby? Or is there a better way of doing this (a raycast maybe)?
    3b: Do the chunks calculate the camera distance themselves?
    4: Does each chunk have the same "resolution"? For example, at the top level the mesh will be 32x32; will each subdivided node also be 32x32? Example below:
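
    By way of illustration, a minimal quadtree-node sketch of the kind of structure these questions circle around (Python rather than Unity C#, with made-up names and a simple distance-based split rule, so a concept sketch rather than a working LOD system):

      import math
      from dataclasses import dataclass, field

      @dataclass
      class Chunk:
          # axis-aligned bounds of this chunk in world space (2D for simplicity)
          min_x: float
          min_z: float
          max_x: float
          max_z: float
          depth: int = 0
          children: list = field(default_factory=list)

          def center(self):
              return ((self.min_x + self.max_x) / 2, (self.min_z + self.max_z) / 2)

          def size(self):
              return self.max_x - self.min_x

          def split(self):
              # four equal children; giving each child the same mesh resolution
              # (e.g. 32x32) over a quarter of the area yields higher detail there
              cx, cz = self.center()
              d = self.depth + 1
              self.children = [
                  Chunk(self.min_x, self.min_z, cx, cz, d),
                  Chunk(cx, self.min_z, self.max_x, cz, d),
                  Chunk(self.min_x, cz, cx, self.max_z, d),
                  Chunk(cx, cz, self.max_x, self.max_z, d),
              ]

          def select(self, cam_x, cam_z, max_depth, visible):
              # common chunked-LOD rule of thumb: subdivide while the chunk is
              # "large" relative to its distance from the camera
              cx, cz = self.center()
              dist = math.hypot(cam_x - cx, cam_z - cz)
              if self.depth < max_depth and dist < self.size() * 1.5:
                  if not self.children:
                      self.split()
                  for child in self.children:
                      child.select(cam_x, cam_z, max_depth, visible)
              else:
                  visible.append(self)

      # usage: pick the set of chunks to render for a camera near one corner
      root = Chunk(0, 0, 1024, 1024)
      to_render = []
      root.select(10, 10, max_depth=5, visible=to_render)
      print(len(to_render), "chunks selected")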

    Read the article

  • Windows 2008 Server in Amazon EC2 stops responding when SSTP/VPN connection is closed

    - by user38349
    All, I have a single Windows 2008 server running in Amazon's EC2 cloud. It's running a web application that works fine and is accessible to the outside world. I need 3-5 developers to be able to work on a database on the server, and was intending to accomplish this by setting up SSTP/RRAS on the server and letting them VPN in. This has been a bit of an ordeal due to the number of server roles and the messing with certificates that has been needed, but my VPN connection works now (all clients will be Windows 7). My problem is that when I use my VPN connection (from the client side), the server hangs, although not at any consistent place: sometimes it's when I close the connection, sometimes when I'm making the connection. The only way that I've found to get it back is to reboot it from the Amazon management console. Thanks for any guidance. Duncan

    Read the article

  • Could not apply the stored configuration for the monitor

    - by dellphi
    I'm using an Nvidia 7300 GT and an Acer V173w monitor, on 64-bit Ubuntu 10.04. Compiz and Emerald work well, but on entering the GUI I always receive the message: "Could not apply the stored configuration for the monitor: could not find a suitable configuration of screens." Why do I keep receiving it, and what is wrong with the monitor configuration or the PCI-e card being used?

    root@dellph1-desktop:/# xrandr
    Screen 0: minimum 320 x 240, current 1440 x 900, maximum 1440 x 900
    default connected 1440x900+0+0 0mm x 0mm
       1440x900   50.0*
       1024x768   51.0  58.0  59.0
       1360x768   52.0  53.0
       1152x864   54.0  55.0  56.0  57.0
       960x600    60.0
       960x540    61.0
       896x672    62.0
       840x525    63.0  64.0  65.0  66.0
       832x624    67.0
       800x600    68.0  69.0  70.0  71.0  72.0  73.0
       800x512    74.0
       720x450    75.0
       680x384    76.0  77.0
       640x512    78.0  79.0
       640x480    80.0  81.0  82.0  83.0
       576x432    84.0  85.0  86.0  87.0
       512x384    88.0  89.0  90.0
       416x312    91.0
       400x300    92.0  93.0  94.0  95.0
       320x240    96.0  97.0  98.0
    root@dellph1-desktop:/#

    === xorg.conf

    # nvidia-settings: X configuration file generated by nvidia-settings
    # nvidia-settings: version 260.19.29 ([email protected]) Wed Dec 8 12:27:27 PST 2010

    Section "ServerLayout"
        Identifier     "Layout0"
        Screen      0  "Screen0" 0 0
        InputDevice    "Keyboard0" "CoreKeyboard"
        InputDevice    "Mouse0" "CorePointer"
        Option         "Xinerama" "0"
    EndSection

    Section "Files"
    EndSection

    Section "InputDevice"
        # generated from default
        Identifier     "Mouse0"
        Driver         "mouse"
        Option         "Protocol" "auto"
        Option         "Device" "/dev/psaux"
        Option         "Emulate3Buttons" "no"
        Option         "ZAxisMapping" "4 5"
    EndSection

    Section "InputDevice"
        # generated from default
        Identifier     "Keyboard0"
        Driver         "kbd"
    EndSection

    Section "Monitor"
        # HorizSync source: xconfig, VertRefresh source: xconfig
        Identifier     "Monitor0"
        VendorName     "Unknown"
        ModelName      "Acer V173W"
        HorizSync       30.0 - 83.0
        VertRefresh     55.0 - 75.0
        Option         "DPMS"
    EndSection

    Section "Device"
        Identifier     "Device0"
        Driver         "nvidia"
        VendorName     "NVIDIA Corporation"
        BoardName      "GeForce 7300 GT"
    EndSection

    Section "Screen"
        # Removed Option "metamodes" " 1440x900_60 +0+0; 1280x1024 +0+0"
        # Removed Option "metamodes" "1440x900 +0+0"
        Identifier     "Screen0"
        Device         "Device0"
        Monitor        "Monitor0"
        DefaultDepth    24
        Option         "TwinView" "0"
        Option         "TwinViewXineramaInfoOrder" "CRT-0"
        Option         "metamodes" "1440x900_75 +0+0; 1440x900 +0+0"
        SubSection     "Display"
            Depth       24
        EndSubSection
    EndSection

    Read the article
