Search Results

Search found 18096 results on 724 pages for 'let'.


  • Where can I legally obtain the 64bit version of Windows 8?

    - by Harsha K
    No, I am not looking to pirate. I bought a key through the Upgrade Assistant (for just $15 due to the upgrade offer), but it downloaded an ISO file that was between 2.3 and 2.5 GB, which doesn't make sense to me, because the evaluation version of Windows 8 x64 is closer to 3.4 GB in size. I assumed the Upgrade Assistant would be intelligent enough to realize that it is being run on a Windows 7 x64 machine and, by extension, download the x64 build. Previously, I was able to legally download the ISOs (sans the keys, of course) from the Digital River host, but I no longer see an option to do that. I'm not interested in risking a tampered ISO download; I want to do it through Microsoft channels, but I just don't see how. As you may imagine, search terms such as "Windows 8 official download link" result in a plethora of obviously spyware-infested piracy sites. If there's any non-exposing way for me to prove that I have legally purchased Windows and am genuinely looking for this answer, please let me know. For reference, what I am looking for is similar to the answer given in this question for Windows 7: Where do I download Windows 7 (legally from Microsoft)?

    Read the article

  • Why is my HP TouchSmart tx2 1025dx shutting down?

    - by Kristy
    I'm having an issue with my HP tx2 1025dx laptop. Sometimes it will spontaneously shut itself off: first the screen will go out, then the computer itself will shut down. This usually occurs when I am doing something that requires a lot of resources, but lately it can happen when I am doing something as simple as using an Office app. I think it may be heat related: if I try to boot the laptop after it shuts itself down, it will shut itself down again before startup is complete. If I let it cool down for a while and then come back to it, the computer will boot properly. These shutdowns are always accompanied by high temperatures on the bottom right side, as well as the right side of the keyboard. I am using a laptop cooler with two fans, and it is not making a bit of difference. This is the second TouchSmart we have owned; the first one had the same issues, and we sent it in to HP after it finally refused to reboot. They “fixed” it, and it worked fine for about a month before doing the exact same thing. Unfortunately we had bought the second one before the issues began with the original; if I had known this would be a problem, I would not have purchased another TouchSmart. We have had this particular laptop for more than a year, and it didn't start having these issues until a couple of months ago. Any suggestions as to how to fix this problem?

    Read the article

  • VMware Server 64-bit on Ubuntu 9.10 64-bit with P2V Windows 2003 SBS: poor network speed

    - by RobertHC
    The configuration is Ubuntu (kernel 2.6.31-21) 64-bit with VMware Server 2.0.2 64-bit (the latest release); the hardware is a Core 2 Quad with 8 GB RAM; the guest is Windows 2003 Server SBS 32-bit. Dear friends, we have a Windows SBS 2003 converted from physical to virtual with the latest converter available today (http://www.vmware.com/products/converter/ vCenter Converter). Running the P2V 2K3 SBS on VMware Server, it boots fine, but we note abnormal CPU activity and poor LAN speed. We have tried the following: we removed all unneeded peripherals, removed one NIC (the physical server had two NICs), changed the .vmx to get the NIC recognized as Intel instead of AMD, removed one CPU (the physical server had two CPUs), and removed anything reported as a failed driver in the system events monitor. Nothing helped, and the results are odd. Here are some test results; all were made with the same file copied from different source folders. Copying from the client side (both directions, to/from the server) takes about 10 seconds. Copying the same files from the server side (again from and to the server) gives different results: from client to server the speed is roughly the same (a bit over 10 seconds), but the server-to-client direction is slower, double the time. Launching a simultaneous copy "from server to client" + "from client to server", made from the server side, results in stuck traffic: 45 seconds to do the copy. VMware Tools are installed and the e1000 driver has been updated. With one processor, CPU activity still goes up and down, but much less than with two. As a test, we installed Windows 2008 Standard 64-bit and repeated all the above tests with exactly the same file; the result is always 5 seconds (which matches the LAN speed). Any ideas about this issue are welcome; thank you. Kind regards, R.
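
    For reference, the usual .vmx lines to force the Intel e1000 virtual NIC look roughly like this (a sketch only; ethernet0 is assumed to be the adapter in question):

        ethernet0.present = "TRUE"
        ethernet0.virtualDev = "e1000"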

    Read the article

  • What's wrong with closing applications on Windows Mobile?

    - by balpha
    As far as I can tell, this annoys the crap out of people who do notice and (at best) gives no real benefit to people who don't notice: Why did Microsoft decide to make the "X" on Windows Mobile (or CE before that) not close, but only hide the application, and thus keep cluttering up your memory? WM wants you to go to Control Panel - Memory and go through a "Do you really want to" dialog to shut down the app. Pretty much every WM application I've seen that did not come from Microsoft has a "Quit" menu choice. The number of task managers out there that let you quit programs is larger than the count of emails from African bank managers that want me to take care of some millions of bucks that belonged to a deceased customer of theirs. My new HTC even comes with a close-able (not closeable, though) task manager pre-installed. But still today, Word Mobile just wants to hide, not be closed. I don't want to get a "That's M$hit, get used to it" answer; I really want to know: What in the world is the reason for this decision, and even more, for still sticking with it?

    Read the article

  • SMTP Server setting on Windows 2008 R2

    - by user223298
    I am very, very new to this and just trying to configure an SMTP virtual server. I have followed a few threads to get it all running, but the mails are not being delivered. What I have done so far: 1) Installed the SMTP server. 2) SMTP server properties: on the General tab, the IP address is set to 'All Unassigned'; on the Access tab, authentication is anonymous access, with everything else left at the default settings; on the Delivery tab, outbound security is anonymous access, and in the Advanced section I entered the domain name in the FQDN field and localhost in the Smart host field. 3) Created an inbound rule for the SMTP service to allow connections to port 25. When I try to telnet, everything works up until the point the mail has to be sent. Now, the sender's domain is different from the receiver's domain. Not sure if settings have to be changed to allow that? I had set the relay restrictions on the SMTP server, but because I couldn't send the mails, I thought I might as well make it work without the relay first. The error I see while sending the mail is 451 Timeout waiting for client input. I used to get some other error before, when I had relay restrictions on. Can anyone please point me in the right direction? Please let me know if you need more information. Thanks.
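
    For anyone reproducing the telnet test, a full manual SMTP session looks roughly like this (a sketch; hostnames and addresses are placeholders), which helps pin down exactly which step the 451 timeout follows:

        telnet mail.example.com 25
        EHLO client.example.com
        MAIL FROM:<sender@otherdomain.tld>
        RCPT TO:<user@example.com>
        DATA
        Subject: test message

        This is a test.
        .
        QUIT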

    Read the article

  • rsync --remove-source-files but only those that match a pattern

    - by user28146
    Is this possible with rsync? Transfer everything from src:path/to/dir to dest:/path/to/other/dir and delete some of the source files in src:path/to/dir that match a pattern (or size limit) but keep all other files. I couldn't find a way to limit --remove-source-files with a regexp or size limit. Update1 (clarification): I'd like all files in src:path/to/dir to be copied to dest:/path/to/other/dir. Once this is done, I'd like to have some files (those that match a regexp or size limit) in src:path/to/dir deleted but don't want to have anything deleted in dest:/path/to/other/dir. Update2 (more clarification): Unfortunately, I can't simply rsync everything and then manually delete the files matching my regexp from src:. The files to be deleted are continuously created. So let's say there are N files of the type I'd like to delete after the transfer in src: when rsync starts. By the time rsync finishes there will be N+M such files there. If I now delete them manually, I'll lose the M files that were created while rsync was running. Hence I'd like to have a solution that guarantees that the only files deleted from src: are those known to be successfully copied over to dest:. I could fetch a file list from dest: after the rsync is complete, and compare that list of files with what I have in src:, and then do the removal manually. But I was wondering if rsync can do this by itself.
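
    One way to get this behaviour, sketched under the assumption that the pattern can be expressed as rsync include/exclude rules (the '*.log' pattern and paths below are placeholders in the question's own notation), is to run two passes: the first transfers only the matching files and removes each one once it has been transferred, the second transfers everything else and leaves the sources alone:

        # pass 1: only files matching the pattern are copied and then removed from src
        rsync -av --remove-source-files \
            --include='*/' --include='*.log' --exclude='*' \
            src:path/to/dir/ /path/to/other/dir/

        # pass 2: everything else is copied, nothing is removed
        rsync -av src:path/to/dir/ /path/to/other/dir/

    Because --remove-source-files only deletes files that the run has actually transferred, matching files created while the copy is in progress are simply left in place for the next run.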

    Read the article

  • ZFS Data Loss Scenarios

    - by Obtuse
    I'm looking toward building a largish ZFS pool (150 TB+), and I'd like to hear people's experiences with data loss scenarios due to failed hardware; in particular, distinguishing between instances where just some data is lost vs. the whole filesystem (or if there even is such a distinction in ZFS). For example: let's say a vdev is lost due to a failure like an external drive enclosure losing power, or a controller card failing. From what I've read the pool should go into a faulted mode, but if the vdev is returned, should the pool recover or not? Or if the vdev is partially damaged, does one lose the whole pool, some files, etc.? What happens if a ZIL device fails? Or just one of several ZILs? Truly any and all anecdotes or hypothetical scenarios backed by deep technical knowledge are appreciated! Thanks! Update: We're doing this on the cheap since we are a small business (9 people or so), but we generate a fair amount of imaging data. The data is mostly smallish files, by my count about 500k files per TB. The data is important but not uber-critical. We are planning to use the ZFS pool to mirror a 48 TB "live" data array (in use for 3 years or so), and use the rest of the storage for 'archived' data. The pool will be shared using NFS. The rack is supposedly on a building backup generator line, and we have two APC UPSes capable of powering the rack at full load for 5 minutes or so.
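
    As an illustration of the ZIL part of the question, one common way to take a single log-device failure out of the picture is to mirror the separate log vdev when the pool is created; a sketch, with all device names as placeholders:

        # data in a raidz2 vdev, intent log on a mirrored pair of SSDs
        zpool create tank \
            raidz2 disk0 disk1 disk2 disk3 disk4 disk5 \
            log mirror ssd0 ssd1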

    Read the article

  • Vim: How to join multiple lines based on a pattern?

    - by ryz
    I want to join multiple lines in a file based on a pattern that both lines share. This is my example: {101}{}{Apples} {102}{}{Eggs} {103}{}{Beans} {104}... ... {1101}{}{This is a fruit.} {1102}{}{These things are oval.} {1103}{}{You have to roast them.} {1104}... ... I want to join the lines {101}{}{Apples} and {1101}{}{This is a fruit.} to one line {101}{}{Apples}{1101}{}{This is a fruit.} for further processing. Same goes for the other lines. As you can see, both lines share the number 101, but I have no idea how to pull this off. Any ideas? /EDIT: I found a "workaround": First, delete all preceding "{1" characters from group two in VISUAL BLOCK mode with C-V (or similar shortcut), then sort all lines by number with :%sort n, then join every second line with :let @q = "Jj" followed by 500@q. This works, but leaves me with {101}{}{Apples} 101}{}{This is a fruit.}. I would then need to add the missing characters "{1" in each line, which is not quite what I want. Any help appreciated.
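
    One possible approach, sketched on the assumption that the first group uses 3-digit keys {NNN} and the second group uses the same keys prefixed with 1, i.e. {1NNN} (untested against anything beyond the sample above): for each 3-digit line, locate its partner with search(), append it with setline(), then delete the now-redundant partner lines:

        " append the matching {1NNN} line to each {NNN} line
        :g/^{\d\{3}}/ call setline('.', getline('.') . getline(search('^{1' . matchstr(getline('.'), '\d\{3}') . '}', 'nW')))
        " remove the original {1NNN} lines afterwards
        :g/^{1\d\{3}}/d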

    Read the article

  • Browser: Randomly Opens Pages in New Windows Without Reason?

    - by Mark R
    This is a very strange thing I've noticed on my computer, and on past computers, over this past year or more. I know when pages are meant to open in a new window and know which settings to use on my browser for this. But in both my browsers, Chrome and Firefox, I have this really strange issue. I'm starting to think it's a hacker. When I right click links and select Open Link in New Tab, sometimes they will randomly open in a new window, and today it is happening a lot. Like lots and lots. I'm getting really creeped out by it, and YES, I understand when a link opens in a new window and when it isn't supposed to. And none of these links are meant to open in a new window. Let me give you an example: I searched Google about my issue and clicked on the result that I thought looked good. It opened in a new tab, but halfway through loading in the new tab it opened into a new window. Recently, I decided to record my screen, and while recording my screen using Camtasia, the issue stops. When I stop recording it starts again, as if it's a human-operated issue. VERY strange. This has been going on for months on my old and new computers and in both Firefox and Chrome. Is there something I can do to fix this intermittent problem?

    Read the article

  • Multi-petabyte scale out storage solution [closed]

    - by Alex Yuriev
    Let's say that I have a need for a single-namespace, multi-petabyte scale-out object store with a file-system-like wrapper. What is currently out there that supports the following: A single namespace that can take 1B files. Support for multiple entry points using NFS. At least node-level replication (preferably node- and file-level replication). Online software upgrades. No "magic sauce" on the storage layer. The following has been evaluated: Gluster & Lustre - just ick - fundamental lack of understanding of why online upgrades are mandatory. OneFS - we have it. It is smelling more and more like it hides a dead body under the hood. Other than MapR and ZFS, am I missing anything? P.S. Oh yes, I keep forgetting that the forums are for people to discuss whether a 2TB drive actually stores 2TB of info. My bad. Seriously though - how the heck can "meets the following requirements" be considered a "debate"? P.P.S. I did not throw an idiotic insult - I pointed out that this is actually an interesting question compared to a conversation about the storage capacity of a 2TB hard drive. It is not a question of what works better - it is a question that asks whether I missed any of the products that currently exist which fit the criteria, where the criteria are clearly outlined. I got one answer below which included something that I have not looked at in a long time, and which looks quite a bit more grown up compared to when I briefly looked at it before.

    Read the article

  • allow public access to subfolder of protected folder on apache

    - by UnnamedMook
    I have password-protected the root folder of my website while I do maintenance, but I want to display a custom 401 error page to let people know the site is under construction. Unfortunately, my web host doesn't allow me write access to anything outside the root folder of my website, so this custom error page must be stored in the root folder or one of its subfolders. Instead of my custom error page I get the Apache default error page, and it also says "Additionally, a 401 Authorization Required error was encountered while trying to use an ErrorDocument to handle the request." I searched for ways to make a subfolder of a protected directory public, and all I could find was to use the "Satisfy any" directive, but this doesn't work for me. It doesn't work on a file-only basis either, as with the .htaccess file below. #Authorization Restriction AuthType Basic AuthName "Access to root" AuthUserFile ********************************* Require user *********** Order Allow,Deny Satisfy any #Error Documents ErrorDocument 401 Error-401.html #Allow access to error documents <Files Error-*,html> Order Deny,Allow Allow from all Satisfy any </Files> I can only use .htaccess files; I don't have access to httpd.conf
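
    A sketch of how the same .htaccess might look with the two details that most often trip this up corrected: the ErrorDocument given as a local URL starting with "/" (otherwise Apache treats it as a literal text message), and the <Files> pattern written with a dot rather than a comma. The file names, user name and AuthUserFile path are placeholders:

        # Authorization restriction
        AuthType Basic
        AuthName "Access to root"
        AuthUserFile /full/path/to/.htpasswd
        Require user someuser

        # Error documents: must be a local URL path, not a bare file name
        ErrorDocument 401 /Error-401.html

        # Allow the error page itself to be fetched without credentials
        <Files "Error-401.html">
            Order Deny,Allow
            Allow from all
            Satisfy Any
        </Files>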

    Read the article

  • Softfail / Failure Notice on SMTP

    - by pascal1954
    I've been searching for an answer for about 24 hours now and I still can't find any really useful help... The problem appears as follows: I'm running a Debian server with Plesk 9.5.3 and qmail. For a few weeks I haven't been able to send mail to some particular servers (like web.de, aol.com); I get failure notices like "Sorry, I wasn't able to establish an SMTP connection." But when I try to send mail to gmail.com, it works! Gmail only reports a softfail in the mail header, like so: Received: from h1600XXX.?none? (DOMAIN2.TLD [XX.XXX.XX.XX]) Received-SPF: softfail (google.com: best guess record for domain of transitioning [email protected] does not designate 85.XXX.XX.XX as permitted sender) client-ip=85.XXX.XX.XX This sounds like a DNS problem to me, but I can't find an answer for it... What makes me wonder is: h1600XXX is correct, but it should look like h1600XXX.stratoserver.net, not ?none?; and DOMAIN2.TLD (first line) is different from DOMAIN1 (second line). Both are hosted on this machine, but is this correct? DOMAIN1 is the one I send this mail from. Hopefully someone can help me! If you need more specific information, let me know. Thanks in advance!!! Best regards
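
    For what it's worth, a "best guess record" softfail usually means the sending domain publishes no SPF record at all, so the receiver guesses one that does not cover the server's IP. A sketch of a TXT record that would authorize the server, with the domain and the address masked just as in the question, looks like:

        DOMAIN1.TLD.    IN  TXT  "v=spf1 a mx ip4:85.XXX.XX.XX ~all"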

    Read the article

  • Is it safe to format this partition?

    - by xanesis4
    On an Ubuntu server I own, I am running out of space. When I ran sudo parted /dev/sda -l to find all available drives, I got this: Model: ATA ST31000528AS (scsi) Disk /dev/sda: 1000GB Sector size (logical/physical): 512B/512B Partition Table: msdos Number Start End Size Type File system Flags 1 1049kB 256MB 255MB primary ext2 boot 2 257MB 1000GB 1000GB extended 5 257MB 1000GB 1000GB logical lvm Model: Linux device-mapper (linear) (dm) Disk /dev/mapper/server--vg-swap_1: 2135MB Sector size (logical/physical): 512B/512B Partition Table: loop Number Start End Size File system Flags 1 0.00B 2135MB 2135MB linux-swap(v1) Model: Linux device-mapper (linear) (dm) Disk /dev/mapper/server--vg-root: 998GB Sector size (logical/physical): 512B/512B Partition Table: loop Number Start End Size File system Flags 1 0.00B 998GB 998GB ext4 I understand /dev/mapper/server--vg-root is the filesystem, and /dev/sda1 has some stuff related to GRUB. But what about /dev/sda2 and /dev/sda5? When I tried to mount /dev/sda2, it said that I needed to specify the file system, which according to the table is nonexistent. So, is it safe to format this with, say, ext4 and mount it? Also, when I tried to mount /dev/sda5, it gave me this error: mount: unknown filesystem type 'LVM2_member' I assume it is NOT safe to reformat this. If I'm wrong, then that would be great, because I could save some space. Please let me know either way. Thanks in advance! UPDATE: Here is the result of mount: /dev/mapper/server--vg-root on / type ext4 (rw,errors=remount-ro) proc on /proc type proc (rw,noexec,nosuid,nodev) sysfs on /sys type sysfs (rw,noexec,nosuid,nodev) none on /sys/fs/fuse/connections type fusectl (rw) none on /sys/kernel/debug type debugfs (rw) none on /sys/kernel/security type securityfs (rw) udev on /dev type devtmpfs (rw,mode=0755) devpts on /dev/pts type devpts (rw,noexec,nosuid,gid=5,mode=0620) tmpfs on /run type tmpfs (rw,noexec,nosuid,size=10%,mode=0755) none on /run/lock type tmpfs (rw,noexec,nosuid,nodev,size=5242880) none on /run/shm type tmpfs (rw,nosuid,nodev) /dev/sda1 on /boot type ext2 (rw,acl) /dev/sda1 on /media/hd2 type ext2 (rw)
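
    A quick, non-destructive way to confirm what those partitions hold is to list the LVM layers; a sketch:

        sudo pvs    # physical volumes - should show /dev/sda5 as the PV behind server-vg
        sudo vgs    # the volume group and any free space in it
        sudo lvs    # logical volumes - root and swap_1 live here

    Based on the parted output above, /dev/sda5 is the LVM physical volume that contains server--vg-root (the mounted root filesystem) and the swap LV, so reformatting it would destroy the running system, and /dev/sda2 is just the extended-partition container around sda5, so there is nothing separate to format there either.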

    Read the article

  • Downloading Emails locally with Thunderbird

    - by r_honey
    I have been using Gmail (web interface) for some time now, and have well over 20 labels and a few thousand mails there, archived under different labels. Now I want to have a local copy of all my mails, and the following points are important in this context: The primary mail access mechanism will continue to be the Gmail web interface for me; I just want a local backup of my mail account. Ideally the mails should download locally into folders named after the Gmail labels (I know this is possible via IMAP but probably not with POP). After all my mails are available locally, I will delete most of them in Gmail to free up space and because I want to archive them. The mails should continue to exist locally and should not be deleted when I delete them from the Gmail web interface. I would be syncing my Gmail account locally, let's say, every month, so the new mails that I have sent/received during this period should come over to my local mailbox in the folders named after the Gmail labels. I do understand that Gmail maintains a single copy of an email that has 2 different labels, and such an email would get duplicated locally in the 2 folders; I am okay with that. Essentially, you can see I just want to archive my mails from the Gmail server to a local backup and then sync (one way, from Gmail to local) new mails at regular intervals. For some points above POP seems to be the option, while IMAP seems better for others. I am really confused and need help deciding which of POP or IMAP would suit me best. I have currently chosen Thunderbird to be my local email client but would have no problem switching to Outlook or anything else as long as I get my desired archiving functionality.

    Read the article

  • Best way to say "sync all system clocks to this server, when and ONLY when I say so?" Mixed setup of Windows+Linux servers.

    - by twblamer
    Title pretty much explains it. Let's say there's 100 servers, various versions of Windows and Linux, and one Windows server is the "master clock." I did look at this question: How do I synchronize clocks between Linux and Windows? This hints that ntp can do what I want if I run "ntpd -q" on a client (?). If I install ntp I also need to guarantee that it will only sync the times when I force it to. Even better if I have a log that tells me every time a sync was performed. I'm doing benchmark runs and I need to be able to say something like this: "Clocks were synced on all the benchmark systems at 09:42:01am on the master. A benchmark run was then initiated and allowed to run for six hours. None of the system clocks were altered during this time interval." I understand there is subsequent clock drift, but for now that's the way we're doing things and I'm doing it with a manual process. I'd rather at least automate the one-time sync.
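
    A sketch of what a forced, one-shot sync could look like on each platform, assuming the master is reachable as master.example.com (a placeholder) and that no ntpd/W32Time service is left running in continuous-sync mode on the clients:

        # Linux clients: one-shot sync, then record when it happened
        sudo ntpdate master.example.com && date >> /var/log/manual-clock-sync.log

        # Windows clients (elevated prompt): point W32Time at the master, then force a sync
        w32tm /config /manualpeerlist:master.example.com /syncfromflags:manual /update
        w32tm /resync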

    Read the article

  • Monitor programs accessing my keyboard?

    - by Anti Earth
    As of a few days ago, my computer has been behaving 'erratically'. When I am typing, my pointer will randomly move to another place in the text and start typing a semi-random string of characters ("gvyfn" is common; it has typed this about 8 times whilst I composed all the text above). It often highlights part of or all the text and overwrites it. It sometimes goes into loops of pressing Control-Alt-Delete, bringing up the Windows 7 menu. It sometimes even messes with mouse clicks; they have unexpected results, like requesting admin privileges from applications instead of switching to their window. I believe this is because it is holding an alt/function key down. This behaviour happens periodically, in waves. It might subside for an hour, then continue to haunt me. I believe it to be a virus or malicious program. My anti-virus (Symantec) and multiple MS rootkit removers could not find anything suspicious. I've noticed that sometimes it re-maps keys and types gibberish when I press certain keys (though no pattern is evident). I believe a malicious program has installed a keyhook on my computer. I'm wondering: Is there a way to let me view which programs are emulating keystrokes? Is there a way to view what keyboard hooks are installed? (I'm also at liberty to try any other techniques to remove this blasted thing. It is easily the most frustrating computer problem I've encountered). Thanks!

    Read the article

  • How can I change the name of the Program Files (x86) folder?

    - by Madrauk
    Let me stop you before the "you shouldn't do that" and "you will corrupt your file paths". I know what I'm doing, but I'll give you the story to convince you. Basically, my hard drive is failing to the point where programs installed on it are not responding consistently. So, in preparation for its replacement, I'm moving as many files as possible to my secondary drive, because the new drive will be smaller (it's an emergency; I can't buy a fancy expensive new drive) and so I can actually use my computer until the new drive comes in. The basic idea is that I've copied the contents of the Program Files (x86) folder to another spot on my other drive, and I want to replace the original folder on the C drive with a symbolic link that points to the other drive, so the programs can run from that drive and be fine, but it will save space on my C drive and simplify the moving process when the new drive comes in. To do that, I need to rename the Program Files folder, make the symbolic link, hopefully delete the Program Files folder, then restart my computer so all the programs are running properly and I can confirm it works. Now that I've told you why, can anyone help me out?
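
    If it helps, the link itself would be created roughly like this from an elevated Command Prompt (a sketch only; the D:\ target path is a placeholder, and in practice renaming or removing the original folder tends to be blocked by running programs and TrustedInstaller permissions):

        rem rename the original out of the way, then create a directory symlink in its place
        ren "C:\Program Files (x86)" "Program Files (x86).old"
        mklink /D "C:\Program Files (x86)" "D:\Program Files (x86)"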

    Read the article

  • How many bootable partitions are possible to have on one hard drive?

    - by draiden
    This may not be the correct place to post this; if that's the case, just let me know and point me in the right direction, please! I'm thinking of building a box that needs to be lightweight and portable, and would need to be able to boot multiple installations of Windows. I need multiple installations so that I can, for example, plug the box into the network at one location, boot into that location's partition, and have full access to everything I would normally need on a computer that has already been set up on that network. Then, when I go to the next client, I would be able to do the same thing with the new location's partition, and have all of those network settings, drive mappings, etc. available there. Obviously I'd need to go through and set them all up at the different locations/networks; I'm not expecting it to magically know where I am and what I'm doing. It would be like carrying around a computer that is configured for each place I need to go in one little box, instead of having to have multiple computers or having to reconfigure all the settings every time I go to another client. Or is there an easier way to do this that I haven't learned of?

    Read the article

  • SSL and regular VHost on the same server [duplicate]

    - by Pascal Boutin
    This question already has an answer here: How to stop HTTPS requests for non-ssl-enabled virtual hosts from going to the first ssl-enabled virtualhost (Apache-SNI) 1 answer I have a server running Apache 2.4 on which several virtual hosts run. The problem I noticed is that if I try to access, let's say, https://example.com, which has no SSL set up, Apache will automatically serve the first VHost that has SSL activated (which is literally not the same site). How can we prevent this strange behaviour; in other words, how do I tell Apache to ignore SSL for a given site? Here's a sample of what my .conf files look like: <VirtualHost foobar.com:80> DocumentRoot /somepath/foobar.com <Directory /somepath/foobar.com> Options -Indexes Require all granted DirectoryIndex index.php AllowOverride All </Directory> ServerName foobar.com ServerAlias www.foobar.com </VirtualHost> <VirtualHost test.example.com:443> DocumentRoot /somepath/ <Directory /somepath/> Options -Indexes Require all granted AllowOverride All </Directory> ServerName test.example.com SSLEngine on SSLCertificateFile [...] SSLCertificateKeyFile [...] SSLCertificateChainFile [...] </VirtualHost> With this, if I try to access https://foobar.com, Chrome will show me an SSL error mentioning that the server was identifying itself as test.example.org Thanks in advance!
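
    One common workaround, sketched here with placeholder names and paths, is to declare a catch-all default SSL vhost before the real ones, so HTTPS requests for names that have no SSL vhost of their own hit it and get an error instead of someone else's site:

        # loaded before the real :443 hosts, so it catches unmatched HTTPS requests
        <VirtualHost _default_:443>
            ServerName catchall.invalid
            SSLEngine on
            SSLCertificateFile    /path/to/any/cert.pem
            SSLCertificateKeyFile /path/to/any/key.pem
            # answer every request on this host with 404
            Redirect 404 /
        </VirtualHost>

    The client still gets a certificate-name warning (unavoidable without a matching certificate for the requested name), but it no longer lands on the content of an unrelated vhost.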

    Read the article

  • Mac OS X - Time Machine backup fails verification - What can I do to save the history?

    - by usermac75
    Hi, How do I make Time Machine to make a new complete backup without losing older versions of backed up files? Verbose: I am using the Time Machine backup on my OS X (Snow Leopard) to backup the whole computer to an external drive. I especially like the "history", i.e. the feature that allows you to restore the older version of a file. Problem: I had some data corruption on my external backup drive, I repaired it with the System Tool for doing that, it found some faults. I had the disk tool repair the external drive. After that, the external drive was OK and I could use Time Machine again. I let Time Machine do one more backup. Now I made a verification according to http://superuser.com/questions/47628/verifying-time-machine-backups, namely along sudo diff -qr . $HOME/Desktop 2>&1 | tee $HOME/timemachine-diff.log However: After doing the command above, several differences and missing files were reported, approx. 200 files in sum. Whereas some of the missing files were cache or excluded directories, the differences do bother me, especially as some important documents from me are listed as differing. How can I make sure that the data on the external drive is synced correctly? Is it possible to have Time Machine to do a complete new backup without losing the history? Or to have Time Machine compare all files for differences and re-write all files that are different? Or can I set some flags on the files that do not match to have them copied again? (like the archive-flag in Windows/Dos). I'd rather not touch the files because I would like to keep the date of last change/date of creation) Thank you for your thoughts!

    Read the article

  • How to route 1 VPN through another on OS X?

    - by Eeep
    Hi everyone. Thanks a lot for your help! I've been tinkering with this for a while and have read many posts, along with Googling for help, but my knowledge of TCP/IP is really weak. I have access to two different VPN servers: the first is set up in Network Settings and connects through PPP; the second is set up through Tunnelblick and uses OpenVPN. I can connect to either tunnel #1 or tunnel #2, but not both one after the other... One of my major to-dos this year is to study TCP/IP, but for now, would you be super-helpful and help me fix this really clearly? I have no experience with routing, DNS, gateways or any of that. If you tell me, "Set your gateway to XXX.XXX.XX.XXX", can you specify how I get that IP, and off of what interface, so I don't get messed up? I can figure out the terminal just fine if you let me know what to type, and I WILL read the man pages on everything you help me with. Thanks a million!
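
    A rough sketch of the usual chaining approach, with placeholder addresses and on the assumption that the PPP tunnel comes up as ppp0 and that the OpenVPN server should be reached through it: bring up VPN 1, pin a host route for VPN 2's server through that tunnel, then start the OpenVPN connection:

        # after the PPP VPN is connected, check which routes/interface it created
        netstat -rn | grep ppp0

        # force traffic to the OpenVPN server's public IP (placeholder) through the PPP tunnel
        sudo route add -host 198.51.100.10 -interface ppp0

        # now connect with Tunnelblick; its traffic to that server rides inside VPN 1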

    Read the article

  • How can two completely unrelated programs affect each other in such a strange manner?

    - by user40602
    I installed an old game on my old PC and it doesn't work; its process/exe file was listed in Task Manager, but nothing appeared on the screen. Then, some time later, I discovered that when a specific program was running on my PC, the game could be executed without that problem! Although I am a power user and also a programmer, I couldn't find the reason, and I don't have any good guesses about it. I just know that when I want to run that game, I should have another specific and unrelated program running. I'm asking if anyone has any idea or guess about the possible reasons for this rare phenomenon. Oh, and if you ask about the details/names of those programs, I am afraid of telling you, because others may think I am kidding, but I am not (please believe me!): the game is NFS2 and the other program is mysqld.exe (I said before that I am a programmer!). I don't know how mysqld.exe (yes, it is the Windows version of the famous MySQL DBMS server) can affect NFS2 in such a strange manner, and my curiosity and profession don't let me stop seeking the answer, so I decided to ask for the help of others to see if someone has had a similar experience or a reasonable idea about it.

    Read the article

  • SSH connect error

    - by DMDGeeker
    I use a notebook with Ubuntu 12.10 and try to connect to a server with Ubuntu 12.04. The server already has openssh-server installed, and it allows both public key and password login. I can connect to the server fine for a while, but after some minutes it errors out. First, it shows me these messages:    WARNING: REMOTE HOST IDENTIFICATION HAS CHANGED!    ......    Add correct host key in /home/myname/.ssh/known_hosts to get rid of this message. But I never reinstalled the system or openssh-server; neither has changed, and the server has never been shut down or rebooted. Second, after I remove the relevant key from my known_hosts and use ssh to connect to the server again, it lets me type my password; then my nightmare comes... Permission denied (publickey,password). But I typed the correct password! PS: Both password and public key login used to succeed, but the problem appears again after I log out and then log in.
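
    For the record, the clean way to drop the stale entry and then watch what the server actually rejects is roughly the following (the hostname is a placeholder):

        # remove the old host key for this server from known_hosts
        ssh-keygen -R server.example.com

        # reconnect with verbose output to see which auth methods are offered and why they fail
        ssh -v myname@server.example.com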

    Read the article

  • How to get Apache to look for files in different subfolders?

    - by prb
    I am definitely new to mod_rewrite. Note: the URL is common to all users, and all the folders and subfolders are on the same host. The URL a user uses to access their page is http://myurl.com/1234/filename.jpg Here the name of the subfolder is an integer that is unique and generated dynamically by another application. The subfolder stores images specific to an individual user. The folder structure is as follows: main1 = document root; main2 is another folder within main1 (the document root). /main1/1234/filename.jpg /main1/5678/filename.jpg /main1/2345/filename.jpg /main1/1212/filename.jpg /main1/main2/2367/filename.jpg /main1/main2/8790/filename.jpg /main1/main2/9966/filename.jpg So I want to write a rewrite rule so that if a user types in http://myurl.com/1234/filename.jpg, the rule will look up where the file is and serve the request; for the request http://myurl.com/1234/filename.jpg the actual page is located at /main1/1234/filename.jpg, and the rule needs to serve that page from that folder. If another user makes a request such as http://myurl.com/9966/filename.jpg, it should serve the page from the following destination: /main1/main2/9966/filename.jpg Please let me know if the question is still not clear. This is what I have done so far, and it does not work at all: RewriteCond {DOCUMENT_ROOT}/%{REQUEST_FILENAME} -f RewriteRule ^(.*)$ {DOCUMENT_ROOT}/$1 [L] RewriteCond {DOCUMENT_ROOT}/main2/%{REQUEST_FILENAME} -f RewriteRule ^(.*)$ {DOCUMENT_ROOT}/main2/$1 [L] Any help is really appreciated.
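
    A sketch of rules that would implement the fallback, assuming the .htaccess sits in main1 (the document root) and that requests should be tried there first and then under main2/ (untested against this exact layout):

        RewriteEngine On

        # if the file does not exist directly under the document root...
        RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-f
        # ...but does exist under main2/, rewrite the request there
        RewriteCond %{DOCUMENT_ROOT}/main2%{REQUEST_URI} -f
        RewriteRule ^(.*)$ /main2/$1 [L]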

    Read the article

  • Slow Routing Over LAN (Wired)

    - by reverendj1
    I'm having issues with my router (an Adtran NetVanta 3458) acting very slow. We have two networks; let's call them A and B. When I run netperf between two servers on network A (no routing involved) I get speeds along the lines of 900 Mbps, which makes sense, since we have all 1 Gbps switches. When testing A to B (or vice versa) I get speeds along the lines of 22 Mbps. I have also tested by connecting my laptop to the switchports on the router and testing against two servers on network A (no routing), and got speeds around 90 Mbps, which makes sense since the switchports on the router are 100 Mbps. Does anyone have any idea why routing would be so slow? We bought the router over a year ago, and we think it has been doing this since then, but we never actually tested it before. (Network B isn't really used much, so we didn't notice.) We were implementing a site-to-site VPN and noticed it was ridiculously slow, so we started testing basic routing performance. I have ruled out cabling and router CPU/memory utilization. Adtran looked at my config but didn't see anything wrong with it.

    Read the article
