Search Results

Search found 4432 results on 178 pages for 'fail'.


  • Why is a process not being displayed by top?

    - by drN
    I am running a Mathematica script (this question probably doesn't fit on Mathematica.SE, however) and I know that it generally takes up a lot of RAM and loads up my cores. However, although pgrep MathKernel shows a PID, I find that top doesn't show it among the top processes, even though I can see it is taking up about 2.25 GB of the 8 GB available to me:

        pmap -x my_process_id
        total kB    2243132 1907404 1892108

    and

        ps aux | grep MathKernel
        dnaneet 20837 12.6 23.3 2234944 1907404 pts/1 Sl 09:23 8:01 /share/apps/mathematica/8.0.4/SystemFiles/Kernel/Binaries/Linux-x86-64/MathKernel -runfirst $TopDirectory="/share/apps/mathematica/8.0.4" -script ./dcm_10micrometer_2x -- ./dcm_10micrometer_2x

    ps aux shows that the process is taking about 12% CPU (the marked middle line):

        dnaneet 20601  0.0  0.0   68264    1660 pts/1 Ss 09:15 0:00 -bash
        dnaneet 20837 12.2 23.3 2234944 1907404 pts/1 Sl 09:23 8:01 /share/apps/mat
        dnaneet 21922  0.0  0.0   65604     948 pts/1 R+ 10:29 0:00 ps -aux

    Did this process fail, and is the MathKernel just lingering?
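
    If the goal is just to watch this one process, a minimal sketch (assuming procps top; the process name is from the post):

        # top sorts by CPU by default, so a mostly-sleeping process can sit far
        # down the list; -p pins top to specific PIDs instead.
        top -p "$(pgrep -d, MathKernel)"   # pgrep -d, joins multiple PIDs with commas
        # Inside top, press M to sort by resident memory instead of CPU.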

  • 32 vs 64-bit software for the same machine?

    - by GorillaSandwich
    What is the difference between 32 and 64-bit software? My understanding is that 64-bit software can use more RAM, if it's available, because it has a larger address space for it. Is this correct? And, specifically:

    1. If I have a 64-bit operating system with lots of RAM and I install, say, the 32-bit version of MySQL instead of the 64-bit version, will it be unable to use all the available RAM, and therefore run slower than the 64-bit version might on the same machine (assuming RAM becomes the bottleneck before processing speed or disk access speed or whatever)?

    2. If I have a 32-bit operating system and I install a 64-bit piece of software on it, will it (probably) fail to run?
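
    On the address-space point, a quick hedged check (Linux is assumed, and the mysqld binary may live elsewhere):

        # Report whether an installed binary is 32- or 64-bit:
        file "$(command -v mysqld)"
        # prints e.g. "... ELF 32-bit LSB executable ..." or "... ELF 64-bit ..."
        # A 32-bit process tops out at a 4 GB virtual address space (commonly
        # only 2-3 GB usable), no matter how much RAM the machine has.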

  • Apache requests failing

    - by Josh
    I'm trying to figure out why the client sometimes fails to load objects/requests from a dynamic page served from an Apache/MySQL/Debian machine. Let's say 13 objects are to be loaded for a total of 185.3 KB, with no external objects (no DNS lookups) and no other traffic at the same time; randomly, some of those objects do not load. However, if I refresh, sometimes all of them load, or some might fail again. I only have 1 Mbps up, and my DNS is hosted externally (EveryDNS). What could be the reason for this issue? Any comments will be appreciated.
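
    One way to take the browser out of the equation is to fetch the same object repeatedly and log the outcome; a sketch with a placeholder URL:

        # Request one static object 50 times, printing HTTP status and timing.
        # Non-200 codes or long times point at the server/uplink rather than
        # the page's rendering.
        for i in $(seq 1 50); do
          curl -s -o /dev/null -w "%{http_code} %{time_total}s\n" \
               http://example.com/static/object.js
        done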

  • Why doesn't SuperGenPass work on some sites when I use Chrome?

    - by Lunatik
    The SuperGenPass bookmarklet sometimes fails to pop up when I click the bookmark in Chrome. The same page does work in Firefox, however; an example is http://www.engadget.com/login This behaviour is also replicated on a new Chrome tab (understandably, since there is no domain), but some sites just fail to launch it, meaning you have to go to another site, open it up, enter [something] to get the 'Regenerate password' link, enter the domain manually, then finally enter your master password to get the generated password! Something about the makeup of the page seems to make SuperGenPass think that it isn't able/required to pop up. The FAQ doesn't mention this, and a quick Google doesn't turn up anything that looks relevant either. Does anyone else have the same issue? How can it be fixed? I'm on Windows using the current release of Chrome (5.x at the moment, but probably 18.x by the time you read this next week, based on Google's seemingly logarithmic release numbering).

  • How to virtualize an OEM Windows install

    - by jumentous
    I've bought a new computer, and like always it comes with Windows 7 pre-installed. I'm a Linux user by default, but I still keep a virtual Windows installation around. Is it possible to install my Linux distribution and use the OEM license that came with the computer to create the virtual instance? I have no intention of moving the license off the physical machine, so I'm sure I could argue that I'm not violating the license, but I don't expect that this would work and activate without great legal battles. So in the event that this doesn't work, what other options do I have? Can I shrink the physical partition and have QEMU boot it? My thought is that Windows would detect the change in hardware and fail. What can I do with this Windows install as a Linux user?
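
    For the "have QEMU boot the physical install" idea, a minimal sketch (the device name is an example, and Windows may indeed demand reactivation when it sees the changed virtual hardware):

        # Boot the physical disk that holds the OEM Windows partition directly:
        sudo qemu-system-x86_64 -enable-kvm -m 4096 \
             -drive file=/dev/sda,format=raw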

  • Hard drive problem (USB external)

    - by masfenix
    Hey guys, I am using a Maxtor One Touch 3 given to me by my uncle. It is connected through USB 2.0. When I plug it in, XP installs it and says "the device is ready to use", but it doesn't show up in My Computer. It doesn't even show up in Disk Management. I then installed Acronis Disk Director, which detects it but marks it as offline. If I try to change it to online, nothing happens (actually, the lights on the HD blink, meaning communication, and it goes back into offline mode). The lights on the HD then return to solid, which means "working properly". (Screenshot omitted.) Is there any way to extract the data off the drive? Is the drive corrupt? Still usable?

    Edit: DISKPART screenshot omitted.

    Edit 2: I ran Seagate's diagnostic tool and this came up:

        Long Generic - Started 9/10/2010 6:44:36 PM
        Bad LBA: 0 Unable to repair
        Long Generic - FAIL 9/10/2010 6:46:41 PM
        SeaTools Test Code: 4
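
    Given the failed SeaTools pass, if the data matters the safest next step is to image the raw drive before experimenting further; a sketch using GNU ddrescue from a Linux live environment (the device name is an example):

        # Copy everything readable into an image, logging progress so the run
        # can resume; -n skips the slow scraping of bad areas on the first pass.
        sudo ddrescue -f -n /dev/sdb rescued.img rescue.log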

  • Is there a way to make 7-Zip temporarily uncompress the whole archive when double-clicking on an exe?

    - by Gnoupi
    In WinRAR, one feature I like is that you can set it to uncompress the whole archive in a temporary place if you double-click an .exe file inside the archive opened in WinRAR. Typically, I often download small games which I just want to try, without the hassle of creating a folder for them, etc. The same goes for archives containing an installer with its own separate files. In the 7-Zip window, if I double-click an exe, it will just extract the exe to a temporary location and launch it. In the small-game (or installer) context, this means it will simply fail, because the required files from the same folder will be missing. So my question is: is there a way to make 7-Zip extract the whole archive to a temporary folder when launching an exe from inside the archive?
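
    As a stopgap, the same effect is easy to get from the command line; a sketch in p7zip syntax (the archive name is a placeholder; 7z.exe on Windows takes the same x and -o switches):

        dest="$(mktemp -d)"            # throwaway temporary folder
        7z x game_demo.zip -o"$dest"   # extract the whole archive, not just the exe
        ls "$dest"                     # launch the exe from here, companion files intact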

  • Common filesystem for servers behind a rackspace load balancer

    - by thanos panousis
    Our PHP application consists of a single web server that receives files from clients and performs a CPU-intensive analysis on them. Right now, analysis of a single user upload can take 3 seconds to conclude and takes 100% CPU. This limits our system's capacity to 1/3 of a request per second. My team's requirement is to increase capacity without a lot of code re-engineering. A possible solution would be to set up a load balancer in front of multiple servers running the same app, connecting to a common DB. The problem is that the analysis outputs files on disk. A load balancer would increase capacity, but then files won't be available between servers, so subsequent client requests may fail. We are hosted on Rackspace; is there a way to configure some sort of "common" storage for all servers, without having to rewrite our file-persistence code? The current code relies on simple fopens, etc. What are our options?
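
    One low-rework option, offered as a general pattern rather than a Rackspace-specific answer: mount shared storage at the same path on every web node, so the existing fopen() calls keep working unchanged. A sketch with example hostnames and paths:

        # On each web server, mount the shared export where the app writes:
        sudo mount -t nfs filer.internal:/srv/uploads /var/www/uploads
        # Persist across reboots:
        echo 'filer.internal:/srv/uploads /var/www/uploads nfs defaults 0 0' \
            | sudo tee -a /etc/fstab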

  • What is the easiest way to do a direct file transfer of an extremely large file over the Internet?

    - by Kenneth Cochran
    I would like to transfer a 20+ GB file to a friend. I would like the transfer to:

    1. Be fast
    2. Ensure data integrity
    3. Not require opening ports in either end's firewall
    4. Be free
    5. Not broadcast the file's existence to everyone on the Internet

    I've looked at several technologies and nothing seems to fit:

        Gnutella, BitTorrent, et al. satisfy 1, 2 and 4
        JetBytes ................... 1, 3, 4 and 5
        Yahoo Messenger, AIM, etc. ... 3, 4 and 5
        FTP, SFTP .................. 1?, 4 and 5
        rsync ...................... 1, 2, 4 and 5

    For a file this size, speed and data integrity are the most important. No one wants a 20 GB file to fail an MD5 check after spending two days downloading it. Is there anything that meets all these requirements?
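
    For reference, here is how close rsync over SSH gets; a sketch with a placeholder host and path:

        # Checksummed, encrypted, free, point-to-point, and resumable:
        rsync -avP bigfile.iso friend@friends-host.example:/incoming/
        # -P (--partial --progress) lets an interrupted transfer pick up where
        # it left off; the sticking point remains requirement 3, since the
        # receiving end must expose an SSH port.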

  • Dell replacement return

    - by terrani
    Hi, I am not sure if I can ask this question here; please let me know if it is not suitable. I just received my replacement monitor from Dell. While I was talking to the Dell tech, he told me that Dell would charge me for the replacement monitor if I fail to return my defective monitor within 15 days. I don't know why he told me that when I didn't ask. So here is my question: how are they going to charge me? Dell does not have my credit card or bank account info. Do they store my credit card or bank account information in their database?

  • Using the link command on Linux

    - by Xavier
    Here it goes. I have a folder, /data/backup, that does not have much space. I have been told that if I link that folder (/data/backup) to a bigger area, /bigdata/backup for example, I will still be able to run backups to /data/backup: because it is just a link, the data will be visible in both folders, but it will actually be stored in /bigdata/backup. Since /bigdata/backup has far more disk space, the backup should then no longer fail because of space problems in /data/backup. Is this true? Thanks, Xav
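
    What is being described is a symbolic link; a minimal sketch using the paths from the post (it assumes /data/backup can be moved aside first):

        # Move the existing folder onto the big filesystem, then leave a
        # symlink at the old path:
        mv /data/backup /bigdata/backup
        ln -s /bigdata/backup /data/backup
        # Anything written to /data/backup now lands on /bigdata's filesystem,
        # so it is /bigdata's free space that decides whether the backup fits.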

  • Lustre - is this bad form?

    - by ethrbunny
    I'm going to be consolidating several "server rooms" into a single installation soon. Part of this effort will be finding a home for 5 TB (and growing) of files/logs. To this end I'm looking at Lustre and appreciating its ability to scale. The big vendors want to sell me a $20K SAN to manage this, but I'm wondering about buying several iSCSI units (like this: http://www.asacomputers.com/3U-iSCSI-Solution.html) and using VMs for the OSS machines. This would let me fail over to cover problems and would not require a dedicated system for each OSS. Given articles like this one (http://h30565.www3.hp.com/t5/Feature-Articles/RAID-Is-Dead-Long-Live-RAID/ba-p/1422) that talk about how RAID is not keeping up with drive density, I'm leaning towards more disks with lower capacity each; again, something akin to the iSCSI array above. Tell me why this is a terrible idea. Do I really need to invest in a PE710 for each OSS/OST?

  • Private Git repo using Smart HTTP with LDAP authentication

    - by ALOToverflow
    I've been crawling the interwebz and getting my hands dirty for the last few days, but I can't seem to make it all work together. I managed to get an HTTP repo working on Ubuntu 10.04 over Smart HTTP (pull and push over HTTP) for a single repo. This means that I do the initial setup over SSH to the server (git init --bare), and after that the clients can pull from and push to it (git clone http://servername/allgitrepos/repo.git). Unfortunately it's impossible to add a new repo without SSHing to the server and creating it manually; i.e. git push http://servername/allgitrepos/repo2.git (allgitrepos is readable, writable and executable for everyone) fails with a complaint about git update-server-info (which seems to be a generic error message). So far the repository is anonymous, so I would like to authenticate using LDAP and also use the LDAP credentials to make the git commits. So, how can I push new repos to the server, and how can I use the LDAP credentials to make the git commit? Thanks
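
    On the first half: with plain git-http-backend the repository does have to exist server-side before the first push, but that step can at least be scripted; a sketch using the paths from the post (the on-disk location of allgitrepos is an assumption):

        # Create the bare repo over SSH, then use HTTP from then on:
        ssh servername 'git init --bare /var/www/allgitrepos/repo2.git'
        git clone http://servername/allgitrepos/repo2.git
        # (A dumb-HTTP setup would additionally need "git update-server-info",
        # typically run from the repo's post-update hook.)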

  • Bigger ProjectServer farm is performing worse

    - by MSPS DBA
    I am using Project Server 2007 SP3 with SharePoint 2007 SP3 and SQL Server 2008 R2. I have recently moved my farm from 2 servers (1 DB and 1 App/Web) to a very big farm with many servers: a clustered database, a load balancer, powerful processors and large amounts of RAM. This farm has more than one web server, plus Project application servers, SharePoint application servers and a separate index server. But the performance of Project Server in the new farm has degraded: views take even more time to load data, and project publishing time has also increased. I am also facing deadlock problems, which cause the Project Server queue jobs to fail. Could anyone tell me what the reason for this might be, and what the starting point for looking into the issue should be? Is it mainly because the application server now needs to communicate with other application servers, which was not needed in the previous farm? Thanks!

  • Transferring NS records to a new server

    - by lanemiller
    I feel like that was NOT worded well, but here is my current predicament. I recently had a GoDaddy dedicated server and decided, after their customer support failed to do anything but disappoint, to switch to Rackspace. We have 2 NS records that point to our GoDaddy server, and we have a few sites left on the server that rely on it for their DNS zones; the owners of those domains fail to respond to us. So, the question is: if I need to transfer the sites off of the old GoDaddy nameserver, can I point the A records for my ns1.domain.com and ns2.domain.com at the IP addresses of the Rackspace nameservers? Or do I CNAME my NS records to match the Rackspace ones? I DO know that neither method is advised, but I need to get these sites moved before GoDaddy tries charging another $2k for the server.

  • Vista to Vista network visibility issue

    - by Sk93
    Hi All, I've got a Vista Business PC and a Vista Business laptop connected via a Virgin Media router (Netgear CG2100D), and I cannot get the two machines to see each other correctly over the network. The laptop is connected via wireless, while the PC is wired. Both are set to receive their network settings automatically (DHCP), and both have the Windows Firewall (the only firewall on either) turned off completely. I can ping each machine from the other fine using the IP addresses, and I can also connect via \\<IP address>. However, connections via \\<computer name> fail, and I cannot see the machines in the network map. I have tried setting NetBIOS to "always on" on both adapters, but this makes no difference. I've been messing around for pretty much 6 hours now and am getting quite frustrated! (My original aim was to get media sharing working, but I've pretty much abandoned that for now.) Any ideas?

  • SAN/NAS with high availability?

    - by netvope
    I have two servers that I plan to use for storage. Each of them has a few SATA disks directly attached. I want the storage to be available even if one of the storage servers is down (preferably the clients wouldn't even notice the failover, although I'm not sure if this is possible). The clients may access the storage via NFS and Samba, but this is not a must; I could use something else if needed. I found this guide, Installing and Configuring Openfiler with DRBD and Heartbeat, which apparently does what I want. It relies on three components (Openfiler, DRBD, and Heartbeat), and all three of them need to be configured separately. I'm wondering: are there simpler solutions? Is using DRBD+Heartbeat the best practice for a situation like mine? I'm also interested to know if there are alternatives that don't depend on DRBD.
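
    For scale, the DRBD piece of such a setup reduces to one resource definition shared by both boxes; a minimal sketch of /etc/drbd.d/r0.res (hostnames, devices and IPs are examples):

        resource r0 {
            device    /dev/drbd0;    # replicated block device the filesystem sits on
            disk      /dev/sdb1;     # local backing partition on each node
            meta-disk internal;
            on store1 { address 10.0.0.1:7789; }
            on store2 { address 10.0.0.2:7789; }
        }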

  • Windows XP Activation failed, now what?

    - by user26379
    I have a computer with (presumably unlicensed) Windows XP. Activation (over the Internet) failed, and now I cannot get into Windows. After calling Microsoft, it seems I will have to reinstall everything and install a freshly bought operating system. What are my options here? Is it worth trying the millions of keys and key generators out there? EDIT: I have no way of contacting the manufacturer (it's a no-name box, definitely not a Dell, HP or IBM). Would there actually be a key supplied with my version of XP? If it's not genuine, wouldn't I just have some random key? And that key would fail activation anyway?

  • Email server for a huge number of subscribers

    - by bogha
    My company is thinking of providing a free email account to each of its customers. As a new company, we will assume that our corporate email system will be MS Exchange Server, which will support about 1000 employees. They are asking why not add the customer list as part of the Exchange users. My suggestion was to separate the two systems: for the corporate side we can use Exchange, but for customers (around 30000) we have to use a Linux-based system. My only argument was that Linux can be used for enterprise services like this, where Microsoft may fail. What do you suggest? And if you agree with me on choosing Linux as the server platform, what do you suggest as an alternative to Exchange on Linux? Thank you.

  • Problems bringing up a second virtual network interface

    - by tubaguy50035
    I'm having issues adding a second IP address to one interface. Below is my /etc/network/interfaces:

        # The loopback network interface
        auto lo
        iface lo inet loopback

        # eth0 is our main IP address
        auto eth0
        iface eth0 inet static
            address 198.58.103.*
            netmask 255.255.255.0
            gateway 198.58.103.1

        # eth0:0 is our private address
        auto eth0:0
        iface eth0:0 inet static
            address 192.168.129.134
            netmask 255.255.128.0

        # eth0:1 is for www.site.com
        auto eth0:1
        iface eth0:1 inet static
            address 198.58.104.*
            netmask 255.255.255.0
            gateway 198.58.104.1

    When I run /etc/init.d/networking restart, I get a fail error about bringing up eth0:1:

        RTNETLINK answers: File exists
        Failed to bring up eth0:1.

    Any reason this would be? I didn't have any problems when I first set up eth0 and eth0:0.
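
    One likely culprit, offered as an assumption: eth0:1 declares a second default gateway, and trying to add a duplicate default route is exactly the kind of thing that makes the kernel answer "RTNETLINK answers: File exists". A sketch of the alias stanza without it:

        auto eth0:1
        iface eth0:1 inet static
            address 198.58.104.*
            netmask 255.255.255.0
            # no "gateway" line: the default route already exists via eth0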

  • Highly Available Web Application (LAMP)

    - by Anthony Rizzo
    I work for a small company which provides a web application for thousands of users. Earlier this year they had one server hosted by one company. We recently acquired another server in a different location, with the hope of one day making this a redundant failover machine. I understand what to do with the MySQL replication (I plan on using a master-master replication setup, with rsync to sync the scripts and files); however, I am at a standstill about how to configure the failover. Ideally I would like the two machines to accept requests, like round-robin DNS; however, if one machine goes down, I do not want requests to go to that machine. All of the solutions I have come across assume high availability of servers in the same location; these servers are in two completely different locations with different public IP addresses. Any help would be great. Thanks

  • Load balancing two web servers on two different ISPs?

    - by Scott
    I have two ISPs that provide me hosting via Apache/PHP/MySQL. I am running Drupal on them. On occasion the MySQL server will go away (crash), so I was hoping to find a reasonable way to have a failover: if server A's SQL is down, all traffic is sent to server B. I know that traditionally this is handled in DNS, where a second, alternate IP is returned if there is a problem, or something similar. But I do not have control over the ISP, other than that I can run PHP, Perl and the usual Apache stuff. Also, I have static IPs on each ISP, and I can create DNS entries (A/CNAME/TXT). So, I was hoping there might be a way for me to have a script that checks if Drupal has a problem and, if so, somehow alters DNS, or...? Or any other ideas? (Other than spending lots more $ on a better ISP.)
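
    A sketch of the health-check idea; the actual DNS change is left as a stub because it depends entirely on the DNS provider's API (the URL and helper name are placeholders):

        #!/bin/sh
        # Run from cron on server B (or a third host): if server A stops
        # answering, repoint the site's A record at server B.
        if ! curl -fsS --max-time 10 http://server-a.example.com/ >/dev/null; then
            ./point_dns_at_server_b.sh   # hypothetical helper for the DNS API
        fi
        # This only helps if the record's TTL is short (say 60s), so resolvers
        # notice the change quickly.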

  • Best way to troubleshoot Apache not starting?

    - by lowgain
    We have recently gotten a backup server to mirror all our data onto in case the primary server goes down. I've gotten all the site data updated through rsync, and all the Apache config and databases updated. Both machines are on Ubuntu (9.04 on the primary, 9.10 on the backup). So everything seems synced up for the most part at this point (I still need to figure out user syncing), and I try to start Apache. I get:

        * Starting web server apache2    [fail]

    Nothing else indicates what the problem could be. I know I don't have enough info to expect a solution from you guys, so I'd just like to know where I can go from here to further investigate this issue. Would there be any error logs for this? Thanks!
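
    The usual first stops on Debian/Ubuntu, as a sketch:

        apache2ctl configtest                    # syntax-check the config first
        tail -n 50 /var/log/apache2/error.log    # startup failures usually land here
        sudo netstat -tlnp | grep ':80 '         # is something else already on port 80?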

  • How to clean up the tmp folder safely on Linux

    - by Syncopated
    I use RAM for my tmpfs /tmp (2 GB, to be exact). Normally this is enough, but sometimes processes create files in there and fail to clean up after themselves. This can happen if they crash. I need to delete these orphaned tmp files, or else future processes will run out of space on /tmp. How can I safely garbage-collect /tmp? Some people do it by checking the last-modification timestamp, but this approach is unsafe because there can be long-running processes that still need those files. A safer approach is to combine the last-modification-timestamp condition with the condition that no process has a file handle for the file. Is there a program/script/etc. that embodies this approach, or some other approach that is also safe? Incidentally, does Linux/Unix allow a mode of file opening with creation wherein the created file is deleted when the creating process terminates, even if it's from a crash?
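
    A sketch of exactly that two-condition sweep (the age threshold is an example; fuser succeeds only when some process still has the file open):

        # Delete files under /tmp untouched for 7+ days AND not open anywhere.
        find /tmp -xdev -type f -mtime +7 -print0 |
        while IFS= read -r -d '' f; do
            fuser -s "$f" 2>/dev/null || rm -f -- "$f"
        done

    As for the side question: the classic Unix idiom is to open a file and unlink it immediately; the name vanishes, but the inode survives until the last descriptor closes, so the space is reclaimed automatically even if the process crashes.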

  • Replace gvimext.dll in Windows 8

    - by Leftium
    How can I get the "Edit with ... using tabs" functionality in gVim on Windows 8 (64-bit)? I'd like to swap out gVim's stock gvimext.dll for one that adds an "Edit with ... using tabs" option to Explorer's right-click context menu. On Windows 7 (64-bit) I used to be able to download the DLL and swap it in by following these instructions. However, I can't get it to work in Windows 8:

        - The stock installation's context menu (sans "... using tabs") works fine (without a restart).
        - After replacing the DLL, the gVim context menu options disappear and gvimext.dll no longer even seems to load (Windows 8 was restarted).
        - If I then restore the backed-up stock DLL, the context menu options remain missing and the DLL still fails to load (Windows 8 was restarted, again).
        - If I re-install gVim, the context menu items return (even without a restart).

    What is the difference here between Windows 7 (where swapping DLLs works) and Windows 8 (where swapping DLLs fails)?
