Search Results

Search found 7396 results on 296 pages for 'delayed job'.

Page 169/296 | < Previous Page | 165 166 167 168 169 170 171 172 173 174 175 176  | Next Page >

  • Windows Software to Save Arbitrary Application State

    - by ashes999
    VM software does a great job of saving state when you "turn it off," allowing instant and immediate return to that previous state. Is there some application for Windows that allows me to do the same thing for any arbitrary software? It would let me save/restore state, possibly via a shell command or a button that it appends to every window. Edit: For clarity, there are two types of apps: those that save their own state, and those that save others' state. Those that save their own state are like Chrome, which on load reopens the windows you had open last time. That's not what I'm asking about; I'm asking for an app that can save the state of other apps, rather like VM software does, but for any app. (A trivial test would be to load Notepad++, type a bunch of stuff, and save state; on restoring that state, you should be able to multi-level undo a lot of what you wrote, as if you had never shut down the application.)

    Read the article

  • Get-Mailbox not returning all mailboxes

    - by rotard
    I am trying to set up an Exchange mailbox backup job with Vembu StoreGrid, and StoreGrid is unable to list the mailboxes for the client. While troubleshooting the issue, I noticed something else: running the Get-Mailbox command on the mail server as the backup user only shows the mailbox for that account, while running Get-Mailbox as my admin account returns a list of what appears to be all the mailboxes. My service account is a member of "Administrators", "Domain Admins", and "Domain Users". What additional permissions might be required to list all mailboxes in the system?
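
    A minimal sketch of the comparison described above, plus the kind of RBAC role assignment that is commonly needed in Exchange 2010 for a non-admin account to enumerate recipients. The account name and the choice of role here are illustrative assumptions, not taken from the question:

        # Run as the backup service account; with no recipient-scope role this
        # typically returns only its own mailbox.
        Get-Mailbox -ResultSize Unlimited

        # One common fix (assumption): grant the service account a read-only
        # recipient role so it can enumerate all mailboxes.
        New-ManagementRoleAssignment -Role "View-Only Recipients" -User "DOMAIN\backupsvc"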

    Read the article

  • Bacula configuration for clients that are turned on and off randomly

    - by Rastloser
    I'm evaluating Bacula as a centralized backup tool for a small network where users will turn machines on and off unpredictably. Some of the headless Linux boxes I need to back up are intended to be turned off by pressing the on/off-button on the case, without any way of telling the user to wait for a backup job to finish. So, we don't know when backup jobs may run (anacron might help with this, right?) and we don't know whether they'll be allowed to finish. Is Bacula a reasonable choice for such an environment?
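
    For the anacron idea mentioned above, a rough sketch of what a client-side trigger could look like. The job name, schedule, and bconsole path are assumptions for illustration; Bacula jobs are normally scheduled by the director, so this is only one possible arrangement:

        # /etc/anacrontab entry (format: period  delay  job-id  command)
        # Assumption: a job named "WorkstationBackup" is defined on the director.
        1   15   workstation-backup   echo "run job=WorkstationBackup yes" | /usr/sbin/bconsole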

    Read the article

  • How to verify if my copy operation is complete in Windows 7?

    - by Tim
    Yesterday I left a job copying a directory to run overnight. This morning, however, I found that the computer had restarted because of Windows Update or something. I was wondering if there is some way to check whether the copy is complete. One way, I guess, would be to check the last modified time of the copy against the time the system restarted. But where do I find the time when the system restarted? I was also wondering where to find log files that have those records. I know about Event Viewer, but don't know where to look within it. Other methods are welcome too. I would also like to hear suggestions for other ways to accomplish the copy instead of just simple copy and paste. Thanks and regards!
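
    Two quick checks along those lines, plus a resumable alternative to a plain copy; paths are placeholders and this is only a sketch:

        # Event ID 6005 in the System log is written when the Event Log service
        # starts, i.e. at boot, so the newest entries show recent restart times.
        Get-WinEvent -FilterHashtable @{LogName='System'; Id=6005} -MaxEvents 3

        # A resumable, logged copy that can simply be re-run after an interruption.
        robocopy C:\source D:\dest /E /Z /LOG:C:\copy.log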

    Read the article

  • Restart an in-use NFS server without interruption (within timeout)

    - by zebediah49
    I have a bunch of compute clients working on jobs, saving output data to a NAS machine. All machines are CentOS 6.2. They mount the NAS via automount NFS, with a timeout of 1200 (the default config). The NAS machine needs to be restarted. If I can restart the machine within that 1200 s (20 minute) window, will the clients just block on I/O until it comes back up? A minor interruption (pause) in service is OK, as long as it doesn't cause the running processes to error out. If necessary I could loop through and SIGSTOP all job processes, restart, and resume them; I just don't want to break the open file handles. How can I run a restart like this without killing processes with open files?
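
    A rough sketch of the pause/resume idea mentioned above; the user name is an assumption, and this is only a belt-and-braces option rather than something the restart necessarily requires:

        # Pause every process owned by the compute user before restarting the NAS...
        pkill -STOP -u computeuser

        # ...then, once the NFS server is back, resume the jobs.
        pkill -CONT -u computeuser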

    Read the article

  • Rewrite the Base URL with mod_rewrite

    - by rotespferd
    My domain example.com points to the directory public_html. In the directory public_html/php is my index.php file. Now I want the URL example.com to point to public_html/php/index.php. I must do this with mod_rewrite because I have no access to httpd.conf to do it with Alias or DocumentRoot. In the directory public_html is my .htaccess file with the following content:

        RewriteEngine on
        RewriteCond %{HTTP_HOST} example.com$
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /php/index.php [L,QSA]

    This does half of the job: when I enter something like example.com/s in my browser, it points to public_html/php/index.php as I want it to. But when I just enter example.com, it points to public_html. What can I do to fix this?
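
    One hedged guess at why the bare URL is not rewritten: / is an existing directory, so the !-d condition stops the rule for it. A sketch of a variant that treats the root explicitly (an assumption, not a verified fix):

        RewriteEngine on
        # Send the bare root straight to the front controller; it is a directory,
        # so the !-d test below would otherwise skip it.
        RewriteCond %{HTTP_HOST} example.com$
        RewriteRule ^$ /php/index.php [L,QSA]
        RewriteCond %{HTTP_HOST} example.com$
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /php/index.php [L,QSA]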

    Read the article

  • ionice idle is ignored

    - by Ferran Basora
    I have been testing the ionice command for a while, and the idle (3) class seems to be ignored in most cases. My test is to run both commands at the same time:

        du <big folder>
        ionice -c 3 du <another big folder>

    If I check both processes in iotop, I see no difference in the percentage of I/O utilization for each process. For reference on the CFQ scheduler: I'm using a 3.5.0 Linux kernel. I started doing this test because I'm experiencing system lag each time the daily updatedb.mlocate cron job is executed on my Ubuntu 12.10 machine. If you check the /etc/cron.daily/mlocate file, you see that the command is executed as:

        /usr/bin/ionice -c3 /usr/bin/updatedb.mlocate

    Also, the funny thing is that whenever my system for some reason starts using swap, the updatedb.mlocate I/O is scheduled ahead of the kswapd0 process, and then my system gets stuck. Any suggestions? References:

        http://ubuntuforums.org/showthread.php?t=1243951&page=2
        https://bugs.launchpad.net/ubuntu/+source/findutils/+bug/332790
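
    One thing worth checking in this situation: ionice classes are only honoured by the CFQ scheduler, so a quick look at the active scheduler for the disk can rule that out (sda is a placeholder device name):

        # The scheduler shown in square brackets is the active one;
        # ionice -c3 only has an effect when it is cfq.
        cat /sys/block/sda/queue/scheduler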

    Read the article

  • Stream tar.gz file from FTP server

    - by linker
    Here is the situation: I have a tar.gz file on an FTP server which can contain an arbitrary number of files. Now what I'm trying to accomplish is to have this file streamed and uploaded to HDFS through a Hadoop job. The fact that it's Hadoop is not important; in the end what I need to do is write a shell script that would take this file from FTP with wget and write the output to a stream. The reason why I really need to use streams is that there will be a large number of these files, and each file will be huge. It's fairly easy to do if I have a plain gzipped file and I'm doing something like this:

        wget -O - "ftp://${user}:${pass}@${host}/$file" | zcat

    But I'm not even sure if this is possible for a tar.gz file, especially since there are multiple files in the archive. I'm a bit confused about what direction to take for this; any help would be greatly appreciated.
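
    A sketch of one way the gzip example could be extended to a tar.gz, assuming the job is happy with the members' contents concatenated together and pushed into HDFS (the HDFS path and that concatenation assumption are mine, not the poster's):

        # -O extracts every member of the archive to standard output, so the
        # concatenated contents can be streamed into HDFS without touching local disk.
        wget -O - "ftp://${user}:${pass}@${host}/$file" \
          | tar -xzO \
          | hadoop fs -put - /user/hadoop/input/streamed_data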

    Read the article

  • Good TN3270 for MacOS? (Mac OS X Snow Leopard)

    - by Bbrado
    I need a good TN3270 emulator for the Mac that supports file transfers to TSO/ISPF (IND$FILE) and 132-character-wide screens: at minimum 132x27 (3278 model 5), or better yet a user-defined size like 132x43. So far I've tried TN3270 (no file transfer) and x3270. Besides being an X11 app, x3270 unfortunately does not handle oversize screens correctly, and Mocha only has a TN5250. So, what's the TN3270 of choice on the Mac (or, at least, how do I get x3270 to handle 132xSomething in ISPF correctly, e.g. the SDSF job class list)?

    Read the article

  • Converting massive images to PDF, without crashing applications

    - by BloodyIron
    I'm trying to work with a large-format scanner, and we are scanning very long documents. For example, we cut one of our documents into two pieces, and one of those pieces is 3633x82486 pixels. My application, Scanning Master 21+, which comes with the device (Graphtec CSX300-09), can output PDF; however, when I try to save to PDF it complains that the file is too large. I can successfully output to BMP, however, and GIMP can even open this BMP after taking a while to load it. The resulting files range from 200 MB to 1.2 GB in size. Acrobat refuses to open the BMP, saying it isn't supported or is damaged (which I know is not true), and the PDF plugin for GIMP crashes when I try to export to PDF. I'm really not sure what is the best tool for this job. So what is the best tool to produce PDF documents from very large images?

    Read the article

  • How to setup a daily report of the top e-mail senders in Exchange 2010

    - by Belmin
    We have had issues with compromised Exchange accounts sending a large amount of unsolicited e-mail. We have mitigated this by using a cloud e-mail gateway that does a better job of detecting these outgoing messages, so as not to hurt our e-mail reputation. However, we would still like to detect any abnormal e-mail activity. One idea is a report of the Exchange accounts with the most outgoing messages. Any idea on how to do this? Or a similar stat that may be indicative of an account being compromised?
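
    A sketch of the kind of report described, built from the Exchange 2010 message tracking log; the one-day window and the top-20 cut-off are arbitrary choices, not requirements:

        # Count per-sender submissions over the last day and show the heaviest senders.
        Get-MessageTrackingLog -EventId RECEIVE -Start (Get-Date).AddDays(-1) -ResultSize Unlimited |
            Group-Object -Property Sender |
            Sort-Object -Property Count -Descending |
            Select-Object -First 20 Name, Count

    Run on (or against) a Hub Transport server, a nightly scheduled task around something like this could mail the result out as the daily report.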

    Read the article

  • In Excel, how to group data by date, and then do operations on the data?

    - by Bicou
    Hi, I have Excel 2003. My data is like this:

        01/10/2010  0.99
        02/10/2010  1.49
        02/10/2010  0.99
        02/10/2010  0.99
        02/10/2010  0.99
        03/10/2010  1.49
        03/10/2010  1.49
        03/10/2010  0.99
        etc.

    In fact it is a list of sales every day. I want to have something like this:

        01/10/2010  0.99
        02/10/2010  4.46
        03/10/2010  3.97

    I want to group by date, and sum the column B. I'd like to see the evolution of the sales over time, and display a nice graph about that. I have managed to create pivot tables that almost do the job: they list the number of 0.99 and 1.49 each day, but I can't find a way to simply sum everything and group by date. Thanks for reading.
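
    For what it's worth, a sketch of the SUMIF approach that produces exactly this kind of per-date total, assuming the dates are in column A, the amounts in column B, and a list of unique dates starting in D2 (the cell layout is an assumption for illustration):

        =SUMIF($A$2:$A$1000, D2, $B$2:$B$1000)

    Filled down in column E next to each unique date, this gives the daily totals that a line chart can then be built on.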

    Read the article

  • All HTTPS, or is it OK to accept HTTP and redirect (secure vs. user friendly)

    - by tharrison
    Our site currently redirects requests sent to http://example.com to https://example.com -- everything beyond this is served over SSL. For now, the redirect is done with an Apache rewrite rule. Our site is dealing with money, however, so security is pretty important. Does allowing HTTP in this way pose any greater security risk than just not opening or listening on port 80? Ideally, it's a little more user-friendly to redirect. (I am aware that SSL is only one of a large set of security considerations, so please make the generous assumption that we have done at least a "very good" job of covering various security bases.)
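
    For reference, a minimal sketch of how such a redirect rule is typically written in Apache; this is an assumption about the setup, not the poster's actual rule:

        RewriteEngine On
        # Redirect anything that arrived over plain HTTP to the HTTPS equivalent.
        RewriteCond %{HTTPS} off
        RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]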

    Read the article

  • Forensics on Virtual Private servers [closed]

    - by intiha
    These days, with all the talk of hacked machines being used for malware distribution and botnet C&C, the one issue that is not clear to me is: what do law enforcement agencies do once they have identified a server as a source or controller of an attack/APT, and that server is a VPS in my cluster/datacenter? Do they take away the entire machine? That option seems to have a lot of collateral damage associated with it, so I am not sure what happens and what the best practices are for system admins to help law enforcement with its job while keeping our jobs!

    Read the article

  • PHP / Drupal equivalent of .bat file [closed]

    - by Pamela
    I am new to Drupal and have just started with Drupal 7. I have a very simple .bat file that calls a .txt file to open an FTP connection, get a file off the FTP server, and place it on my desktop. Now that I know that works (YAY!), I need to figure out how to have this done by a cron job in Drupal, save the file somewhere so that I can unzip it, and then populate a table in the database with it. Any advice would be greatly appreciated!
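
    A rough sketch of how the .bat file's FTP step could look inside a tiny custom module's hook_cron() in Drupal 7; the module name, host, credentials, and file names are all placeholder assumptions:

        <?php
        /**
         * Implements hook_cron() for a hypothetical module named "mymodule".
         * Fetches a file from an FTP server into the local temporary directory.
         */
        function mymodule_cron() {
          $conn = ftp_connect('ftp.example.com');
          if ($conn && ftp_login($conn, 'ftpuser', 'ftppass')) {
            $local = file_directory_temp() . '/data.zip';
            if (ftp_get($conn, $local, '/outgoing/data.zip', FTP_BINARY)) {
              // Unzipping (e.g. with ZipArchive) and loading rows into a table
              // (e.g. with db_insert()) would follow here.
              watchdog('mymodule', 'Fetched @file from FTP.', array('@file' => $local));
            }
            ftp_close($conn);
          }
        }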

    Read the article

  • Can not start Apache 2.2.22 in Fedora 15

    - by Roderik
    I am trying to start Apache 2.2.22 under Fedora 15 on my local machine. After fixing some errors related to missing modules, httpd -t just gives me 'Syntax OK'. However, when I try to start Apache as the root user with:

        service httpd start

    it still returns:

        Starting httpd (via systemctl):  Job failed. See system logs and 'systemctl status' for details.  [FAILED]

    When running systemctl I don't see any extra information other than:

        httpd.service  loaded  failed  failed  LSB: start and stop Apache HTTP Server

    So I wonder where to look now to get this back up and running.
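
    The usual places to look next on a systemd-based Fedora, sketched below; the log locations assume the stock httpd package:

        # More detail than the one-line summary:
        systemctl status httpd.service

        # Apache's own error log and the general system log usually name the
        # module or config line that made the start-up fail.
        tail -n 50 /var/log/httpd/error_log
        tail -n 50 /var/log/messages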

    Read the article

  • How to select the page range to print in Windows 8 Modern UI apps?

    - by Magnetic_dud
    Today I wanted to print an e-mail from the Mail app (Modern UI). I selected Devices from the charms bar and chose my printer. The problem is that it was a very long e-mail (40 unthreaded replies), and I only needed the first page. It looks like there is no way to select a page range in the simple printing dialog; am I right? I solved the problem by putting just one sheet in the printer and then deleting the job, but this is not a real fix... (I could print to a PDF printer, then open it in Acrobat and print again, but...)

    Read the article

  • Setting CPU cores off-limits to all threads not specified (preferably in Windows 7)

    - by Shinrai
    I have a really specific machine configuration in the works that would really be helped out if there were any way to do this...basically what I'm looking for is the opposite of setting CPU Affinity for a process. I want to be able to tell Windows "No applications except [x] are allowed on [these cores]." Is there any mechanism whatsoever for doing this? (Yes, I am aware of some of the potential issues this could cause and I normally would never fool with processor affinities, since the OS usually does a damned good job itself, but this is a pretty odd situation involving some software that is very CPU-bound constantly having to wait on interrupts and DPCs and things from other threads.)
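
    For reference, a sketch of the per-process affinity mechanism the question contrasts itself with; the process name and masks are placeholder assumptions, and the mask is a bit field where bit N stands for logical CPU N:

        # Pin an already-running process to CPUs 0 and 1 (mask 0x3).
        Get-Process -Name someapp | ForEach-Object { $_.ProcessorAffinity = 0x3 }

        # Start a new process restricted to CPUs 2 and 3 (hex mask C).
        cmd /c start /affinity C someapp.exe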

    Read the article

  • Making Windows 7 default to "All Files" when opening a document?

    - by krebshack
    I'm taking over an old coworker's job, and part of that means I'll be working on various IFS tickets without any real ERP experience. I'm working with one user who is trying to check in documents in IFS. Right now, IFS defaults to JPEGs and she then has to select "All Files" in order to find PDFs. Example: http://i.imgur.com/xK0iAfF.png As much as I'd like to say "it's one extra step, come on", the user's manager has insisted it interrupts her workflow and asked us to get on it. I've spoken with our IFS experts and they're unaware of any setting that would make the open dialog default to All Files in IFS. I've searched Google for any setting in Windows 7 that would do this, but those searches have been unsuccessful; I keep getting results about changing which program opens a specific file type.

    Read the article

  • Restore wubi install after changing master drive

    - by Johnny
    Recently I got a 160 GB drive to install other Unix distributions on and learn to use them for the sake of my current job. However, I have only 2 drive connectors free (on the primary channel), so I've decided to remove an 80 GB hard drive which holds the MBR and the main Windows bootloader. My problem in particular is that I've got another 250 GB drive with a Wubi installation of Ubuntu 10.04 (Lucid Lynx), which I want to keep bootable from whatever bootloader ends up handling all the other OSes (potentially GRUB). How would I do that, given that Wubi is, as far as I've learned so far, somewhat Windows-reliant?

    Read the article

  • sql server: losing identity column on export/import

    - by Y.G.J
    Recently I started dealing with SQL Server; my previous experience was with MS Access. When I do an import/export of a database, from the server to my computer or even within the server, all columns with a primary key lose the key, identity is set to false, and even bit columns are not set to their defaults. How can I use an import/export job to make an exact copy of the database and its data? I don't want to have to perform a backup and restore every time I want the same database somewhere else, for another project, etc. I have read about "edit mappings" and the checkbox, but that did not help with the identity specification... and what about the primary keys of the tables and the rest of the settings?
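
    For the "edit mappings" / enable-identity-insert route mentioned above, a sketch of what the wizard (or a hand-written script) effectively has to do to keep the existing key values; table and column names are placeholders:

        -- The target table must be created with the IDENTITY property and the
        -- primary key up front; a plain auto-created destination table loses them.
        CREATE TABLE dbo.Orders (
            OrderID   INT IDENTITY(1,1) NOT NULL CONSTRAINT PK_Orders PRIMARY KEY,
            OrderDate DATETIME NOT NULL
        );

        -- Existing values can then be copied while preserving the identity column.
        SET IDENTITY_INSERT dbo.Orders ON;
        INSERT INTO dbo.Orders (OrderID, OrderDate)
        SELECT OrderID, OrderDate FROM SourceDB.dbo.Orders;
        SET IDENTITY_INSERT dbo.Orders OFF;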

    Read the article

  • How to get Windows XP to play a sound on mouse clicks

    - by Re0sless
    Is it possible to play a sound each time the mouse button is pressed in Windows? We have some touch-screen PCs that run Windows XP, and we would like them to beep when pressed so the users know their click has registered (the touch screen acts as the left mouse button). I have looked in the Sounds and Accessibility options but cannot see anything that would work. If there is nothing in Windows that can do this, is there a third-party application that does the same job? We don't need anything lavish; a simple click or beep would suffice.

    Read the article

  • Only 5 simultaneous users with PPTP VPN?

    - by Plastkort
    We have a Windows 2008 R2 Standard server whose job is to accept incoming VPN connections via PPTP. It seems to work fine, but the moment the 6th user tries to connect, we get the following error:

        ERROR_VPN_DISCONNECT 807: The network connection between your computer and the VPN server was interrupted. This can be caused by a problem in the VPN transmission and is commonly the result of internet latency or simply that your VPN server has reached capacity.

    Where do we find out how to increase the number of simultaneous users? I read somewhere that we can have over 100 connected clients at the same time. When I created this server I did the following: Network & sharing - Network adapters - File - New incoming connections. I tried to choose and create users there, but we have a domain controller which seems to override this, so the domain users work. After this I was able to connect at least 5 clients... how and where do I increase the "capacity"?

    Read the article

  • CPU Configuration Issue for 2 Servers (Server 2008 R2)

    - by Bill Moreland
    I have 2 servers running the exact same Classic ASP code with Access DBs (yes, not ideal, but it is what it is for now):

        1) Xeon 5520 @ 2.27 GHz (6 GB memory)
        2) Xeon E5-2620 @ 2.00 GHz (2 processors, 32 GB memory)

    For most pages the newer E5-2620 processes the pages 10-15% faster. On pages requiring heavy and/or multiple complicated Access stored procedures (queries), the older 5520 does a much better job. I believe the servers are configured nearly identically. My question: is it possible that the newer, multi-processor server is not as good at handling Classic ASP as the older single-processor one? Is there a configuration difference that needs to be in place that I'm missing, since I'm shooting for identical implementations?

    Read the article

  • Everything You Ever Wanted to Know about Mod_Rewrite Rules but Were Afraid to Ask?

    - by Kyle Brandt
    How can I become an expert at writing mod_rewrite rules? What is the fundamental format and structure of mod_rewrite rules? What form/flavor of regular expressions do I need to have a solid grasp of? What are the most common mistakes/pitfalls when writing rewrite rules? What is a good method for testing and verifying mod_rewrite rules? Are there SEO or performance implications of mod_rewrite rules I should be aware of? Are there common situations where mod_rewrite might seem like the right tool for the job but isn't? What are some common examples?
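
    As a starting point for the "fundamental format" part of this, a minimal annotated example; the host and paths are made up:

        RewriteEngine On
        # A rule is: RewriteRule <PCRE pattern> <substitution> [flags]
        # Any RewriteCond lines immediately above it apply only to that one rule.
        RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        RewriteRule ^old/(.*)$ /new/$1 [R=301,L]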

    Read the article

< Previous Page | 165 166 167 168 169 170 171 172 173 174 175 176  | Next Page >