Search Results

Search found 41882 results on 1676 pages for 'png files'.


  • Backup solution

    - by user66115
    We are currently looking for a new backup solution. Our current network is five remote locations, with a tape backup at each plant, and we are looking at an MPLS VPN so we can run backups out of our main plant. The main things we back up are user private folders and department files; each plant also has its own file server that houses CAD drawings. My plan is to keep everything but the CAD drawings at the main facility: we would start with a full backup of the drawing files and then send changed-files-only backups back to the main plant. Besides tape, what would be the best way to do these backups? Our contact at PC Connection is pointing us toward a Tandberg Data device.
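
    A minimal sketch of the changed-files-only replication step, assuming the plant file servers are Windows hosts (the UNC paths and log location are hypothetical):

        rem Mirror the plant CAD share to the main plant, e.g. as a nightly task.
        rem /MIR copies only new or changed files and mirrors deletions; /FFT
        rem tolerates 2-second timestamp granularity; /R and /W keep retries
        rem short over the VPN.
        robocopy \\plant1\cad \\mainplant\backup\plant1-cad /MIR /FFT /R:3 /W:10 /LOG:C:\logs\plant1-cad.log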


  • Using wbadmin to back up and recover

    - by g7rpo
    Hi, I am using wbadmin to perform backups of a specific folder, primarily to back up my VHD files. This works fine, but when I tried to recover the files today using a different machine from the one that created the backup, I couldn't get the recovering machine to 'see' the backups. Is there a way to do this? My worry is that if the host performing the backups fails, I need to be able to install Hyper-V on another host and recover the backed-up VMs there until I can rebuild the original host. It appears this isn't possible; I hope I am missing something. Any help would be greatly appreciated.
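
    A sketch of pointing the recovering machine at another host's backups, assuming the backups sit on a target reachable from both machines (the share and host name here are hypothetical):

        rem List the backup versions that host HYPERV1 wrote to the shared target.
        wbadmin get versions -backupTarget:\\nas\backups -machine:HYPERV1

        rem Recover files from one of the listed version identifiers.
        wbadmin start recovery -version:01/31/2014-09:00 -itemType:File -items:D:\VHDs -backupTarget:\\nas\backups -machine:HYPERV1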


  • I can't write to a folder whose group I'm a member of

    - by user3265472
    I'm trying to set up folder access for a group so that all members of that group can create/edit/delete files within the folder. First I create the group and add a member:

        sudo addgroup dev
        sudo adduser martyn dev

    Now, logged in as "martyn", I check that my user has been added to the "dev" group:

        $ groups martyn
        martyn : martyn dev

    Next I change the group ownership of my project folder so all members of that group can edit it and the files/folders within it:

        sudo chgrp -R dev myproject

    Just to check:

        martyn@localhost:/var/www$ ls -l
        total 4
        drwxrwxr-x 3 dev dev 4096 May 31 15:53 myproject

    Now here's where it fails. I want to create a file within myproject (logged in as "martyn", a member of "dev"):

        vi myproject/test

    ...but when I try to save the file I get the following error:

        "myproject/test" E212: Can't open file for writing

    Why, as user "martyn", a member of "dev", can I not write this file? Even if I create the file so it exists, change its group to "dev", then try to edit and save, I get the same error.
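
    One likely culprit, sketched below: group membership is evaluated at login, so a session that was already open when "martyn" was added to "dev" does not carry the new group ("groups martyn" queries the account database, not the running session). Commands to check and work around this, assuming the setup above:

        # What the current session actually sees -- "dev" may be missing
        # until you log out and back in:
        id -nG

        # Start a shell whose group list includes dev, without re-logging:
        newgrp dev

        # Independently, make sure the tree stays group-writable and new
        # files inherit the dev group (setgid bit on the directory):
        sudo chmod -R g+w myproject
        sudo chmod g+s myproject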


  • How to automatically print the contents of a folder in OS X?

    - by MDRoz
    I would like to set up a folder on my Mac where I can dump files and have them automatically sent to the default printer. That way I could print files at home when I'm not physically there, using something like Dropbox. It wouldn't have to be real time; a scheduled job that checks the folder every so often would be acceptable. What's the easiest approach? Automator? AppleScript? A cron job?
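
    A minimal cron-based sketch, assuming a hypothetical drop folder at ~/Dropbox/PrintQueue and the standard lpr command (which sends to the default CUPS printer):

        #!/bin/bash
        # print-queue.sh -- print and archive anything dropped into the queue.
        QUEUE="$HOME/Dropbox/PrintQueue"
        DONE="$QUEUE/printed"
        mkdir -p "$DONE"
        for f in "$QUEUE"/*; do
            [ -f "$f" ] || continue           # skip the "printed" subfolder
            lpr "$f" && mv "$f" "$DONE/"      # print, then move out of the queue
        done

    Run it every five minutes via crontab -e (*/5 * * * * /Users/you/bin/print-queue.sh), or attach similar logic as an Automator Folder Action for near-real-time printing.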


  • Homegroup sharing problems

    - by soandos
    I can see other people in my Homegroup and their folders, but when I click on those folders I cannot see the contents (no error message; just nothing happens). The other people in the Homegroup can see me, and my files, just fine. In addition, I cannot see their files under the Network tab, though they can all see each other's files. What could be the issue? The Homegroup has already been created and recreated numerous times. Perhaps unrelated, but I am also unable to turn off password-protected sharing.


  • How can I display the host name on the Windows desktop?

    - by Martin
    I do a lot of work on Windows Server 2008 remote desktops and often lose track of which host I am currently logged on to. Is there a way of displaying (without installing any non-standard apps) the host name or IP address of the host I am connected to, in either the wallpaper or the notification area? I tried creating files on the desktop with the name of the machine, but my roaming profile shows the same set of desktop files on every machine, so that was scuppered. In shell windows this is easy: just set the prompt to display the host name. Surely there is a simple way of doing the same for the graphical desktop.
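
    One built-in workaround, sketched as a per-machine logon script: the Public desktop is stored locally on each server rather than in the roaming profile, so a marker file created there stays on that machine (the file-name scheme is just a suggestion):

        rem Drop an empty, machine-named marker on the local Public desktop.
        echo. > "%PUBLIC%\Desktop\%COMPUTERNAME%.txt"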


  • Open original Microsoft Office document (not "version 1") on Mac OS X Lion restart

    - by FlyingMolga
    My MacBook Pro running Lion has been freezing frequently lately, and I've had to restart it with the power button. When Lion starts up again, the Microsoft Office applications that were running relaunch and load autosaved versions of the documents I had open (i.e. instead of abc.xlsx it opens [version 1] of abc.xlsx). Sometimes it also opens the original files. Several times I've entered data into these "version 1" files, only to try to save and realize it isn't the original file, and is sometimes missing data that the original contains. Is there any way to make AutoRecover open the actual document with the unsaved changes, instead of creating a new temporary version?


  • Applications start very slowly from a network path

    - by Snowfox
    Hi. We have a Windows 2008 server which hosts the network share \\srvcompany\lib. This share contains several applications needed for daily business, and every client (all Windows XP) has desktop shortcuts to these apps. The problem is that on several, but not all, clients the apps start very slowly; if I copy an application's program files to a local folder, it starts quickly. Watching memory usage in Task Manager on a "slow" machine while an application starts, I notice that memory usage grows much more slowly than on a "fast" machine. Yet when I copy files from the share with Windows Explorer, the speed is nearly the same on both. I've also checked the network driver: both tested clients have the same network card with the same driver version. Does anyone have an idea where or what I should check next? Thanks for any answers.


  • Excel - Disable AutoFormatting on Import

    - by Philip Wales
    How can I stop Microsoft Excel from auto-formatting data imported from a text file? Specifically, I want it to treat all values as text. I am auditing insurance data in Excel before it is uploaded to the new database. The files come to me as tab-delimited text files, and when they are loaded, Excel auto-formats the data, chopping the leading 0's off Zip Codes, routing numbers and other codes. I don't have the patience to reformat all the columns as text and guess how many zeros to put back, nor do I want to click through the import wizard specifying that each column is text. Ideally I just want to turn off Excel's auto-formatting completely and edit every cell as plain text. I don't use formulas or charts; it's just plain grid text editing.


  • Is it safe to compress my Windows 7 %USERPROFILE%\AppData folder?

    - by Kev
    I've just read Scott Hanselman's latest blog entry, Guide to Freeing up Disk Space under Windows 7, in which he suggests turning on NTFS compression. I already do this for a number of less-travelled folders that contain static files, such as downloads or images, but I am wondering whether it's wise to turn on NTFS compression for the whole of my %USERPROFILE%\AppData folder. My system drive is a 128 GB SSD in a Dell Precision T5400 3 GHz quad-core Xeon workstation, so I ought not to notice the extra cycles used to compress and decompress files on their way to and from the disk. But would there be any good reason not to do this? In fact, could I safely compress the whole of my %USERPROFILE% folder?
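
    If you do try it, a sketch with the built-in compact tool, run from an elevated cmd prompt (/C compresses, /S recurses, /I continues past files it cannot open, such as ones in use, and /Q quietens the output):

        compact /C /S /I /Q "%USERPROFILE%\AppData"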


  • Add folder name to beginning of filename - getting multiple renames

    - by Flibble Wibble
    I've used dbenham's excellent response to the question of how to add the folder name to the beginning of a filename in a cmd script:

        @echo off
        pushd "Folder"
        for /d %%D in (*) do (
            for %%F in ("%%~D\*") do (
                for %%P in ("%%F\..") do (
                    ren "%%F" "%%~nxP_%%~nxF"
                )
            )
        )
        popd

    What I'm finding is that, seemingly at random (though it probably isn't), the script runs through several child folders and renames correctly, but then reaches a folder where it gets stuck in a loop and starts adding the folder name repeatedly to the files inside. I have 90,000 files in 300 folders to rename this weekend. Any chance you can guess the cause? PS: Is there a maximum number of files that is acceptable in each folder?
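
    One plausible cause, with a hedged fix: a plain for loop enumerates the directory while ren is changing it, so a renamed file can be returned again later in the same scan (whether this happens depends on the file system and on where the new name sorts). Snapshotting the listing first avoids any re-processing:

        @echo off
        pushd "Folder"
        for /d %%D in (*) do (
            rem dir /b runs to completion before any ren executes, so files
            rem renamed below can never re-enter the loop.
            for /f "delims=" %%F in ('dir /b /a-d "%%D"') do (
                ren "%%D\%%F" "%%~nxD_%%F"
            )
        )
        popd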


  • SharePoint 2007: reset permission inheritance

    - by e-mre
    I have a SharePoint 2007 document library with several levels of folders and files. Some folders in the middle of the hierarchy do not inherit permissions from their parents and have unique permissions defined. It is a huge library and there are many folders like this. I am currently changing the permission model of the library, and I want to reset all those unique permissions so that everything inherits from the library root (something like the "Replace child object permissions" checkbox in the Windows file system security dialog). If this is not possible, a way to list the folders that have unique permissions defined would also do.
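
    The 2007 UI has no such checkbox, but the object model exposes what's needed. A hedged PowerShell sketch, run on a farm server; the site URL and library title are placeholders, and the HasUniqueRoleAssignments check alone (without the reset) gives you the report option:

        [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
        $site = New-Object Microsoft.SharePoint.SPSite("http://yourserver/sites/yoursite")
        $web  = $site.OpenWeb()
        $list = $web.Lists["Documents"]

        # Folders that stopped inheriting; add $list.Items to cover files too.
        foreach ($folder in $list.Folders) {
            if ($folder.HasUniqueRoleAssignments) {
                Write-Host "Unique permissions:" $folder.Url
                $folder.ResetRoleInheritance()   # re-inherit from the parent
            }
        }

        $web.Dispose()
        $site.Dispose()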


  • Apache's htcacheclean doesn't scale: How to tame a huge Apache disk_cache?

    - by flight
    We have an Apache setup with a huge disk_cache (500,000 entries, 50 GB of disk space used), and the cache grows by 16 GB every day. My problem is that the cache seems to grow nearly as fast as it is possible to remove files and directories from the cache filesystem! The cache partition is an ext3 filesystem (100 GB, "-t news") on iSCSI storage, and the Apache server (which acts as a caching proxy) is a VM. The disk_cache is configured with CacheDirLevels=2 and CacheDirLength=1, and includes variants. A typical file path is "/htcache/B/x/i_iGfmmHhxJRheg8NHcQ.header.vary/A/W/oGX3MAV3q0bWl30YmA_A.header". When I call htcacheclean to tame the cache (non-daemon mode, "htcacheclean -t -p/htcache -l15G"), IOwait goes through the roof for several hours, with no visible action. Only after hours does htcacheclean start to delete files from the cache partition, which takes a couple more hours. (A similar problem was brought up on the Apache mailing list in 2009, without a solution: http://www.mail-archive.com/[email protected]/msg42683.html) The high IOwait leads to problems with the stability of the web server (the bridge to the Tomcat backend server sometimes stalls). I came up with my own prune script, which removes files and directories from random subdirectories of the cache, only to find that its deletion rate is just slightly higher than the cache growth rate: the script takes ~10 seconds to read a subdirectory (e.g. /htcache/B/x) and frees some 5 MB of disk space, during which the cache has grown by another 2 MB. As with htcacheclean, IOwait goes up to 25% when running the prune script continuously. Any ideas? Is this a problem specific to the (rather slow) iSCSI storage? Should I choose a different file system for a huge disk_cache: ext2? ext4? Are there any kernel parameter optimizations for this kind of scenario? (I already tried the deadline scheduler and a smaller read_ahead_kb, without effect.)
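
    One mitigation worth sketching before a filesystem migration: run htcacheclean permanently in daemon mode, so pruning happens in frequent small increments rather than hours-long scans, and demote it to the idle I/O class so it yields to the proxy traffic (the interval and limit values are illustrative):

        # -d30: wake every 30 minutes; -n: run "nicely", inserting pauses;
        # -i: only scan when the cache has actually been modified;
        # ionice -c3: idle I/O scheduling class, so pruning yields to Apache.
        ionice -c3 htcacheclean -d30 -n -i -t -p/htcache -l15G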


  • Online backup service _with_ filtering (by extension, size and so...)

    - by QyRoN
    Hello, I recently discovered services that back up personal data to an online server, but I was surprised to find that no popular service provides filtering capabilities; all of them back up the entire contents of a specific folder. Are there any free options with filtering? To be specific, I need the following features:

    - Backup to an online server.
    - Automatic but bandwidth-aware backup, i.e. it backs up my files automatically but won't try to do so while I'm heavily using the computer or the internet.
    - Individual filtering settings per folder, i.e. I want to specify which files to back up in every folder.
    - Some free plan (since I'm not going to use more than 500 MB of space).


  • How do I transfer .pst e-mails from Outlook to a Mac e-mail program?

    - by user46248
    My new MacBook Pro came in yesterday with MS Office. Outlook crashes consistently about 30 seconds into importing from a .pst; I see EXC_BAD_ACCESS and something about CF get string length or the like. I have tried two different .pst files, created on different dates from different folders. Mail and Thunderbird lack options for importing from .pst. I can export to .csv, but I don't see an option for importing that either. How can I get my .pst files into an e-mail client on the Mac?
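
    One route worth sketching: the open-source libpst tools convert a .pst into standard mbox folders, which Thunderbird can then take in (this assumes libpst is installed, e.g. via MacPorts or Homebrew, and the file names are illustrative):

        # One mbox file per mail folder, recreating the folder structure:
        readpst -r -o converted archive.pst

    The resulting mbox files can be copied into Thunderbird's Local Folders directory, or imported with its ImportExportTools add-on.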


  • Prioritize file sharing performance in Windows Server 2008

    - by cmbrnt
    I've got a server running Windows Server 2008, used mainly for sharing files throughout the domain from a number of disks. It runs on VMware ESXi 4.0, in case that matters. My problem is that when I log in to the server to check user permissions and the like, the access speed to the files on the remote disks almost grinds to a halt. I haven't been able to measure the speeds, but I would guess they drop to about 100 kB/s as soon as I log in. This is on a gigabit network, and the problem is the same for all users, even the ones connected to the same switch as the server. I've assigned 2 GB RAM to the server and reserved it 1.5 GHz of processor power. I don't have to do anything special on the server for this halt to occur. How can I make sure file sharing is prioritized on the server, so that no matter what applications I'm running, file sharing always works properly? Could this be a VMware issue?


  • How to unmangle PDF format into a usable text or spreadsheet document?

    - by Chuck
    Upon requesting some daily/hourly sales data from the coworker responsible for such requests, I was given a series of PDF files; the point-of-sale program that is used, for some reason, answers requests for this kind of information with PDFs. The issue: the PDFs look to be in a format that should copy and paste easily into a spreadsheet, with three columns neatly organized across two pages. Copy/pasting the first page dumps all three columns into a single spreadsheet column consisting of the Dates, followed by the Hours of the transactions on each day, followed by all the Total Sales values that should be attached to a Date and Time. (Note: there are no duplicated dates in the Date column, i.e. a day with multiple transactions has its yyyy/mm/dd listed on the first row only, not on the following rows.) While it was a huge pain, it was possible, in four or five steps, to break that single column of data out into three columns matching the PDF. The second page is worse: copy/pasting it creates a single column whose first third is the Dates, second third the Hours, and final third the Total Sales, and because of the non-duplicated dates mentioned above there is no way to figure out which Hours belong to which Dates or Total Sales. My PDF-fu is next to non-existent; I've just now started working with PDF editors and some www.convertmyPDFforfree.com websites, and so far both approaches have produced nothing but blank documents. Before I go back and pester my coworker into producing a report in some format other than PDF, is there any method of taking data that looks correctly formatted in a PDF and copy/pasting it into a spreadsheet so that it stays that way? I appreciate any help. The sales data isn't so sensitive that I couldn't share a bit of it to let somebody see what needs to be dealt with; just let me know. The PDFs are under 100 kB each, so sending them shouldn't be a burden to any interested party.
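
    One tool worth trying before anything else: pdftotext, from the Xpdf/Poppler utilities, has a layout mode that preserves the column positions on the page, which usually transfers into a spreadsheet far more cleanly than copying from a PDF viewer (file names are illustrative):

        # -layout keeps the three columns aligned as they appear on the page
        pdftotext -layout sales.pdf sales.txt

    The resulting sales.txt can then be pulled into Excel with a fixed-width or delimited text import.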


  • Cron job failing to back up a Postgres database

    - by user705142
    I'm unsure what's going on here: I've got a backup script which runs fine under root, producing a 300 kB database dump in the proper directory. When it runs as a cron job with exactly the same command, however, an empty gzip file appears with nothing in it. The cron log shows no error, just that the command has been run. This is the script:

        #! /bin/bash
        DIR="/opt/backup"
        YMD=$(date "+%Y-%m-%d")
        su -c "pg_dump -U postgres mydatabasename | gzip -6 > "$DIR/database_backup.$YMD.gz" " postgres

        # delete backup files older than 60 days
        OLD=$(find $DIR -type d -mtime +60)
        if [ -n "$OLD" ] ; then
            echo deleting old backup files: $OLD
            echo $OLD | xargs rm -rfv
        fi

    And the cron job:

        01 10 * * * root sh /opt/daily_backup_script.sh

    It produces a database_backup file, just an empty one. Does anyone know what's going on here?
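
    A common cause worth sketching: cron jobs run with a minimal environment (PATH is often just /usr/bin:/bin), so a pg_dump that resolves in a root login shell may not resolve inside the su -c command line, leaving gzip to write an empty file. A hedged revision using absolute paths (check them with "which pg_dump" on your system); it also switches the cleanup to -type f, since the 60-day rule is meant to match backup files, not directories:

        #!/bin/bash
        DIR="/opt/backup"
        YMD=$(date "+%Y-%m-%d")
        su -c "/usr/bin/pg_dump -U postgres mydatabasename | /bin/gzip -6 > $DIR/database_backup.$YMD.gz" postgres

        # delete backup *files* older than 60 days
        find "$DIR" -type f -name '*.gz' -mtime +60 -print -delete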


  • Windows 7 - cannot access my own external disk

    - by Tomas
    I use Windows 7 Home Premium and an external USB disk with an NTFS partition. I cannot get write access to my own files on it, even as a member of the Administrators group. Is there any way to get around this permission checking without actually writing permission information to every folder on the disk? I have three external disks (up to 1 TB) with thousands upon thousands of files on each, so a permission change that recursively walks every folder on all my disks is plain brain damage. 1) Is there any way to change this globally (like mount options), or otherwise get around the annoying permission checking? It worked fine in Windows XP. 2) If not, and I must run the recursive operation on all folders, how do I do it permanently, so that I don't need to do it again on another Windows 7 computer?
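
    For the recursive route, a sketch with the built-in tools, run from an elevated command prompt (X: stands in for the external drive letter). Since NTFS stores ownership and ACLs on the volume itself, the result travels with the disk to other machines:

        rem Take ownership of everything on the drive (/R recurse, /D Y = assume Yes).
        takeown /F X:\ /R /D Y

        rem Grant the Administrators group full control of the whole tree.
        icacls X:\ /grant Administrators:(OI)(CI)F /T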


  • What antivirus software supports updates without an internet connection?

    - by Michael Gundlach
    I'm putting antivirus software on Windows 7 computers in the middle of Africa. The computers don't have internet access, but they still need to be protected against viruses from CDs and thumb drives. Separate from these computers is one machine that does have extremely spotty internet access. What's the best AV software for this situation? The important part, as I see it, is that we need to keep the computers up to date but can't let the AV software pull down updates at its leisure: the computers are disconnected, and even getting e-mail onto the connected machine is challenge enough. We thought we might transfer update files to the connected computer using a protocol that can handle repeated connection drops (e.g. FTP with resume), then manually apply the update files to the disconnected computers. Does any AV software support this? Is there a better solution?


  • Slow WLAN file transfer between server and tablet

    - by user266985
    My file server runs Ubuntu 12.04 and shares files over Samba; it is connected via gigabit ethernet. My desktop, running Windows 8.1, is also on gigabit ethernet, and transfers between the two completely saturate that gigabit pipe. However, I just got a Surface Pro 2, and when I try to stream HD movies from the server to it over WiFi I can't get much past 1.5 MB/s. I've tried streaming through XBMC and a standard file copy; no difference. To add to the confusion, if I connect to my guest network and then use my VPN server (installed on the router) to access the file server, I get around 3.2 MB/s. I've been running diagnostics to find the root cause and think I've narrowed it down, but I have no idea what is causing it or how to fix it.

    Router: Asus RT-N66U
    Surface Pro 2 network card: Marvell Avastar 350N (driver 19/09/2013 v14.69.24044.150)
    inSSIDer: link score 100, co-channels 0, overlapping 0, 5 GHz network channel 48+44

    iperf with the file server as server and the Surface Pro 2 as client; TCP performance acceptable:

        ------------------------------------------------------------
        Server listening on TCP port 5001
        TCP window size: 85.3 KByte (default)
        ------------------------------------------------------------
        [  4] local 192.168.0.90 port 5001 connected with 192.168.0.56 port 57367
        [ ID] Interval       Transfer     Bandwidth
        [  4]  0.0- 1.0 sec  10.1 MBytes  84.7 Mbits/sec
        [  4]  1.0- 2.0 sec  10.4 MBytes  87.6 Mbits/sec
        [  4]  2.0- 3.0 sec  10.6 MBytes  88.8 Mbits/sec
        [  4]  3.0- 4.0 sec  10.7 MBytes  89.5 Mbits/sec
        [  4]  4.0- 5.0 sec  10.1 MBytes  84.4 Mbits/sec
        [  4]  5.0- 6.0 sec  10.2 MBytes  85.8 Mbits/sec
        [  4]  6.0- 7.0 sec  7.04 MBytes  59.1 Mbits/sec
        [  4]  7.0- 8.0 sec  10.8 MBytes  90.2 Mbits/sec
        [  4]  8.0- 9.0 sec  10.6 MBytes  89.1 Mbits/sec
        [  4]  9.0-10.0 sec  8.62 MBytes  72.3 Mbits/sec
        [  4]  0.0-10.0 sec  99.2 MBytes  83.1 Mbits/sec

    iperf with the Surface Pro 2 as server and the file server as client; TCP performance poor:

        ------------------------------------------------------------
        Client connecting to 192.168.0.56, TCP port 5001
        TCP window size: 22.9 KByte (default)
        ------------------------------------------------------------
        [  3] local 192.168.0.90 port 40233 connected with 192.168.0.56 port 5001
        [ ID] Interval       Transfer     Bandwidth
        [  3]  0.0- 1.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  1.0- 2.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  2.0- 3.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  3.0- 4.0 sec  1.25 MBytes  10.5 Mbits/sec
        [  3]  4.0- 5.0 sec  1.62 MBytes  13.6 Mbits/sec
        [  3]  5.0- 6.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  6.0- 7.0 sec  1.38 MBytes  11.5 Mbits/sec
        [  3]  7.0- 8.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  8.0- 9.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  9.0-10.0 sec  1.62 MBytes  13.6 Mbits/sec
        [  3]  0.0-10.1 sec  15.0 MBytes  12.4 Mbits/sec

    For some reason that direction gets capped, and I haven't got a clue why. Any suggestions? Edit: my link speed is reported as 270 Mbps by Windows, and I'm less than two metres from the router with a clear line of sight.


  • How can I check the actual size used in an NTFS directory with many hardlinks?

    - by kbyrd
    On a Windows 7 NTFS volume, I'm using cwrsync, which supports --link-dest correctly, to create "snapshot"-type backups. So I have: z:\backups\2010-11-28\cygdrive\c\Users\... z:\backups\2010-12-02\cygdrive\c\Users\... The content of 2010-12-02 is mostly hard links back to files in the 2010-11-28 directory, plus a few new or changed files that exist only in 2010-12-02. On Linux, the du utility would tell me the actual size taken by each incremental snapshot. On Windows, Explorer and du under cygwin are both fooled by the hard links and show 2010-12-02 taking up a little more space than 2010-11-28. Is there a Windows utility that will show the space actually used?
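
    One detail worth sketching: GNU du (the cygwin build included) counts each hard-linked file only once per invocation, charging it to the first argument that reaches it. So naming the snapshots oldest-first in a single command should report each later snapshot at only its incremental size (paths assume the layout above):

        # From a cygwin shell, in /cygdrive/z/backups:
        du -sh 2010-11-28 2010-12-02
        # Files hard-linked from 2010-11-28 are charged to it alone, so the
        # figure for 2010-12-02 is just its new or changed content.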


  • How can I make .vimrc read from an external file?

    - by GorillaSandwich
    I'd like to modify my .vimrc to read the value of a variable from an external file. How can I do this? Specifically, a friend and I share a git repo with our .vim files. Most of the configuration is common, but there are a few small differences in what we each want, so we use if statements to decide whether to load user-specific sections, like this:

        let whoami = "user2"
        if whoami == "user1"
        ...

    After checking our common .vimrc out of source control, we each have to change the let whoami assignment so our own section is loaded. Instead, I'd like to keep a separate file, different for each of us, from which vim reads that variable value. Maybe another angle on this is: will vim automatically read all the files in my .vim directory? If so, we could each put a symlink in there called username.vim, pointing to an external file that differs for each of us.
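
    A minimal sketch for the shared .vimrc, sourcing a per-user file that stays out of the repo (the ~/.vimrc.local name is only a convention):

        " Pull in per-user settings (e.g. let whoami = "user1") if present.
        if filereadable(expand("~/.vimrc.local"))
          source ~/.vimrc.local
        endif

    On the second idea: vim does not source everything under ~/.vim, but it does automatically source every *.vim file in ~/.vim/plugin/, so a per-user symlink placed there would work as well.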


  • Best practice for administering a (hadoop) cluster

    - by Alex
    Dear all, I've recently been playing with Hadoop. I have a six-node cluster up and running with HDFS, and have run a number of MapReduce jobs. So far, so good. However, I'm now looking to do this more systematically and with a larger number of nodes. Our base system is Ubuntu, and the current setup has been administered using apt (to install the correct Java runtime) and ssh/scp (to propagate the various conf files). This is clearly not scalable over time. Does anyone have experience of good systems for administering (possibly slightly heterogeneous: different disk sizes, different numbers of CPUs per node) Hadoop clusters automagically? I would consider diskless boot, but imagine that with a large cluster, getting it up and running might be bottlenecked on the machine serving the OS. Or some form of distributed Debian apt to keep the machines' native environments synchronised? And how do people successfully manage the conf files across a number of (potentially heterogeneous) machines? Thanks very much in advance, Alex


  • Print each bookmark of a PDF separately

    - by Dave
    I have a very large (1000-page) PDF which contains about 100 documents of ten pages each, one after the other. I would like to send them to my office printer as individual files, so that it will print them double-sided and staple each one individually. I'm using Adobe Acrobat X and think the first step is to bookmark the start of each of those 100 documents, but I don't know the next step. I also have a batch-printing program, so extracting each of those 100 bookmarked sections to individual files would work too. Thanks for all the help.
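
    If the documents really are a fixed ten pages each, a command-line sketch with pdftk does the whole split without bookmarks (file names are illustrative):

        #!/bin/bash
        # Split big.pdf into 100 ten-page files: doc_001.pdf .. doc_100.pdf
        for i in $(seq 0 99); do
            first=$(( i * 10 + 1 ))
            last=$(( i * 10 + 10 ))
            pdftk big.pdf cat ${first}-${last} output "$(printf 'doc_%03d.pdf' $((i + 1)))"
        done

    Acrobat X can also do this natively: Tools > Pages > Split Document splits by number of pages or by top-level bookmarks.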

