Search Results

Search found 40229 results on 1610 pages for 'deleted files'.


  • FTP Issue when connecting to a debian machine from windows

    - by erin c
    I have a .NET application that copies a batch of files to a specific FTP folder on a Debian machine on a periodic basis. The FTP folder has mode 755, and its owner is the FTP user I authenticate as in the .NET application. So far I have tested this application against a number of Debian boxes, and my initial attempts generally fail with the following message when I try a Debian machine I haven't used before: "remote server returned an error 550 File unavailable". When I see this error, I log onto another Debian machine on my network and FTP into the problem box from the command line. I "put" a very small file into the folder in question, and right after that the Windows application starts copying files successfully via FTP. It is as if my command-line FTP operation fixes the problem and makes Debian compatible with my .NET application. I checked the permissions before and after the problem, and nothing I did appears to have changed them. I am at a loss to understand why this problem occurs and why my silly hack fixes it. Can anybody tell me where to look next to fix this extremely annoying issue?
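
    For reference, a minimal sketch of the manual probe described above (connect, change into the target directory, upload a tiny file), written with Python's ftplib; the host, credentials and remote path are placeholders, not values from the question:

        from ftplib import FTP
        from io import BytesIO

        # Placeholder connection details: substitute the real Debian host and FTP account.
        HOST = "debian.example.org"
        USER = "ftpuser"
        PASSWORD = "secret"
        REMOTE_DIR = "/upload"

        ftp = FTP(HOST)
        ftp.login(USER, PASSWORD)
        ftp.cwd(REMOTE_DIR)          # a 550 here points at the directory rather than the file transfer
        ftp.storbinary("STOR probe.txt", BytesIO(b"probe"))  # tiny test upload, like the manual "put"
        print(ftp.nlst())            # list the directory to confirm the file arrived
        ftp.quit()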

    Read the article

  • Right Clicking Network File Very Slow Over VPN

    - by Reafidy
    I am having real trouble with a VPN connection. The VPN connection is not used for Internet access, just for file browsing. File browsing is slow, taking about 3-4 seconds to bring up a list of folders. I can live with this; the problem is when I right-click on a file. Sometimes the right-click menu comes up instantly, but sometimes the wait cursor is shown for anywhere between 30 seconds and a few minutes before the menu is displayed. I ran speedtest.net and the results were 3.08 Mbps down / 0.13 Mbps up (roughly 16 KB/s upload), so I am not expecting miracles when opening files. A 120 KB file can take anywhere from 5 to 30 seconds. Sometimes transferring/opening files happens as expected, other times it's slow, but the real issue is the right-click behaviour mentioned above. Anyone have any ideas? The VPN uses PPTP and the clients are all Windows 7 Pro.

    Read the article

  • Convert PDF to EXE to improve security

    - by kamiar3001
    Hi folks. I have some PDFs and I want to convert them all to EXE files, put them in a folder on my CD, and have my customers run the EXE files instead of the PDFs, for security reasons. I have tried the PDF2EXE tool, but I need something completely free; PDF2EXE is an evaluation version with some limitations. Please suggest something free. I don't need anything complicated, just encapsulation of a PDF into an EXE file. In short, I don't want anyone to be able to save the PDF document, and I want a self-contained viewer on my CD.

    Read the article

  • Set default open-with app to a Python program on a Mac

    - by Vincent
    I use the open source application UliPad to edit reStructuredText (.rst) files. It is a Python application that I launch via Terminal like so: python32 UliPad.py I have python32 as an alias to the 32-bit Python install on my machine; I have several versions installed. First, I would like a way to launch UliPad like other OS X apps; I am not really sure how to do this. Secondly, I would like all .rst files to open with UliPad.py by default. Is there a way to do this? I know how to choose the default app in Finder, but I am not sure how to make UliPad selectable as that app.
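
    One possible approach, offered only as a sketch: a tiny launcher script that can then be wrapped into an app bundle with a tool such as Automator or Platypus (both assumptions, not mentioned in the question), so Finder has a real .app to associate .rst files with. The interpreter and UliPad paths are placeholders:

        #!/usr/bin/env python
        # Hypothetical launcher: wrap this in an app bundle so UliPad can be opened
        # from Finder and chosen as the default application for .rst files.
        import os, sys

        PYTHON32 = "/usr/local/bin/python3.2"      # placeholder: the 32-bit interpreter aliased as python32
        ULIPAD = "/Applications/UliPad/UliPad.py"  # placeholder: wherever UliPad.py lives

        # Pass through any file Finder asks us to open (it appears in argv).
        os.execv(PYTHON32, [PYTHON32, ULIPAD] + sys.argv[1:])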

    Read the article

  • Log rotation with automatic *.log file discovery

    - by Mikko Ohtamaa
    I am hosting several websites, each of which runs its own Python process and writes *.log output files, but the directory structure is not standardized. Example:
    -rw-r--r-- 1 plone plone 125M 2012-08-29 11:35 ./x/var/log/instance-Z2.log
    -rw-r--r-- 1 plone plone 19M 2012-08-29 00:07 ./zope2.9/y/log/event.log
    -rw-r--r-- 1 plone plone 188M 2012-08-13 00:09 ./zope2.9/y/log/Z2.log
    -rw-r--r-- 1 plone plone 137M 2010-11-16 09:41 ./zope2.9/y/log/event.log
    I'd like logrotate to auto-discover these log files and rotate them, as opposed to manually listing every log file in the logrotate configuration. Do any existing tools offer this kind of log file discovery and rotation without specifying each file by hand? If not, should I just write a shell script that generates the logrotate configuration?
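
    A minimal sketch of the "generate the logrotate conf" route, assuming the sites live somewhere under a common root and that weekly rotation with 4 kept copies is acceptable (the root, output path and retention are placeholders):

        #!/usr/bin/env python
        # Discover *.log files under ROOT and emit a logrotate stanza for each one.
        import os

        ROOT = "/srv/sites"
        OUTPUT = "/etc/logrotate.d/autodiscovered-logs"

        STANZA = """%s {
            weekly
            rotate 4
            compress
            missingok
            notifempty
            copytruncate
        }
        """

        def find_logs(root):
            for dirpath, _dirnames, filenames in os.walk(root):
                for name in filenames:
                    if name.endswith(".log"):
                        yield os.path.join(dirpath, name)

        with open(OUTPUT, "w") as conf:
            for path in sorted(find_logs(ROOT)):
                conf.write(STANZA % path)

    copytruncate is used here because the Python processes keep their log files open; adjust if the applications can be signalled to reopen their logs instead.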

    Read the article

  • How to increase the number of items in the "recent" folder in Windows 7?

    - by netvope
    Windows 7 keeps a list of recently used files in C:\Users\<username>\AppData\Roaming\Microsoft\Windows\Recent. Based on my observation, it keeps 10 items per file extension, so when you open 11 .txt files in a row, the 2nd to the 11th items stay in that Recent folder, but the first one is gone. My question is: how can I keep an unlimited number of items in that Recent folder? Note: increasing the per-application recent items (e.g. as in http://www.mydigitallife.info/2009/05/21/change-increase-or-decrease-number-of-recent-or-frequent-items-displayed-in-windows-7-taskbar-jump-list/ ) has no effect on the per-user Recent folder I'm concerned about.

    Read the article

  • Connecting two PCs through a cable while also using wireless

    - by Steffen
    I've got 2 PCs set up with WLAN; however, when I copy files between the machines it takes forever (obviously, since I'm running 802.11g). So I thought, why not just connect them directly with a cable? (My wireless router is at the other end of the house, which is why I don't run cables to it.) But how should I configure the LAN connection on each machine so I can use it for transferring files between them while still using the WLAN for Internet access? I was thinking about letting one machine act as a gateway, but then Windows pops up a lot of warnings (basically it says this won't work when you're using two disjoint networks), and now I'm at a loss as to what to try.

    Read the article

  • Permission denied when trying to execute a binary I burned to a CD-R

    - by user16654
    On an Ubuntu Karmic machine, I burned a CD from the command prompt using: cdrecord -v speed=16 dev=0,1,0 /FPS.iso The CD now contains an executable and some files. I tested the CD by loading it onto another machine (Red Hat 5.3), and when I try to run the program I get the following message: bash: ./FPS1_1: Permission denied I can open other files such as text documents (the executable also comes with shared libraries). I realized I had burned the CD as root, so I burned another one as a regular user, but I still get the same problem. How can I get around this permission problem, and what is causing it? P.S. The image was in / if that helps.

    Read the article

  • Reasonable Location to Install Web Service on Server

    - by Mr. Disappointment
    Firstly, I'm a software developer and not qualified as any kind of system or server expert, so I'm looking for advice to help me prevent faults on our server. I've written a modular system to carry out certain tasks for us autonomously, to save us from writing the same old code over and over again. It consists of a Windows Service (.NET), a Web Service (WCF), a shared class library, and a database, and it will run on Windows Server 2003. The problem, for me, comes with deployment, specifically the web service: naturally the local service (and the required shared library) are kept, by default and by convention, in the Program Files folder, but storing the web service there just seems absurd to me (even though we'd lock it down to appropriate use only). Should the files be stored somewhere else altogether? Or should they be split up, with the web service stored elsewhere?

    Read the article

  • How to backup a remote VPS machine?

    - by morpheous
    I am considering opting for a VPS solution, with the server running Ubuntu Server. I am pretty new to this, and I need to come up with a backup policy for my server data. Initial data is likely to be about 80 MB, and I expect it to grow by approximately 5-10 MB a day. Can anyone recommend: a backup/restore policy (best practices for a small startup), and which tools to use for backup? Another thing that is not clear to me is where the files are normally backed up to in the case of remote servers. If the files are backed up to the same machine (or even to another machine with the same host), there is potentially a single point of failure. How do people normally back up their server data, and is the probability of machine meltdown or the host company's server farm "catching fire" so remote as not to be worth worrying about, especially for a small (read: one-man) startup like me?

    Read the article

  • Using rsync to take backup of folder

    - by Ali
    Hi, I have a Linux server with a NAS which is mounted as the folder "mount". I have a website in the "public_html" folder. I want to back up the website into the mount folder automatically at certain intervals, e.g. every hour. I have read about something called "rsync" which is used to keep two folders in sync; it doesn't copy all files every time, but instead checks whether a file has changed and only updates changed files. How do I use it to make automatic backups? I have root access to the server. Thanks
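
    A minimal sketch of an hourly rsync backup, assuming the paths below (they are guesses, adjust SOURCE and DEST to the real locations) and scheduling via cron:

        #!/usr/bin/env python
        # Mirror public_html onto the NAS mount; schedule from root's crontab, e.g.:
        #   0 * * * * /usr/local/bin/backup_site.py
        import subprocess

        SOURCE = "/home/site/public_html/"   # trailing slash: copy the contents, not the folder itself
        DEST = "/mount/backups/public_html/"

        # -a preserves permissions and times, --delete mirrors removals;
        # drop --delete if deleted files should stay in the backup.
        subprocess.check_call(["rsync", "-a", "--delete", SOURCE, DEST])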

    Read the article

  • Encoding video stream for playback on a vanilla Windows XP with mencoder

    - by Tamás
    I have a bunch of PNG files, generated from a script. They represent consecutive frames of a video sequence and I'd like to encode them into a single AVI file (or some other video format) using mencoder. What parameters should I use to ensure that the video can be viewed on a vanilla Windows XP using Windows Media Player with no extra codecs installed apart from the default ones? So far I've tried -ovc lavc -lavcopts vcodec=wmv2 and -ovc lavc -lavcopts vcodec=msmpeg4 with no success. (Background story: some of the people I'm collaborating with on a scientific project cannot install any codecs on their university computers without the help of the local sysadmins, who are of course not very willing to install anything. I'd like to ensure that they can also view the video files I am creating).
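
    For completeness, a sketch of driving such an encode from Python; the frame pattern, frame rate and output name are placeholders, and vcodec=msmpeg4v2 is one codec variant sometimes suggested for stock Windows Media Player, offered here as an untested assumption rather than a confirmed fix:

        import subprocess

        cmd = [
            "mencoder",
            "mf://frame_*.png",               # mencoder's mf demuxer expands the mask itself
            "-mf", "fps=25:type=png",
            "-ovc", "lavc",
            "-lavcopts", "vcodec=msmpeg4v2",  # assumption: MS MPEG-4 v2 often plays on stock WMP
            "-o", "output.avi",
        ]
        subprocess.check_call(cmd)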

    Read the article

  • Disable MathML output of eLyXer

    - by Gryllida
    eLyXer is a standalone LyX to HTML converter. In the resulting file, equations are formatted as MathML, and the file itself starts with an XML tag. This causes two problems: (1) LibreOffice does not read the XML file (it can read HTML files, but not XHTML); (2) I am unable to copy and paste the equations into a document editor such as LibreOffice with the goal of subsequent conversion to .doc, because .doc files do not support MathML. The eLyXer help page mentions an option to only use simple math, but there is no option to output math equations as images. And I have already set Document > Settings > Output > Math equations > Format: images in LyX, which presumably is saved somewhere in the LyX document. A web search did not come up with any solutions.

    Read the article

  • SMB returns the entire file instead of header info

    - by billdlawson
    A section of our code starts by checking for access to many data files (flat files, so each table is a file). When I do a packet capture on our own setup, only the header info is sent by the server to the client. However, I have one customer, using a SAN, whose server returns the whole file instead of just the header info; besides being slower, this is causing file access issues. They have already turned off oplocks at the server and at the workstations. This is not client-server: the data files and the application reside on the server, but the users run the application locally via a shortcut with a mapped drive or UNC path. So when I simply select an option that prompts for a vehicle number (not trying to select a record, just verifying that the data files are accessible), that window opens in 1-2 seconds for me. When they do the same thing it takes 6-15 seconds once several users are running the program. The maximum number of users is 15. The program has a lot of small modules, 800 .cob modules, so it is very chatty, but these are data files. We have Wireshark captures that show their server pulling the whole file while ours returns just the header: their capture vs. ours. We suspect the SAN. Has anyone ever heard of a SAN improperly interpreting runtime requests, i.e. an SMB request? This is Acucobol-GT (now Micro Focus), and the application is written in COBOL. This is not a new program, just a new problem. This is one customer out of over a thousand who are otherwise running smoothly, and we are totally stumped. All users are on XP, the server is Windows 2003 (with Virtual Server), and I don't yet know the SAN details. Also, we have many installations running virtual servers, but only a few on SANs, or we just don't know it. This is not a network throughput issue: the load is less than 5% on the server and there are no timeouts or retransmits. P.S. If it wasn't for Wireshark I'd still be chasing my tail; an application trace file on their installation just looks like they run slower. If you want the Wireshark trace file I can make it available. Thanks in advance, and please excuse my verbosity, but I'm not sure what's relevant.

    Read the article

  • A simple Volume Replication Tool for large data set?

    - by Jin
    I'm looking for a solution to the following: Server A (Site A) - Win 2008 R2 - approx 10 TB (15 TB max) of data - well over 8 million files. Server B (Site B) - Win 2008 R2. I want to asynchronously replicate Server A's volume to a volume on Server B for data redundancy, something that lets me say to my users, "go here for data" when/if Server A goes belly up due to machine problems, disaster, etc. Windows 2008 R2 does have DFS, but Microsoft apparently does not support this large a dataset (or more accurately, more than 8 million files, according to the docs I could find). I also looked at Veritas Volume Replication, but this seems like too much, as I would also require Veritas Volume Manager. There is plenty of "backup" software that makes a 1:1 backup, which would be OK, but since it will be transferring over the Internet, I'd like something that compresses data during transfer, as DFS does. Does anyone have any suggestions?

    Read the article

  • wbadmin incremental system state backup

    - by user74513
    I am doing system state backups on a Windows Server 2008 R2 Enterprise (Service Pack 1) machine and expected the backups after the first one to be incremental. However, with each backup a new directory with VHD files is created, and the VHD files are almost the same size as with the first backup, so the backups do not seem to be incremental. I used the following command to do the backup: wbadmin start systemstatebackup -backupTarget:f: I played around with the settings under "Configure Performance Settings" in the Windows Server Backup plugin in Server Manager, but according to the description at the top of the dialog these settings are not applied to system state backups. Are there any settings available for wbadmin system state backup to make the backups incremental?

    Read the article

  • Use Windows Briefcase from the command line

    - by Daniel
    I have a thumb drive on which I take many of my files with me. I would like it to be synchronized with my computer automatically when I connect it. I currently have a script run every time I connect it so that I can do the many things that need to be done (check for updates to the portable applications, etc.). I want my synchronization utility to: detect conflicts; work correctly when I move files or change the folder structure; tell the difference between a file that was deleted on one side and a file that was added on the other; work from the command line, or at least have a command that will open a window; and provide a confirmation screen before doing anything. The Windows Briefcase does all of these except the command line. Is there a program that does all of these, or is there a way to synchronize the Briefcase from the command line?
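
    As an aside, the "deleted on one side vs. added on the other" distinction is the part that needs state from the previous sync. Below is a sketch of that detection logic only (not the Windows Briefcase, and the paths and manifest name are placeholders): by remembering the file list from the last sync, a script can classify a file present on only one side as either newly added there or deleted on the other side.

        import json
        import os

        MANIFEST = "last_sync.json"   # file list recorded at the end of the previous sync

        def list_files(root):
            return {os.path.relpath(os.path.join(d, f), root)
                    for d, _, files in os.walk(root) for f in files}

        def classify(pc_root, usb_root):
            pc, usb = list_files(pc_root), list_files(usb_root)
            try:
                with open(MANIFEST) as fh:
                    previous = set(json.load(fh))
            except FileNotFoundError:
                previous = set()
            added_on_pc    = (pc - usb) - previous    # new on the PC: copy to the drive
            deleted_on_usb = (pc - usb) & previous    # was synced before: deletion on the drive
            added_on_usb   = (usb - pc) - previous
            deleted_on_pc  = (usb - pc) & previous
            return added_on_pc, deleted_on_usb, added_on_usb, deleted_on_pc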

    Read the article

  • Questions about the Linux root file system

    - by smwikipedia
    I read the manual page of the "mount" command, and it reads as below: All files accessible in a Unix system are arranged in one big tree, the file hierarchy, rooted at /. These files can be spread out over several devices. The mount command serves to attach the file system found on some device to the big file tree. My questions are: Where is this "big tree" located? Suppose I have 2 disks; if I mount them at some points in the "big tree", does Linux place some "special marks" at the mount points to indicate that these 2 "mount directories" are indeed separate disks?
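
    A small illustration of how this can be observed from user space (the paths below are examples and may not all exist on a given machine): there is no marker stored inside the directory itself; the mount table lives in the kernel and is visible via /proc/mounts, and every file reports the device it lives on in its stat() result, so a mount point is simply where the device id changes.

        import os

        for path in ("/", "/home", "/proc"):
            try:
                st = os.stat(path)
                print(path, "device id:", st.st_dev)   # differing st_dev values mean different filesystems
            except FileNotFoundError:
                print(path, "does not exist on this machine")

        with open("/proc/mounts") as fh:               # the kernel's view of the "big tree"
            print(fh.read())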

    Read the article

  • Diffing file contents

    - by PHeiberg
    I have two plain text files, each containing a list of strings sorted alphabetically, one string per line. I want to diff the files and get as output all strings that exist only in file2. Preferably, the operation should be possible without any 3rd-party tools, or with a minimum of installations of tools that are "normal" to find in a Windows command-line environment, such as GNU Diffutils, PowerShell, etc. The output should be in text form (a file or command-line output). Example:
    File 1 contents:
    A
    C
    D
    File 2 contents:
    A
    B
    C
    E
    Result wanted:
    B
    E
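
    For clarity, here is a sketch of the comparison itself; this is not one of the built-in Windows tools the question asks about, just an illustration of "lines that exist only in file2", with placeholder file names:

        with open("file1.txt") as f1, open("file2.txt") as f2:
            only_in_file2 = sorted(set(f2.read().splitlines()) - set(f1.read().splitlines()))

        print("\n".join(only_in_file2))   # for the example above this prints B and E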

    Read the article

  • Scanned JPEGs are large and slow to load - can they be optimized losslessly?

    - by Alistair Knock
    I have hundreds of JPEG photographs which were scanned about 5 years ago from negatives using a Konica Minolta DiMAGE Scan Dual IV. The dimensions are ~4500x3000, and the file size is around 12 MB, compared to shots from a DSLR with dimensions of 3000x2300 and a file size of 2-4 MB (actually, those are the output from a RAW converter). The file size is obviously quite a big difference, but the issue that's bothering me is that the (perceived) loading time is at least 10 times slower. Is this size/speed discrepancy likely to be because the scanner software saved the JPEGs inefficiently or used an old compression format, or is it simply that the scanned negatives contain much more "detail" (in the form of grain/noise) than the digital images? If the former, is there a way to losslessly optimize them? I've tried re-exporting the scanned files to full-size JPEG from my RAW software, but the file size is pretty much the same. Both files will have been saved at 100 quality.
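
    If the cause turns out to be inefficient entropy coding rather than grain, one lossless thing that can be tried is jpegtran's Huffman-table optimization; it typically saves only a few percent and will not remove noise. A sketch of running it over a folder (the paths are placeholders, and jpegtran must be installed):

        import os
        import subprocess

        SRC = "scans"            # placeholder: folder with the scanned JPEGs
        DST = "scans-optimized"
        os.makedirs(DST, exist_ok=True)

        for name in os.listdir(SRC):
            if name.lower().endswith((".jpg", ".jpeg")):
                # -optimize rebuilds the Huffman tables losslessly; -copy all keeps the metadata.
                subprocess.check_call([
                    "jpegtran", "-optimize", "-progressive", "-copy", "all",
                    "-outfile", os.path.join(DST, name),
                    os.path.join(SRC, name),
                ])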

    Read the article

  • Transfer nearly an entire file system to a fresh install as smoothly as possible

    - by Xander
    I've got a friend who needs his computer working in just a few hours. His files are safe; however, he managed to corrupt his main install of Windows 7. My plan is to go in with a Linux disk, copy his C:\ to a backup drive I've got, and then reinstall. Restoring many of his files will be pretty simple (documents and such); however, applications won't transfer as easily. Is there any easy way to transfer applications such as MS Office (which he needs in the morning) or other commercial software packages without the hassle of locating the keys and reinstalling them completely? I don't think just moving them over will work, because I'm sure much of that is stored in the registry (validation data and such). Anyway, quick responses would be super nice! Also, additional help would be great!

    Read the article

  • FTP - 530 Sorry, the maximum number of clients...?

    - by aSeptik
    Hi all! I know this is not exactly a coding question, but who among you doesn't use an FTP client? ;-) OK, my problem is that my FTP works great, except when I upload files to one particular client's server. On this server some files upload fine and others do not; they stop halfway through the upload, and then this error is displayed: 530 Sorry, the maximum number of clients (4) from your host are already connected. Unable to make a connection. Please try again. Obviously this is not true; I'm the only one uploading! Has anyone had the same experience? PS: I have tried many different FTP clients; they all display the same error or just hang. Thanks

    Read the article

  • Does Ubuntu Server have any sort of cron job to automatically clear /tmp?

    - by DWilliams
    I know it clears out /tmp on reboot, but I haven't been able to find any sort of cron job on my server that clears /tmp. I recently set up a script that writes lots of files to /tmp, and my server usually goes several months between reboots, so I'm concerned about it becoming cluttered. I've seen several other distros that have a tmpwatch script installed by default; Ubuntu's repository seems to have replaced tmpwatch with tmpreaper. Is there any mechanism in place on Ubuntu (8.04 currently, soon to be upgraded to 10.04 when I get around to it) to clean up temp files on a server that doesn't regularly reboot, or do I need to install tmpreaper?
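
    For illustration only, here is a sketch of the kind of age-based cleanup tmpreaper performs (the retention period and schedule are placeholders; this is not a drop-in replacement, and files still in use should be considered before deleting anything):

        #!/usr/bin/env python
        # Delete regular files under /tmp not accessed for AGE_DAYS days.
        # Run from cron, e.g.:  30 3 * * * /usr/local/bin/clean_tmp.py
        import os
        import time

        TMP = "/tmp"
        AGE_DAYS = 7                       # placeholder retention period
        cutoff = time.time() - AGE_DAYS * 86400

        for dirpath, _dirnames, filenames in os.walk(TMP):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    if os.lstat(path).st_atime < cutoff:
                        os.remove(path)
                except OSError:
                    pass                   # skip files that disappear or are protected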

    Read the article

  • What is Remote Desktop Services in Windows Server 2008 R2 all about?

    - by fejesjoco
    Seriously, I'm lost in all the sales mumbo-jumbo. Let's say I want 1 or 2 users to be able to remotely log on to a server and run Word, Visual Studio, Firefox, and whatever. Do I gain anything at all if I install Remote Desktop Services? Or do I just install the Desktop Experience feature pack, enable Remote Desktop and, voilà, nobody will ever notice the difference? Here's what TechNet says about the Remote Desktop Session Host: A Remote Desktop Session Host (RD Session Host) server is the server that hosts Windows-based programs or the full Windows desktop for Remote Desktop Services clients. Users can connect to an RD Session Host server to run programs, to save files, and to use network resources on that server. Users can access an RD Session Host server by using Remote Desktop Connection or by using RemoteApp. Good old plain Remote Desktop can also host a full Windows desktop for remote clients so that they can run programs, save files and do all that stuff. Why do they write about it as if it were such a great new invention, other than that they want to sell it? RDSH doesn't seem all that different. What do I actually install when I install RDSH, given that all those features are already there in Windows? What's even more confusing is that you need to take special care when installing applications on an RDSH so that they will be usable by many concurrent users. Why? All modern applications install their program files in one directory, store common settings in the ProgramData folder and the HKLM hive, and store user-specific settings in the Users folder and the HKCU hive. They are designed to be usable by many users on the same machine; 2 or 2000 users can use them concurrently without any effort. I can sign in with 2 users on a server with only Remote Desktop enabled, and both of us can run Word or anything else without any problems, can't we? So what changes if I put RDSH into install mode, and what happens if I don't? Why is the feature to switch between install and execute mode there at all? Yes, I know of some advantages of Remote Desktop Services, such as no 2-user limit, support for virtualization and video acceleration, and a whole infrastructure with a gateway, web access, connection broker, etc. But I don't need those, so if you take them away, how are these two technologies different? From the articles it seems as if they are completely different technologies, whereas it looks to me like they are the same at the core, and Remote Desktop Services just adds some extra features without reinventing anything.

    Read the article
