Search Results

Search found 51183 results on 2048 pages for 'local files'.

Page 103/2048

  • Program to set time on local computer using GPS data

    - by dmkerr
    I have a laptop running Windows XP that does not connect to the Internet and I would like to use a USB GPS receiver as a time source to set the clock. Generally, the laptop will not have access to the open sky so getting a strong, multi-satellite connection is not likely. Ideally, I would like to be able to have the program detect the GPS receiver and set the clock without intervention from the user. Is what I'm seeking even feasible?
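
    It does seem feasible in principle. As a rough illustration only (not a finished program), the sketch below assumes the receiver appears as a serial port (COM3 and 4800 baud are guesses), that it emits standard NMEA sentences, that Python with pyserial is available, and that the process runs with administrator rights. It pulls the UTC time out of a $GPRMC sentence and sets the Windows clock through the Win32 SetSystemTime call via ctypes.

        # Sketch: read UTC time from an NMEA $GPRMC sentence and set the Windows clock.
        # Assumes pyserial is installed and the GPS shows up as COM3 at 4800 baud.
        import ctypes
        import serial  # pyserial

        class SYSTEMTIME(ctypes.Structure):
            _fields_ = [("wYear", ctypes.c_ushort), ("wMonth", ctypes.c_ushort),
                        ("wDayOfWeek", ctypes.c_ushort), ("wDay", ctypes.c_ushort),
                        ("wHour", ctypes.c_ushort), ("wMinute", ctypes.c_ushort),
                        ("wSecond", ctypes.c_ushort), ("wMilliseconds", ctypes.c_ushort)]

        def set_clock_from_gps(port="COM3", baud=4800):
            with serial.Serial(port, baud, timeout=5) as gps:
                while True:
                    line = gps.readline().decode("ascii", errors="ignore").strip()
                    if not line.startswith("$GPRMC"):
                        continue
                    fields = line.split(",")
                    if fields[2] != "A":          # "A" = valid fix, "V" = no fix yet
                        continue
                    hhmmss, ddmmyy = fields[1], fields[9]
                    st = SYSTEMTIME(
                        wYear=2000 + int(ddmmyy[4:6]), wMonth=int(ddmmyy[2:4]),
                        wDay=int(ddmmyy[0:2]), wHour=int(hhmmss[0:2]),
                        wMinute=int(hhmmss[2:4]), wSecond=int(hhmmss[4:6]))
                    # SetSystemTime expects UTC, which is what $GPRMC reports.
                    ctypes.windll.kernel32.SetSystemTime(ctypes.byref(st))
                    break

    Even without a multi-satellite fix, many receivers report valid time once a single satellite is tracked, which is why the sketch only checks the validity flag rather than the fix quality.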

    Read the article

  • How to avoid copying corrupted files with rsync

    - by Roberto Aloi
    I have an HDD with plenty of files, some of which are unfortunately corrupted. I'm now trying to copy the good files onto a new HDD. I'm using: rsync -azP SRC TGT When rsync comes to one of the corrupted files, I see a message in the console: rsync: read errors mapping XXX: Input/output error (5) In the target folder I still see the corrupted file, which I'm not able to open and which I have to delete manually. Is there any option to tell rsync not to copy a file after an I/O error?
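
    As a workaround, rather than an rsync option, here is a rough Python sketch that walks the source tree, copies what it can, and skips and records any file that raises a read error, removing any partial copy so nothing unreadable is left on the target. The paths in the usage line are placeholders.

        # Sketch: copy a tree, skipping (and logging) files that raise read errors.
        import os, shutil

        def copy_skipping_bad(src_root, dst_root):
            bad = []
            for dirpath, _dirs, files in os.walk(src_root):
                rel = os.path.relpath(dirpath, src_root)
                target_dir = os.path.join(dst_root, rel)
                os.makedirs(target_dir, exist_ok=True)
                for name in files:
                    src = os.path.join(dirpath, name)
                    dst = os.path.join(target_dir, name)
                    try:
                        shutil.copy2(src, dst)           # copy data + timestamps
                    except OSError as err:               # read error: skip this file
                        bad.append((src, err))
                        if os.path.exists(dst):          # remove any partial copy
                            os.remove(dst)
            return bad

        # bad_files = copy_skipping_bad("/mnt/old_disk", "/mnt/new_disk")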

    Read the article

  • Are PHP session files ever deleted?

    - by GetFree
    I see there are thousands of files in my "/tmp" directory (a CentOS machine) and almost all of them are PHP session files. I'm worried about the possible impact this might have on my system. Are those files ever deleted, either by the OS, Apache, or PHP, or do I have to take care of it myself?
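
    PHP's own garbage collector (governed by the session.gc_* settings) normally removes stale session files, but if they keep piling up, a cleanup script is easy to run from cron. The sketch below is only an illustration: it assumes the default sess_* naming in /tmp and a 24-hour cutoff, which should be kept in line with session.gc_maxlifetime.

        # Sketch: delete PHP session files in /tmp untouched for more than 24 hours.
        import glob, os, time

        MAX_AGE = 24 * 3600              # seconds; keep consistent with gc_maxlifetime
        cutoff = time.time() - MAX_AGE

        for path in glob.glob("/tmp/sess_*"):
            try:
                if os.path.getmtime(path) < cutoff:
                    os.remove(path)
            except OSError:
                pass                     # file vanished or not ours; ignore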

    Read the article

  • Mercurial (hg) commit only certain files

    - by bresc
    Hi, I'm trying to commit only certain files with hg. Because hg seems to auto-add, whenever I try to commit a change it wants to commit all files. But I don't want that, because certain files are not "ready" yet. There is hg commit -I thefile.foo, but that is only for one file. It would be better for me if I could turn off the auto-add behaviour, as in git. Is this possible? Thanks.
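
    For what it's worth, -I can be repeated, and hg commit also accepts explicit file names, so one stopgap is to drive it from a small script that lists only the files that are ready. A minimal sketch, assuming Mercurial is on the PATH and with placeholder paths:

        # Sketch: commit only an explicit list of files with Mercurial.
        import subprocess

        ready_files = ["src/parser.py", "docs/notes.txt"]    # placeholder paths
        subprocess.run(["hg", "commit", "-m", "commit only the finished files"]
                       + ready_files, check=True)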

    Read the article

  • iptables intercept local traffic

    - by Anonymous
    I hope someone can help me out with a somewhat simple task. I'm trying to redirect a client on my router through my desktop PC, so I can dump the traffic and analyze it (it's a potential source of poisoning the network with malicious packets). However, I don't have a second NIC on hand, and I was hoping I could redirect all the traffic from that IP through my PC; in essence, to become a MITM for the client. Does anyone have any idea where to start? Current state: (localip)-(router)-(internet) What I want: (localip)-(pc)-(router)-(internet)

    Read the article

  • Access node.js local server through mobile via same shared wifi

    - by laggingreflex
    EDIT: I was stuck in this situation before, but then it was Apache-related. This time I'm using NodeJS, so the old answer doesn't help. I'm running a NodeJS webserver (on port 80, not Apache this time) on Windows 7. I want to access the webserver from my mobile, which shares the wifi router with my PC locally. http://localhost works from the PC, but I can't access http://192.168.1.4 from either my phone or even my computer. ipconfig /all on my computer lists my IP address as 192.168.1.4 (Wireless LAN adapter Wireless Network Connection: IPv4 Address: 192.168.1.4 (Preferred)). I can ping my phone's (internal) IP address [192.168.1.5] from the PC, and vice versa I can ping my PC [192.168.1.4] from my phone. So why can't I access http://192.168.1.4 from my phone (or the PC)? The firewall is off.
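
    One thing worth checking is what the server is actually bound to: if it was started with something like listen(80, '127.0.0.1') or 'localhost', it will answer http://localhost but not 192.168.1.4, even from the same PC, which matches these symptoms. A rough way to test this from the PC (only a sketch, not a definitive diagnosis) is to try a raw TCP connection to each address:

        # Sketch: check which addresses port 80 actually accepts connections on.
        import socket

        def can_connect(host, port=80, timeout=2):
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return True
            except OSError:
                return False

        for host in ("127.0.0.1", "192.168.1.4"):
            print(host, "->", "open" if can_connect(host) else "refused or filtered")

    If 127.0.0.1 is open but 192.168.1.4 is not, the server is most likely listening on the loopback interface only.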

    Read the article

  • Is there a way to back up all my files and replace them with 0-byte files with the same names?

    - by laggingreflex
    My main drive on my laptop keeps filling up, so I take a backup on a USB drive and delete the original files. But then I find myself getting (downloading, or getting from someone else) files that I already have backed up but couldn't recall at the moment. So is there a way I can keep a 0-byte file with the same name as the backed-up copy, so that when I'm asked whether to overwrite the existing file, I can easily choose no, knowing I probably already have this file in the backup? EDIT: better yet, replace each file with a shortcut (.lnk) pointing to the external drive, so I can access the files hassle-free and not get any errors from 0-byte files being accidentally opened.
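
    The 0-byte variant is easy to script. A minimal sketch (paths are placeholders) that mirrors the backup drive's folder tree as empty files with the same names, so name collisions still trigger the overwrite prompt; swapping touch() for real .lnk shortcuts would need an extra library such as pywin32 and is left out here.

        # Sketch: recreate the backup's folder tree as 0-byte placeholder files.
        import os
        from pathlib import Path

        BACKUP = Path("E:/backup")                 # placeholder: the external drive
        PLACEHOLDERS = Path("C:/backup-stubs")     # placeholder: where the stubs go

        for dirpath, _dirs, files in os.walk(BACKUP):
            rel = Path(dirpath).relative_to(BACKUP)
            stub_dir = PLACEHOLDERS / rel
            stub_dir.mkdir(parents=True, exist_ok=True)
            for name in files:
                (stub_dir / name).touch(exist_ok=True)   # empty file, same name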

    Read the article

  • Preventing my postfix from sending my local users' spam

    - by Jack
    I have a postfix/dovecot mail server with 100 different users. When they send an email they need to be authenticated; I successfully use saslauth to achieve this. A few days ago I had a problem: one specific user, probably with a virus or a spam-bot installed on their computer, started to send thousands of emails through my server in a few hours. As a result, my IP has been blocked by many ISPs (@aol, @yahoo, and others) and has been listed on many blacklists, making all my 100 users unable to send any email to anyone. What is the best practice to avoid this problem? It would be great if my server could recognize a spamming user and automatically block them. Also, a limit of, say, 30 emails per hour could be a partial solution. Any idea how to face this problem? Thank you.
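
    Rate limiting is usually handled by a policy daemon (policyd or postfwd) in front of Postfix rather than by Postfix alone, but even a crude watchdog can flag a runaway account early. The sketch below is only an illustration: it assumes authenticated submissions are logged with sasl_username= in /var/log/maillog (the path and timestamp format vary by distribution) and it prints any user with more than 30 log entries in the last hour.

        # Sketch: flag users with more than LIMIT authenticated sends in the last hour.
        import re, time
        from collections import Counter

        LIMIT = 30
        LOG = "/var/log/maillog"      # assumption: on Debian it is /var/log/mail.log

        counts = Counter()
        hour_ago = time.time() - 3600
        year = time.localtime().tm_year

        with open(LOG, errors="ignore") as f:
            for line in f:
                m = re.search(r"sasl_username=(\S+)", line)
                if not m:
                    continue
                # Classic syslog lines start like "Jun  3 14:07:01" with no year.
                ts = time.mktime(time.strptime(f"{year} {line[:15]}", "%Y %b %d %H:%M:%S"))
                if ts >= hour_ago:
                    counts[m.group(1)] += 1

        for user, n in counts.items():
            if n > LIMIT:
                print(f"possible spammer: {user} sent {n} messages in the last hour")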

    Read the article

  • Can not copy files from Windows 2003 server over network

    - by Mark
    It seems quite strange. I have a shared folder with full read/write permissions on my Windows 2003 server. With an XP client, I can create a new folder in the share and copy files to it normally, but I cannot copy those files back to my client PC. I tried using FTP and WebDAV to get the files from the server; neither worked. Is the issue related to NETWORK SERVICE? Thanks for your help.

    Read the article

  • ZFS, dedupe and PST files

    - by Unreason
    I am interested to know what the expected maximum dedupe ratio would be for a set of PST files. I have ~40 GB of PST files from ~15 users with a high level of duplication of attachments. I am running tests to see if I can get significant space savings by storing the data on ZFS with dedupe enabled. For this purpose I have installed a test setup of Nexenta, but I was wondering if someone here had already done this and what level of deduplication I might expect (in other words, how sensitive are PST files to block alignment, and what parameters can influence the ratio?). Initial tests show a very low dedupe ratio, and I did find an explanation that block-level dedupe would not be efficient here and that byte-level dedupe would be much better (and that it should be performed by an application that is aware of the internal organization), so I am just double-checking here if someone has more input. Otherwise I will probably convert the PST files to IMAP.

    Read the article

  • How to avoid compressing compressed files

    - by Gzorg
    Most compression programs compress all files by default. But when archiving a folder containing already-compressed files, there is no need to compress them a second time: archives, packed setup programs, JPEGs, movies, MP3s, and so on. Are there any compression programs that allow an arbitrary list of file types to be stored (uncompressed) while the others are still compressed? It looks like WinRAR can't. I expect this would be doable with tar+gz/bzip2 and some scripting in various ways. Edit: WinRAR can.
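
    If scripting it yourself is acceptable, the idea is straightforward with the zip format: store already-compressed types as-is and deflate everything else. A rough Python sketch using only the standard library (the extension list and paths are just examples):

        # Sketch: build a .zip, storing already-compressed file types uncompressed.
        import os, zipfile

        ALREADY_COMPRESSED = {".zip", ".rar", ".7z", ".gz", ".jpg", ".png",
                              ".mp3", ".mp4", ".mkv", ".avi"}

        def archive(folder, out_path):
            with zipfile.ZipFile(out_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
                for dirpath, _dirs, files in os.walk(folder):
                    for name in files:
                        full = os.path.join(dirpath, name)
                        arcname = os.path.relpath(full, folder)
                        ext = os.path.splitext(name)[1].lower()
                        method = (zipfile.ZIP_STORED if ext in ALREADY_COMPRESSED
                                  else zipfile.ZIP_DEFLATED)
                        zf.write(full, arcname, compress_type=method)

        # archive("C:/data/to_backup", "C:/data/backup.zip")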

    Read the article

  • DOS application to allow remote management of files over serial link

    - by tomlogic
    Harken back to the days of DOS. I have an embedded DOS handheld device, and I'm looking for a tool to manage the files stored on it. I picture an application I can launch on the device that opens COM1 up for commands to get a directory listing, send/receive files via x/y/zmodem, move/delete files, and create/move/delete directories. A Windows application can then download a recursive file listing and then manage those files (for example, synchronizing with a local directory). Keep in mind that this is DOS -- 8.3 filenames, 640K of RAM and a 19200bps serial link (yuk!). I'd prefer something with source in case we need to add additional features (for example, the ability to get a checksum of a file for change detection). Now that I've written this description, I realize I'm asking for something like LapLink or pcAnywhere. Norton no longer sells DOS versions of pcAnywhere and LapLink V for DOS seems pricy at $50. Are you aware of any similar apps from those good old days?

    Read the article

  • Speed up deletion of a large number of files on NTFS volumes

    - by sharptooth
    Every now and then I need to delete a folder containing something like 500k files from an NTFS volume. I do this with Windows Explorer. Since NTFS journals all of its metadata changes, each deletion is carried out serially, so deleting the whole 500k files takes ages. I remember that when I did the same on FAT32 it ran incomparably faster. Is there any way to speed up the deletion of a large number of files on NTFS volumes?

    Read the article

  • Google Drive and sync?

    - by Royi Namir
    Two questions, please: Where are Google Drive offline docs stored on my computer? Which folder? (When accessing docs.google.com/offline.) I have a text file in my Google Drive. When I click on it, I can only view it (no editing). The only option to edit it is to export it to Google Docs, but then I have two files: the original text file and the editable one. So now I have to sync both the regular file AND the version created by Google Docs. Is that the normal behavior?

    Read the article

  • How to make a local apache server public/visible?

    - by George
    Hello. I am running an Apache2 server on Fedora 13. I'd like to make it publicly accessible (visible). For example, when somebody types http://my.ip.numbes/ I'd like them to see what I have in my document root folder, just for a presentation of course work at university. Permissions are set to 755, the user owning the document root is apache, and SELinux is temporarily disabled. But port 80 is closed. I tried to open it by adding an entry to iptables and restarting the rules, but nothing changed. I guess I am missing something big here. Help would be greatly appreciated. Note: I have a static (public, real) IP address.

    Read the article

  • Program to swap files between drives?

    - by josi
    Has anyone built a program/script to transfer files between two hard drives when both are nearly full? That is, one side copies a file over, then the other side copies a file back, and then each deletes the files that were copied. It's kind of annoying: I have a 6 TB RAID at about 4 TB full and a 4.5 TB drive that is basically full, and I can't really swap them easily without doing many copies and deletes of files. Does anyone know a way to make them just swap? lol
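
    I'm not aware of an off-the-shelf tool, but the alternating-move idea is scriptable. A rough sketch follows (drive roots are placeholders, there is no error handling, and files land under a "swapped" folder on the other side): each pass moves one file from whichever queue still has candidates, skipping any file that would push the destination below a safety margin of free space.

        # Sketch: swap the contents of two nearly-full drives one file at a time.
        import os, shutil

        SAFETY = 1 * 1024**3                 # keep at least 1 GiB free on the destination

        def files_of(root):
            for dirpath, _dirs, names in os.walk(root):
                for n in names:
                    yield os.path.join(dirpath, n)

        def move_one(src_root, dst_root, queue):
            for path in queue:
                size = os.path.getsize(path)
                if shutil.disk_usage(dst_root).free - size < SAFETY:
                    continue                 # no room for this one yet, try another
                rel = os.path.relpath(path, src_root)
                dst = os.path.join(dst_root, "swapped", rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(path, dst)       # copies then deletes when crossing drives
                queue.remove(path)
                return True
            return False

        a, b = "D:/", "E:/"                  # placeholder drive roots
        queue_a, queue_b = list(files_of(a)), list(files_of(b))
        while True:
            moved = move_one(a, b, queue_a)
            moved = move_one(b, a, queue_b) or moved
            if not moved:
                break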

    Read the article

  • Realtime file-level mirroring from local NTFS to network drive

    - by hurfdurf
    We have some data collection machines running WinXP. After a new file is written, we would like to immediately copy the new file to network storage (a NetApp CIFS share) automagically. We need realtime or near realtime copies generated (copy upon filehandle close would be fine -- these are not long-running system logs). Two commercial applications I've found so far are MirrorFile and IBM's Tivoli CDP. Are there any reliable open source programs or simple ways to get Shadow Copy to do something similar? Bonus points if it runs as a service.
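
    If a scripted option is acceptable alongside the commercial tools, one way to prototype the idea is the Python watchdog library, which on Windows sits on top of the native directory-change notifications. The sketch below is only an outline: the paths are placeholders, it reacts to file creation rather than filehandle close, and it ignores the race where a file is still being written when the event fires.

        # Sketch: copy each newly created file to the CIFS share as it appears.
        import shutil, time
        from pathlib import Path
        from watchdog.observers import Observer
        from watchdog.events import FileSystemEventHandler

        WATCHED = Path("C:/collected")                # placeholder local folder
        TARGET = Path("//netapp/instruments/raw")     # placeholder CIFS share

        class Mirror(FileSystemEventHandler):
            def on_created(self, event):
                if event.is_directory:
                    return
                src = Path(event.src_path)
                dst = TARGET / src.relative_to(WATCHED)
                dst.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dst)                # best effort; no retry logic

        observer = Observer()
        observer.schedule(Mirror(), str(WATCHED), recursive=True)
        observer.start()
        try:
            while True:
                time.sleep(1)
        finally:
            observer.stop()
            observer.join()

    Wrapping something like this as a Windows service would cover the bonus-points requirement, but that part is not shown here.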

    Read the article

  • Why would anacron not be running?

    - by Rory
    I have an Ubuntu system that has anacron installed. However, I'm pretty sure it's not running. It's not running the commands in /etc/cron.daily to rotate the syslog files (I'm using sysklogd, which has its own log-rotation method rather than logrotate). The last time the logs were rotated was in October 2009. /var/spool/anacron/cron.daily exists and its contents are 20091015. AFAIR we had a power outage then, and everything rebooted. How can I debug anacron? How can I see why it's not running? My first instinct is to look for /var/log/anacron, but that's not there. How can I fix it to make it run again?

    Read the article

  • 7-Zip many files from different folders?

    - by mafutrct
    I would like to add a large number of files with different names from different folders to a single 7-Zip archive using 7za.exe. This should be simple, but it turned out to be a major pain. I created a file that contains the paths (7za a out.7z @list.txt), but once there are too many (~100) files, it fails. Apparently the content of the argument file is pushed onto the command-line buffer [Edit: this was likely misinformation on my part; either way it was not the reason], which is far too small (the number of files to add is more than one million). Splitting the process up by adding the files one by one is not feasible due to the way 7za works: when adding the next file, it creates a copy of the archive, adds the file to the copy, and finally replaces the original. This is terribly slow once the archive grows to a couple of hundred MB in size. So far I am using a combination of the two approaches, adding a dozen files each time in a loop, but it is an unreliable hack and still very slow. Is there a better way to do it? I tried to use 7-Zip wrapper DLLs (I'm a C# programmer), but none of them worked reliably, and it was repeatedly suggested that I just use 7za instead.
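
    Until a better route turns up, the batching loop can at least be made less fragile: write a fresh list file per chunk and let 7za add a few thousand entries at a time, so each archive rewrite is amortised over many files rather than a dozen. This is essentially the hack already described, just scripted with larger batches; the chunk size is arbitrary and the list-file syntax is the same 7za a out.7z @list.txt form quoted above.

        # Sketch: add files to a 7z archive in chunks via temporary list files.
        import os, subprocess, tempfile

        CHUNK = 5000                      # arbitrary batch size

        def add_in_chunks(archive, all_paths):
            for i in range(0, len(all_paths), CHUNK):
                chunk = all_paths[i:i + CHUNK]
                with tempfile.NamedTemporaryFile("w", suffix=".txt",
                                                 delete=False) as listfile:
                    listfile.write("\n".join(chunk))
                try:
                    subprocess.run(["7za", "a", archive, "@" + listfile.name],
                                   check=True)
                finally:
                    os.remove(listfile.name)

        # add_in_chunks("out.7z", paths_collected_from_many_folders)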

    Read the article

  • Firefox Does NOT get local site cookie

    - by Campo
    This is a weird one. We have a production server (Server 2008) and two staging servers (Server 2008 and Server 2003), and I have sites on all of these. They all use cookies. On the production server, when browsing to our site www.supernovainteractive.com, there is a cookie that detects when you visited the site, and it will not replay the logo animation (top left-hand side) when you click through to another page. This works in all browsers on the production server. I’m not sure what’s going on, but for some reason cookies are not working on one site on the 2008 staging server only. This happens when browsing with Firefox (3.6.3); they work fine in all other browsers (IE, Chrome, Safari, Opera). In addition, the 2003 staging server works fine. You can test on the Supernova Interactive site by noticing the logo in the top left corner: it uses a cookie to detect whether you’ve already seen the animation. Once you’ve seen it once, it doesn’t animate again until tomorrow. Currently, it’s animating every time. I have opened an outside-facing port so others can see the issue: Http://exchange.supernova.com:10009 Any ideas on this one? Firewalls are off on the server. Notice you do not get a cookie from Exchange.supernova.com.

    Read the article

  • Local Only 3G Router via Broadband Device?

    - by GateKiller
    I am looking for the name/type of device that does the following: it connects to the internet via ethernet or wireless and then produces a "fake" 3G signal for my iPhone to connect to. The 3G signal in my office is very weak or non-existent, and I need a way of boosting or replacing it.

    Read the article

  • How to handle files that don't need version control in mercurial

    - by richardh
    I am new to mercurial, and for the most part I do LaTeX reports and statistical calculations in R using .csv and/or .sqlite files. Re LaTeX, all I really care about is the .tex file. Re R, I don't need version control on the .csv or .sqlite files because they are static. When I do 'hg add' for a repo with a .csv and/or .sqlite file, I get a warning like: rev2.sqlite: up to 3070 MB of RAM may be required to manage this file (use 'hg revert rev2.sqlite' to cancel pending addition) So I revert and subsequently use adds like hg add -X *.sqlite. I guess I really have two questions: (1) Should I ignore these warnings? Because these large files are static, can I just add them to the repo, knowing that the diffs will always be empty, and not worry about wasted resources? (2) If I should keep excluding these files from the repo, is there a way to make this option the default? I.e., can I add something to my .hgrc file that always appends an option like -I *.tex -I *.R to my 'hg add' commands? Thanks!

    Read the article

  • Routing connections to pass through a local machine

    - by xiamx
    Please tell me if what I'm trying to do is feasible. I have a router named "R" which is connected to the WAN. R allows adding rules to the routing table. There are numerous machines connected to the LAN ports of R; they all have IP addresses 192.168.1.* assigned by DHCP on R. Among those machines there's a machine C with IP address 192.168.1.100. I want all traffic from the other machines on the subnet to pass through machine C, where some filtering and logging will be done. Is this possible? Is there a name for what I'm trying to do (so I can do more googling later)?

    Read the article

  • Local Network - Windows 7 and Vista can't see each other

    - by ca8msm
    I've got a strange issue at home that has been bugging me for weeks, but I really need to get it sorted now, so I'll detail as much as I can and hopefully someone can spot what might be wrong. I have a wireless router connected to the internet and 3 devices connected to it. They are:

    Name     OS         Network    IPv4
    PC1      Windows 7  WORKGROUP  192.168.2.2
    LAPTOP1  Vista      WORKGROUP  192.168.2.3
    PS3                            192.168.2.4

    They all get their IP addresses dynamically. Both PC1 and LAPTOP1 can ping PS3 and get a response. PC1 and LAPTOP1 are unable to ping each other by IP address; they can only reach each other by name (which bizarrely shows that it is pinging via the IPv6 address). To confirm this, both PC1 and LAPTOP1 can also ping each other via the long IPv6 address that each has, so they can obviously see each other, just not via IPv4. I've disabled the firewalls on both machines as well, to rule that out. I don't really know what IPv6 is used for, and I've tried disabling it on both machines, but then neither machine can see the other at all. Does anyone have any idea what may be stopping them from seeing each other, any ways I can look at fixing this, or any network tools that may help identify where it is failing? Thanks, Mark

    Read the article
