Search Results

Search found 5793 results on 232 pages for 'ftp sync'.

Page 155/232 | < Previous Page | 151 152 153 154 155 156 157 158 159 160 161 162  | Next Page >

  • How to fix audio/game stuttering in Google Chrome's Flash plug-in?

    - by Simon Belmont
    I'm having an issue. Windows XP, running the latest Chrome 23 build. I'm using Flash 11.5 built into Chrome (Pepper Flash). It runs horribly. Chrome 22 did not have this issue as far as I recall. What a shame. YouTube videos stutter badly, and after a while the audio begins to lag and lose sync with the video. I disabled Pepper Flash and tested HTML5 video in YouTube, and it was smooth as glass. Additionally, certain Flash-based games are almost unusable now. The plug-in uses 100% CPU and lags horribly in these games. Google/Adobe, please fix this. I shouldn't have to disable the built-in Flash plug-in (with its added sandboxing security) and use regular Flash to resolve this. Short of waiting for an update to Chrome, does anyone have a better way to fix this? I am all ears.

    Read the article

  • Apache mpm-itk Performance

    - by Matt Beckman
    I manage a bunch of VPSs with memory ranging from 1GB to 8GB. Most of these websites are Joomla websites, and the servers must support multiple sites/users/S-FTP. I use mpm-itk almost exclusively (mostly due to its convenience in these shared environments). However, I'm aware it isn't known for performance, so I need some advice on making it faster. Due to the lack of documentation when I first went the way of mpm-itk, I included only one setting in the config, and that was to limit each user to 50 clients (the rest I left at defaults):

        <IfModule mpm_itk_module>
            MaxClientsVHost 50
        </IfModule>

    Are there any better alternatives available? Are there any settings supported in mpm-prefork or mpm-worker that are also supported in mpm-itk? Thanks!
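
    For reference, a minimal sketch of a fuller per-vhost mpm-itk setup (the domain, user, and path names are placeholders; AssignUserID and NiceValue are the other stock mpm-itk directives besides MaxClientsVHost):

        <VirtualHost *:80>
            ServerName site1.example.com
            DocumentRoot /home/site1/public_html
            <IfModule mpm_itk_module>
                # Run this vhost's requests as the site owner
                AssignUserID site1 site1
                # Cap concurrent clients for this vhost
                MaxClientsVHost 50
                # Lower the CPU priority so one busy site can't starve the rest
                NiceValue 10
            </IfModule>
        </VirtualHost>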

    Read the article

  • How to properly store dotfiles in a centralized git repository

    - by asmeurer
    I'd like to put all my dotfiles (like .profile, .gitconfig, etc.) in a central git repository, so I can more easily keep track of the changes. I did this, but I would like to know how to properly handle keeping them in sync with the actual ones in ~/. I thought that you could just hard link the two using ln, but this does not seem to work as I expected, i.e., if I edit one file, the other does not change. Maybe I misused the ln command, or else I misunderstand how hard links work. How do people usually do this? Judging by GitHub, it's a pretty popular thing to do, so surely there's a seamless way to do it that someone has come up with. By the way, I'm on Mac OS X 10.6.
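
    A minimal sketch of the symlink approach many people use instead (assuming the repository is cloned at ~/dotfiles; hard links tend to break because many editors save by writing a new file and renaming it over the old one, which leaves the other link pointing at the stale copy):

        #!/bin/sh
        # Link each tracked dotfile from the repo into $HOME.
        # An existing real file is backed up rather than overwritten.
        REPO="$HOME/dotfiles"
        for f in .profile .gitconfig; do
            if [ -e "$HOME/$f" ] && [ ! -L "$HOME/$f" ]; then
                mv "$HOME/$f" "$HOME/$f.bak"
            fi
            ln -sf "$REPO/$f" "$HOME/$f"
        done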

    Read the article

  • Picking up a lot of failed authentications for various accounts

    - by Josh K
    My server is getting a lot of various failed authentication attempts for various accounts. The most common one (that I've seen) is for the root account. I have since enabled Fail2Ban and run several rootkit/malware checks to ensure I wasn't compromised. Is there anything else I should do? I only have three accounts enabled, and SSH access for only two. I have a full 48hr ban on anyone making more than six failed SSH login attempts. I do not have FTP enabled.
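
    For reference, a minimal sketch of the Fail2Ban jail settings that express that policy (the jail name and log path vary by distro; bantime is in seconds, and 48 hours is 172800):

        # /etc/fail2ban/jail.local
        [ssh]
        enabled  = true
        port     = ssh
        filter   = sshd
        logpath  = /var/log/auth.log
        maxretry = 6
        bantime  = 172800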

    Read the article

  • What is the simplest and fastest way to transfer large file through a Windows network?

    - by Sake
    I have a Windows 2000 Server machine running MS SQL Server that stores over 20GB of data. The database is backed up every day to the second hard drive. I want to transfer those backup files to another computer to build another test server and for recovery practicing. (The backup never actually got restored for almost 5 years. Don't tell my boss about that!) I have trouble transferring that huge file over the network. I've tried plain network copy, Apache download, and FTP. Every method I tried ends up failing when the amount of data transferred reaches 2GB. The last time I successfully transferred the file, it was through a USB-attached external hard drive. But I want to perform this task routinely and preferably automatically. I wonder what the most pragmatic approach for this situation is?
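
    A minimal sketch of a scheduled pull with robocopy (available for Windows 2000 via the Resource Kit; the paths and share name are placeholders). Restartable mode resumes an interrupted transfer instead of starting over, which helps with multi-GB files:

        rem copy-sql-backups.bat -- run nightly from Task Scheduler
        rem /Z = restartable mode, /R = retries per file, /W = seconds between retries
        robocopy D:\SQLBackups \\testserver\backups /Z /R:5 /W:30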

    Read the article

  • Questions about Domains and DNS

    - by ShoX
    Hi, I am totally new to the DNS and server hosting world and not quite sure what I need. I want to get a domain and forward it to my own server, so that the user sees example.com in the URL bar and example.com/foo/bar will work. Depending on the subdomain, it should do different things (a different base directory on the web server, FTP, etc.). Also, my email should be able to be sent to and received by that server. What confuses me is the fact that in the A record I can only list IP addresses and no ports. So do I have to set up a nameserver on my own server? Or do I accomplish this via vhosts on my webserver? I would appreciate any help or a link to a tutorial. I know how DNS works, know some basic Apache stuff, etc., so no need to explain that. Thanks
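
    For the subdomain part, a minimal sketch of name-based virtual hosts in Apache (the domain names and paths are placeholders): DNS only maps each name to the server's IP; Apache then picks a directory by the Host header, which is why no ports appear in the A record.

        # All names point at the same IP in DNS; Apache routes by name.
        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName example.com
            DocumentRoot /var/www/main
        </VirtualHost>

        <VirtualHost *:80>
            ServerName files.example.com
            DocumentRoot /var/www/files
        </VirtualHost>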

    Read the article

  • One-Way Backup Service? [closed]

    - by Jon Rodriguez
    Up until a month ago, my girlfriend used MobileMe to back up all the files on her MacBook. This turned out terribly when a quirk of MobileMe caused it to erase all of her files on MobileMe, and then sync the newly-erased state down to her computer, erasing everything. A week's worth of college essays and CS homework were gone. Now, I am terrified of any commercial cloud-backup solution because of the possibility of this happening. Going off the list provided in these answers, could you please help me find a good backup service that is completely one-way? I want a service where there is literally not a single line of code that has the possibility of writing to my computer's drive. I want a pure one-way backup service.

    Read the article

  • Do I have to chmod 777 my NFS folder when I share?

    - by luckytaxi
    Under Red Hat, if I export a folder as an NFS mount, does the folder have to have RW for users/groups/others? Right now /storage/software is -rwxr-xr-x root/root, i.e., in /etc/exports:

        /storage/software *(rw,sync)

    On my client, I can mount but I can't write. I'm using a regular user and NOT root. I think "no_root_squash" fixes it but I really don't want that. Then again, nor do I want to have to chmod 777 the folder on the server.
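
    A minimal sketch of the usual middle ground (an assumption that a dedicated owner works here; all_squash/anonuid/anongid are standard export options that map every client user to one server-side owner, so the directory never needs to be world-writable):

        # /etc/exports -- map all client users to one local owner
        /storage/software *(rw,sync,all_squash,anonuid=1050,anongid=1050)

        # on the server: hand the directory to that owner
        chown 1050:1050 /storage/software
        chmod 775 /storage/software
        exportfs -ra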

    Read the article

  • How do i restart my linux server in every 2 days via crontab?

    - by Barkat Ullah
    I have a Linux server running the OS version below:

        Linux 2.6.32-220.7.1.el6.x86_64

    I want to restart it every 2 days; please help me do it via crontab. Another point: I use the setup below to drop my memory caches every hour.

        0 * * * * /root/clearcache.sh

        #!/bin/sh
        sync; echo 3 > /proc/sys/vm/drop_caches

    But for the first 15 minutes of every hour, my server is very slow after the caches are cleared; my sites do not load during those 15 minutes. Alternatively, if I restart my server, the caches are also cleared. So I decided to restart my server every 2 days to drop the caches. Will restarting help? Or is there any other way to drop my memory caches that will not slow down my server?
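
    For reference, a minimal sketch of the crontab entry for the restart (one caveat: */2 in the day-of-month field fires on odd-numbered days, so the end of a 31-day month produces reboots on two consecutive days):

        # /etc/crontab -- reboot at 04:00 every second day of the month
        0 4 */2 * * root /sbin/shutdown -r now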

    Read the article

  • Backing up data in an encrypted way

    - by Eli Bendersky
    I have the following use case:

    1. There's some data on my PC I want to periodically back up online.
    2. I own some hosting, so I want to use that for the backups; I don't want to pay for another backup service.
    3. I want to encrypt my data locally prior to moving it to the server.

    I have no problem writing scripts to automate the process (say, periodically generate the backup and upload it by FTP to my server), but my main question is about step 3, the encryption: which way is recommended to encrypt my files (say, collected into a .ZIP) prior to uploading to the server? P.S. TrueCrypt seems popular but it's not quite what I'm looking for, since I don't want the files to be constantly encrypted here on my PC.
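
    A minimal sketch of step 3 using GnuPG's symmetric mode, one common choice for this (the archive name is a placeholder, and in a real script the passphrase should come from a protected file rather than an interactive prompt):

        #!/bin/sh
        # Collect the data, encrypt it locally, upload only the ciphertext.
        STAMP=$(date +%Y%m%d)
        zip -r "backup-$STAMP.zip" "$HOME/Documents"
        # AES256 symmetric encryption; prompts for a passphrase
        gpg --symmetric --cipher-algo AES256 "backup-$STAMP.zip"
        # backup-$STAMP.zip.gpg is what gets uploaded; drop the plaintext
        rm "backup-$STAMP.zip"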

    Read the article

  • Firewall GPO not applying despite being enumerated by gpresult

    - by jshin47
    I have a need to open up the admin$ share on all of my domain's client PCs, and I am trying to do so using group policy. I defined computer policy for Windows Firewall with Advanced Security in a policy object linked to the appropriate container and added the appropriate rules. However, they are not being applied! I feel like I have tried all of the obvious steps: I've checked gpresult, and the resulting set of policy is the way that I would expect it to look. I've run gpupdate /force and gpupdate /sync on a few client computers, but no matter what I do they don't seem to respond to my changes. I know that other computer policies in the GPO are being applied, so it is strange that these are not. I have also disabled exceptions on clients in the firewall GPO, but that doesn't seem to be applying either. Here is a screenshot of firewall.cpl from a client: Basically, although other options in the same GPO ARE applied for computer policy, the firewall settings seem to be ignored.

    Read the article

  • Download folders from dev server to local drive

    - by Niall Collins
    I am developing a .NET web application in a local environment. I have a dev server that the application is installed on. Within the web application on the dev server, I have four folders that I don't have locally and that are controlled by another application. In my day-to-day development I require the four folders on my local PC. I would like to automate the process of pulling the folders from the dev server to my local drive, so I can keep things in sync. Ideally something like this (see the sketch below):

    1. Run a file from the main folder (be it a bat file, PowerShell, some sort of job; open to recommendations).
    2. It downloads the 4 folders supplied to it.
    3. The first download brings everything down; from then on, it only pulls the changes.

    Not sure where to start with achieving this, but I would appreciate any help. I know there are apps out there that do something like this, but I would like to have a go at writing something to do it before I resort to using one of them.
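
    A minimal sketch as a batch file using robocopy (the share and folder names are placeholders; /MIR makes each run copy only what changed and removes local files that were deleted on the server):

        @echo off
        rem pull-dev-folders.bat -- mirror four dev-server folders locally
        set SRC=\\devserver\webapp
        set DST=C:\work\webapp
        for %%F in (Folder1 Folder2 Folder3 Folder4) do (
            robocopy "%SRC%\%%F" "%DST%\%%F" /MIR /R:2 /W:5
        )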

    Read the article

  • Linux servers seeing bad download performance behind Sonicwall firewall

    - by Joshua Penix
    I'm working with a pair of co-located CentOS Linux servers sitting behind a Sonicwall PRO 2040 Enhanced firewall running in transparent bridge mode. These servers are having a strange problem downloading files more than a few megabytes in size. For example, if I try to wget or FTP a copy of the Linux kernel from kernel.org, the first ~1-2MB will download at 600+K/s, and then throughput will drop off a cliff to 1K/s. I've reviewed all the firewall configuration settings for anything suspicious, but found nothing. More interestingly, I performed the same download with a Windows server sitting behind the same firewall, and it sailed right through at 600+K/s the whole way. Has anyone seen this? Where should I start looking to troubleshoot this problem?

    Read the article

  • Looking for a good Web Server that is cheap

    - by SoLoGHoST
    I am a Project Manager, and former Lead Developer, for a software portal system that requires forum software to run. I am in need of hosting that is cheap, reliable, and supports the latest PHP (5.2+), MySQL, unlimited e-mail accounts (preferably), cPanel, and multiple subdomains (at least 3). Currently I am paying $34.95 USD/month (approx. $420 USD/year). This is too high for me to pay to keep the site running. I just recently became Project Manager, in charge of finances, and I'm extremely concerned for the future of Dream Portal. With those prices I'm not sure I'll be able to keep it running for too long. Can someone please tell me of a good host that meets all of the requirements I listed above and is cheaper on a yearly basis? Note: currently on a dedicated server with limited disk space at 15000 MB (15 GB), monthly bandwidth = 500000 MB, a 50-email limit, a 20-subdomain limit, 30 FTP accounts, and 25 SQL databases.

    Read the article

  • Secure Apache Virtual Hosts?

    - by Dr Hydralisk
    I am going to host a few small sites on a VPS, and each of them is going to run my own custom PHP scripts. I am fairly certain that they are secure (did everything in the book, plus some of which is not in the book) to make sure they can't be exploited. But just to be safe, I want to know how I could secure each of the virtual hosts so that they can't escape from their virtual host's folder (if a hacker uploaded a shell, they could not go above the www folder, just as a legitimate user can't in FTP no matter how many times they click ..) on Debian and Apache.
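
    A minimal sketch of one common containment layer, assuming mod_php (the names and paths are placeholders; open_basedir limits which paths PHP code in that vhost may touch, which is the PHP-level analogue of an FTP chroot):

        <VirtualHost *:80>
            ServerName site1.example.com
            DocumentRoot /var/www/site1/www
            # PHP in this vhost can only read/write under its own tree and /tmp
            php_admin_value open_basedir "/var/www/site1/www:/tmp"
        </VirtualHost>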

    Read the article

  • How to share iPhoto library between multiple users

    - by Mark
    I am looking for advice on the best way to share an iPhoto library between two users on the same Mac. I currently use this approach and it works fine most of the time. One issue I have is that I will get a permission error when syncing my iPod. This seems to happen if the other user has used iPhoto and I have not opened iPhoto before syncing the iPod. If I open iPhoto then sync the iPod again there is no error. How do others solve this problem?

    Read the article

  • How to analyse logs after the site was hacked

    - by Vasiliy Toporov
    One of our web projects was hacked. The attacker changed some template files in the project and one core file of the web framework (it's one of the famous PHP frameworks). We found all the corrupted files with git and reverted them. So now I need to find the weak point. With high probability, we can say that it was not ftp or ssh password theft. The support specialist at the hosting provider (after log analysis) said that it was a security hole in our code. My questions:

    1. What tools should I use to review the access and error logs of Apache? (Our server distro is Debian.)
    2. Can you give tips for spotting suspicious lines in logs? Maybe tutorials or examples of useful regexps or techniques?
    3. How do I separate "normal user behavior" from suspicious behavior in the logs?
    4. Is there any way of preventing such attacks in Apache?

    Thanks for your help.
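
    For question 2, a minimal sketch of the kind of grep passes people start with on Apache access logs (the log path and patterns are assumptions; adjust them to the framework's URLs):

        #!/bin/sh
        LOG=/var/log/apache2/access.log
        # POST requests to scripts -- injected code usually arrives via POST
        grep ' "POST ' "$LOG" | less
        # Classic traversal and remote-inclusion markers in query strings
        grep -E '\.\./|=https?://|base64_decode|eval\(' "$LOG"
        # Top client IPs by request count, to spot automated probing
        awk '{print $1}' "$LOG" | sort | uniq -c | sort -rn | head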

    Read the article

  • How do I install something from source and make it available to root?

    - by pwny
    I have a CentOS VM and I need to install the latest version of Ruby on it. Unfortunately, yum only makes Ruby 1.8.6 available, so I'm trying to install Ruby from source. Here's what I'm using:

        cd /usr/src
        sudo -s
        wget http://ftp.ruby-lang.org/pub/ruby/1.9/ruby-1.9.3-p125.tar.gz
        tar -xvzf ruby-1.9.3-p125.tar.gz
        cd ruby-1.9.3-p125
        ./configure
        make && make install

    The problem is that once that's done, I can only use Ruby as a regular user, but I need to use it as root to install some gems. For example, as a regular user I can do ruby -v and it works, but sudo ruby -v outputs bash: ruby: command not found. What am I missing to make stuff I install from source available to all users?
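
    A likely culprit (an assumption, but it matches the symptom): sudo resets PATH to its secure_path setting, which on CentOS does not include /usr/local/bin, where make install puts the binary. A minimal sketch of two ways around it:

        # One-off: keep the caller's PATH for this command
        sudo env "PATH=$PATH" ruby -v

        # Permanent: run "sudo visudo" and append /usr/local/bin to the
        # existing secure_path line, e.g.:
        #   Defaults secure_path = /sbin:/bin:/usr/sbin:/usr/bin:/usr/local/bin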

    Read the article

  • scp vs netatalk, samba, and/or vsftpd with External USB drive

    - by KitsuneYMG
    I set up an Ubuntu server machine to share an ext2-formatted external USB drive. When attempting to copy a single 275MB file from said device through netatalk, I get estimated download times of around 45 minutes. With samba and ftp (using vsftpd) I get 1+ hours! Using scp to copy the file results in a complete download within 5 minutes. Another option, ssh+cp from the external device to ~ and then using netatalk to grab it from there, results in a total time of around 7 minutes. Does anyone have a clue what is misconfigured? Assuming that nothing is, is there any fs/pseudo-fs that would use the internal hdd as an intermediate location/onion-layer for the external hdd (for reads only)? Details, from AppleVolumes.default:

        /mnt/ext USB allow:username cnidscheme:cdb options:usedots,upriv

    Read the article

  • to measure throughput of testing device connect to server via AP

    - by samantha
    Description of the project: I have a test tool to which a DUT (device under test) connects. The test tool has an access point in it, and once the DUT is connected to it via MAC address, we check RSSI and some other WiFi features of the DUT. Now I am wondering whether there is any way to measure the throughput of the device under test, via its MAC address, from the server side. The test tool runs Linux Fedora 11, and most of the coding is done in C/C++ and JSON commands. Previously, I tried installing an ftp server on the test tool: the DUT can connect to it and we can measure the throughput or data transfer rate, but this is not a feasible solution as it requires a lot of intervention from the DUT. What I am interested in is:

    1. Running some script on the server side / test tool that gives me the throughput of the connected device, maybe via the DUT's MAC address, OR
    2. A server-side script that transfers some files/packets to the DUT so we can measure the throughput.

    Coding is not a major challenge at this stage; I just need some tool which requires minimum intervention from the DUT.
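
    One candidate worth evaluating against the low-intervention constraint (an assumption that a small client can run on, or on behalf of, the DUT): iperf measures TCP throughput between two endpoints with one command on each side:

        # On the test tool (server side):
        iperf -s

        # On or on behalf of the DUT, pointed at the test tool's IP:
        iperf -c 192.168.1.1 -t 10    # 10-second TCP throughput test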

    Read the article

  • Single device that can work as tablet and as desktop PC

    - by flow
    I have a MacBook Pro and an iPad 2. I tried to use the iPad 2 for work when I am out of the office, but this is not very comfortable, since one has to sync documents, not all OS X Lion apps exist for the iPad 2, and so on. I think a solution would be a single tablet device that, at the office, can be connected to a docking station, giving me the equivalent of my MacBook, while out of the office I can touch the screen and use the same applications on the road. As far as I know, the iPad 2 does not and will not allow this. Therefore I wonder if you could recommend another piece of hardware/tablet/etc. that could work this way now, as of March 2012.

    Read the article

  • Redundant Web Space

    - by alisia123
    I have the following problem. My domain is registered with service "A". My web space (not a server) is with GoDaddy. About once a week my service is unreachable, and I am sure that it is a GoDaddy problem. My idea is to buy some web space from a different provider to make my service redundant. Note: my service is only a few HTML files with JavaScript; I don't need to sync. But how do I do it? Where do I say "there are two web spaces; if one is not reachable, use the other"? Thanks a lot

    Read the article

  • Using rsync with link-dest from HFS to NTFS

    - by Tom
    Hi, I'm having a problem with rsync. I'm on a Mac and I'd like to sync my everyday changes from my HFS+ partition to my NTFS-formatted networked drive. Pretty simple, and everything goes well except that it re-syncs every file each time. Here's my script:

        #!/bin/sh
        snapshot_dir=/Volumes/USB_Storage/Backups
        snapshot_id=`date +%Y%m%d%H%M`

        /usr/bin/rsync -a \
            --verbose \
            --delete --delete-excluded \
            --human-readable --progress \
            --one-file-system \
            --partial \
            --modify-window=1 \
            --exclude-from=.backup_excludes \
            --link-dest ../current \
            /Users/tommybergeron/Desktop/Brainpad \
            $snapshot_dir/in-progress

        cd $snapshot_dir
        rm -rf $snapshot_id
        mv in-progress $snapshot_id
        rm -f current
        ln -s $snapshot_id $snapshot_dir/current

    Could someone help me out please? I've been searching for like two hours and I still am clueless. Thanks so much.
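
    One adjustment worth trying (an assumption about the cause: NTFS cannot store POSIX owners and permissions, so rsync's -a may consider every file changed and copy it again). Replacing -a with only the flags the destination can honor:

        # -a is shorthand for -rlptgoD; keep recursion, links, and times only
        /usr/bin/rsync -rltv --modify-window=1 \
            --link-dest ../current \
            /Users/tommybergeron/Desktop/Brainpad \
            /Volumes/USB_Storage/Backups/in-progress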

    Read the article

  • Setting up podcasting for a non-tech user

    - by Force Flow
    I have a user who wants to start making podcasts, but they only have basic skills when it comes to technology. So, I was trying to put together a process that would be easy for them to follow. To upload files (the MP3s and RSS feed files), I have an Explorer shortcut to their FTP space. To record the podcast, I was going to use either Audacity or PodProducer. For the RSS feed, I was looking for a podcast RSS generator of some sort. In my search for this, I've come across a lot of dead links and a lot of paid tools, so I haven't come up with anything too useful. Is there a free, reliable web service or Windows-based tool available that folks like to use?
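
    For reference, a minimal sketch of the RSS feed file such a generator would produce (all URLs and titles are placeholders; the enclosure element, with the MP3's byte length, is what podcast clients key on):

        <?xml version="1.0" encoding="UTF-8"?>
        <rss version="2.0">
          <channel>
            <title>Example Podcast</title>
            <link>http://example.com/podcast/</link>
            <description>A short description of the show.</description>
            <item>
              <title>Episode 1</title>
              <enclosure url="http://example.com/podcast/ep1.mp3"
                         length="12345678" type="audio/mpeg"/>
              <pubDate>Mon, 01 Oct 2012 10:00:00 GMT</pubDate>
            </item>
          </channel>
        </rss>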

    Read the article

  • how can i realize a video-wall on 3-9pc with vlc

    - by Luca
    Hello! I have to create a video wall of 3 to 9 monitors, and every monitor has a PC. Currently, I stream 9 movies from a server with different instances of VLC, but I could also play the relevant video on every machine with a single player; there's no problem there. The real problem is that I really don't know how to sync the videos over a LAN... unfortunately, the NETSYNC module inside VLC is NOT working. Here is some info about my setup: a video wall of 3 to 9 monitors || 3 to 9 PCs, all with the same configuration || a gigabit router+switch for the "dedicated" LAN. I'm really stuck in this situation; if anyone has an idea, or just a completely different solution, please share it with me! Thanks a lot in advance! :)

    Read the article
