Search Results

Search found 25629 results on 1026 pages for 'site maintenance'.


  • Off-site AND incremental backup

    - by Pyrolistical
    I already back up my main computer to my server computer using SyncToy, but now I also want off-site backup. My idea so far:

      - keep the source hard drive (call it S) at home
      - keep a backup hard drive (call it B) at work
      - use a transport hard drive (call it T)
      - connect T at work and record an index of the files on B
      - take T home, compare that index against S, note new/changed/deleted files, and copy the changed files to T
      - take T back to work and update B
      - repeat

    It's basically a sneakernet, with all the advantages that brings: enormous bandwidth, albeit high latency. Is there some software to do this, or do I have to write it myself?
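
    One way to avoid writing it yourself: rsync already implements the record-index-and-copy-differences step. A minimal sketch of one round trip, assuming a Unix-like environment (cwRsync or similar on Windows) and made-up mount points /mnt/S, /mnt/B and /mnt/T:

      # At work: make T's mirror match B (the first run copies everything)
      rsync -a --delete /mnt/B/ /mnt/T/mirror/

      # At home: update the mirror to match S; since the mirror equals B,
      # only the files that differ between S and B are actually written
      rsync -a --delete /mnt/S/ /mnt/T/mirror/

      # Back at work: replay those changes onto B
      rsync -a --delete /mnt/T/mirror/ /mnt/B/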

    Read the article

  • Monit: send alert to Nagios NSCA

    - by mYzk
    I want to monitor a website with Monit's check host function and, if the site is down, send an alert to the Nagios NSCA daemon so it forwards the result to Nagios, and Nagios marks the host status accordingly (OK or host down, for example). The problem is how to make Monit's alert function send the info to NSCA. I am not sure that this will work, but what I came up with is: set alert exec 'echo -e "nagios nsca format" | /usr/local/nagios/bin/send_nsca -H serveraddress -c /usr/local/nagios/etc/send_nsca.cfg' Would this work, and is it the best solution, or can it be done some other way?
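
    For what it's worth, Monit cannot pipe an alert directly, but it can run a script when a test fails or recovers. A sketch, assuming made-up names (webhost, nagios.example.com) and that Nagios already has a matching passive host check configured:

      check host webhost with address www.example.com
          if failed port 80 protocol http
              then exec "/usr/local/bin/nsca_down.sh"
          else if succeeded
              then exec "/usr/local/bin/nsca_up.sh"

    where the script feeds send_nsca the passive host-check line format (hostname, return code, output, tab-separated; 0 = UP, 1 = DOWN):

      #!/bin/sh
      # nsca_down.sh - submit a passive host check result to Nagios via NSCA
      printf 'webhost\t1\tmonit: http check failed\n' \
          | /usr/local/nagios/bin/send_nsca -H nagios.example.com \
                -c /usr/local/nagios/etc/send_nsca.cfg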

    Read the article

  • Elevate the weight of browsing history in Google Chrome's autocomplete

    - by maayank
    Google Chrome auto-completes web addresses as you type them in the address bar. Alas, it gives absurdly more weight to Google's own auto-suggest than to my own browsing history, which seems a bit foolish: if I regularly (say, twice a week) check a certain website with the keywords "foo bar ponies" in its URL, it is reasonable to expect that I will want to visit that site again and not other sites. While this is a bit subjective, at the very least I would expect such URLs to appear in the list Chrome suggests, even if not at the top. Is there some plugin/secret option that alters the default behavior?

    Read the article

  • Retrieve malicious IP addresses from Apache logs and block them with iptables

    - by Gabriel Talavera
    I'm trying to keep away attackers who try to exploit XSS vulnerabilities on my website. I have found that most of the malicious attempts start with a classic "alert(document.cookie);\" probe. The site is not vulnerable to XSS, but I want to block the offending IP addresses before they find a real vulnerability, and also keep the logs clean. My first thought is to have a script constantly check the Apache logs for IP addresses sending that probe and feed those addresses to an iptables drop rule, with something like this: cat /var/log/httpd/-access_log | grep "alert(document.cookie);" | awk '{print $1}' | uniq What would be an effective way to send the output of that command to iptables? Thanks in advance for any input!
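
    A minimal sketch of one way to wire that pipeline into iptables (note that uniq only collapses adjacent duplicates, so sort first; the log path is taken from the question as-is):

      #!/bin/sh
      # block-xss-probes.sh - drop IPs that sent the XSS probe (sketch)
      LOG=/var/log/httpd/-access_log

      grep 'alert(document.cookie);' "$LOG" \
        | awk '{print $1}' \
        | sort -u \
        | while read -r ip; do
            # -C checks for an existing rule (iptables >= 1.4.11),
            # so re-running the script doesn't add duplicates
            iptables -C INPUT -s "$ip" -j DROP 2>/dev/null \
              || iptables -A INPUT -s "$ip" -j DROP
          done

    That said, fail2ban with a custom filter is the more standard tool for exactly this watch-the-log-and-ban job.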

    Read the article

  • Pick Up BioShock and Bioshock 2 for Price of a Big Mac Meal

    - by Jason Fitzpatrick
    Pre-ordering just opened on the third installment of the highly acclaimed survival-horror game series BioShock, BioShock Infinite. As part of the pre-order promotions, you can pick up a bundled copy of BioShock and BioShock 2 for a song. For the unfamiliar, BioShock is an atmospheric first-person shooter backed by an incredible storyline, set in the underwater utopian-turned-dystopian city of Rapture. BioShock 2 continues the story in Rapture, and the upcoming release (February 2013) of BioShock Infinite takes place in the same game universe but fifty years before the events of the first two installments. If that seems like the kind of game you could dig into, Amazon has the Windows-platform versions of BioShock and BioShock 2 bundled together for a scant $7.49, 81% off the Steam and general retail price. The best part about the promotion is that you can either download the games from Amazon or, for those of you who use Steam, simply plug the game product key into Steam. You can read more about both the original games and the upcoming release at the official BioShock site. BioShock Dual Pack [via Geeks Are Sexy]

    Read the article

  • Tools to manage large network of heterogeneous web applications?

    - by Andrew
    I recently started a new job where I've been tasked with managing a global network of heterogeneous web applications. There's very little documentation. My first order of business is to create an inventory of all of the web applications. Are there any tools out there to manage a large group of web apps? I'd like to collect a large dataset for each website, including:

      - logins for web-based control panels
      - logins to FTP/SSH accounts
      - the Google Analytics tracking code for each site
      - third-party libraries used
      - SSL certs, issuers, and expiration dates
      - etc.

    I know I could keep the information in Excel or build a custom database, but I'm hoping there's already a tool out there to help me with this.
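
    Should it come down to the build-your-own option, even a tiny schema beats a spreadsheet once you want expiry queries. A sketch using sqlite3, with all table and column names invented for illustration (and credentials best kept in a proper secret store, not in the table):

      -- webapp_inventory.sql
      -- load with: sqlite3 webapp_inventory.db < webapp_inventory.sql
      CREATE TABLE IF NOT EXISTS sites (
          id            INTEGER PRIMARY KEY,
          url           TEXT NOT NULL,
          panel_login   TEXT,   -- control panel account name
          ftp_ssh_login TEXT,   -- FTP/SSH account name
          ga_code       TEXT,   -- Google Analytics tracking ID
          libraries     TEXT,   -- third-party libs, comma-separated
          ssl_issuer    TEXT,
          ssl_expires   DATE
      );
      -- example query: certs expiring within 30 days
      SELECT url, ssl_expires FROM sites
      WHERE ssl_expires <= date('now', '+30 days');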

    Read the article

  • Step 2 of instructions is not clear to me

    - by Albert Frye
    I want to make a bootable USB stick. I run the Universal USB Installer (UUI) and follow the instructions on this site: http://www.ubuntu.com/download/desktop/create-a-usb-stick-on-windows

    Step 1 says: Select "Ubuntu Desktop Edition" from the dropdown list. Okay, but the actual title of the dropdown box is: Select a "Linux Distribution" from the dropdown to put on your USB. I am pretty new to computers (67 years old, live alone, bought my first computer 3 months ago), so I will have to assume that when the instructions say "Ubuntu Desktop Edition", that means the same thing as "Linux Distribution". Okay, no big leap there. So far, so good. I pick the very first selection: Ubuntu 13.10 Desktop i386. I'm not sure why there are so many choices, but I'm guessing I'm pretty safe with the first one. It's for a Toshiba Satellite laptop, 64-bit Windows 7.

    Now for step 2. The instructions say: Click 'Browse' and open the downloaded ISO file. The message in the window just before the "Browse" button says: Browse to your ubuntu-13.10*desktop*i386.iso. Okay, so where's that file? I click "Browse" and start looking for it. It is nowhere to be found. So where the heck is it?

    Read the article

  • Windows XP, USB-Stick and multiple Partitions

    - by Bobby
    Hello. I've got a USB stick with multiple partitions on it, and it seems that my Windows XP can only mount the first partition of the stick. If I try to mount the second one using the volume manager, it tells me that I need to make it active and reboot. Is it really that limited, or am I just missing something here? Partitions:

      - FAT32: System Rescue CD, bootable and active
      - FAT32: some tools
      - ext2: some data (I know that I need extra drivers etc., but that's not what's being asked here)

    Edit (Solution): Thanks to the answer about the RMB (Removable Media Bit), I was able to dig up a solution described at this site (section: "On flash drive only the first partition works"). Basically, there is a Hitachi driver available that filters out the RMB at driver level; it just needs to be slightly modified to work with basically any USB stick. All you need to do is add your stick's "Device Instance ID" to the driver and then use that driver.

    Read the article

  • .htaccess deny from all does not work?

    - by jeffery_the_wind
    I am running Apache 2.2.20 on an Ubuntu 11.04 web server. I have a Joomla site running on it, but I have also added some custom content. In the main web directory I have added a folder /images/sub_folder, and in this sub_folder I have put a bunch of pictures. I do not want anyone to be able to access these pictures directly from the web, so I made a .htaccess file in that sub_folder and put just the following line in it: deny from all There doesn't seem to be any effect; I can still access the images directly from a web browser. I have restarted the Apache service. What am I doing wrong? Thanks Tim
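
    The usual culprit for an ignored .htaccess is an AllowOverride None covering that directory in the main configuration. A sketch of the two options, in the question's Apache 2.2 syntax and assuming a document root of /var/www (paths are guesses):

      # in e.g. /etc/apache2/sites-available/default:
      <Directory /var/www/images/sub_folder>
          # .htaccess access directives are only honored with this override
          AllowOverride Limit
      </Directory>

      # ...or skip .htaccess entirely and deny in the main config:
      <Directory /var/www/images/sub_folder>
          Order deny,allow
          Deny from all
      </Directory>

    followed by apache2ctl configtest and a reload.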

    Read the article

  • Firefox location bar search: odd behaviour with Comodo firewall

    - by JNat
    I recently installed Comodo Firewall, and it changed the behaviour of my Firefox location bar. I used to type something there and it would either take me to a Google search page or take me straight to the site I wanted: if I typed 'imdb cloud atlas' it would send me to IMDb's page on Cloud Atlas, and the same would happen if I typed 'wiki' or 'wikipedia', taking me to Wikipedia's page on it. Now, it redirects me here. I checked this page and everything matches what they describe, yet it doesn't act the way it should. And there are no Comodo add-ons or extensions installed in Firefox. Can anyone help me?

    Read the article

  • Not enough storage is available to process this command

    - by Mohit
    I am getting this error on almost all operations on a Windows 7 Pro 32-bit machine. By operations I mean anything I do: update a repo from Subversion, access a local IIS site, copy a big folder, run an installer. Sometimes, if I try again, it resolves itself. I think there is something wrong with Windows 7. I searched around and found posts suggesting I increase the IRPStackSize value in the registry; I did that, no luck. I am using Microsoft Security Essentials version 1.0.1961.0 as my antivirus package. Once this error starts popping up, I have to restart, and then after some random amount of time it starts showing up again. Any help is appreciated; I am losing a lot of time restarting my system or retrying again and again.
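
    For reference, the IRPStackSize value those posts refer to lives under the LanmanServer service. A sketch as a .reg file; 0x20 (32) is an arbitrary example value, documented accepted values go up to 50, and a reboot is required either way:

      Windows Registry Editor Version 5.00

      [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters]
      ; IRP stack size as a DWORD
      "IRPStackSize"=dword:00000020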

    Read the article

  • What kind of hosting do I need?

    - by Robert Smith
    I migrated this question from Server Fault; hopefully this is the appropriate place. I have been trying to answer this question but I haven't found a specific answer for my situation, and as I want to pay only for what I need, I thought I could get a good answer here. I have a custom-made forum (rather than a built-in forum like the ones you can find in plugins, e.g. WP-Forum, or phpBB-type software) written in Django. I don't want to use Apache and mod_wsgi because that combination is usually very memory-hungry and I can't afford a big server; I prefer a combination of nginx and gunicorn, which I think is very efficient (maybe you can also tell me what you think about that). I'm expecting to receive 10,000 to 20,000 visits each month with 15,000 to 30,000 page impressions. I have reviewed some cloud services like Amazon EC2 or Rackspace and some more traditional services (Linode). This site won't use videos or big images, and I certainly don't need a huge amount of bandwidth (200GB would definitely be too much). I need shell access, so shared hosting is out of the question. What do I need to run a website like that without problems? What about RAM: would 256MB be enough (that's the amount offered by small instances on Amazon and Rackspace)? Do you know of any alternatives to those I mentioned? If you need more information to provide a useful answer, please don't hesitate to ask. By the way, I was told that Linode is not all that different from Amazon EC2, but this website is supposed to work 24/7, so I can't take advantage of Linode's flexibility in creating and deleting instances. Thanks in advance.
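
    For scale, the nginx + gunicorn pairing described here is just a reverse proxy in front of a few worker processes, which is why it fits on small instances. A minimal sketch with invented names (myforum, example.com), assuming a Django project with a wsgi module:

      # a couple of workers is plenty at this traffic level;
      # each Django worker typically weighs tens of megabytes
      gunicorn myforum.wsgi:application --workers 2 --bind 127.0.0.1:8000

    and the matching nginx server block:

      server {
          listen 80;
          server_name example.com;

          # serve static files directly, never through Django
          location /static/ { root /srv/myforum; }

          location / {
              proxy_pass http://127.0.0.1:8000;
              proxy_set_header Host $host;
              proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
          }
      }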

    Read the article

  • Domain Controller Placement

    - by Matt
    I've been working through some Exchange training documentation (the official MS e-learning package), and all of the design scenarios allude to placing at least one DC from your forest root domain in every site. I'm not sure whether this relates only to Exchange, but I can think of a number of issues we experience in our forest that would be resolved by it. For example, a Microsoft support engineer has stated that EVERY client in a child domain (i.e. all workstations and member servers) needs access to a forest root DC to check certificate/template permissions, even on a subordinate CA. I have attempted to locate documentation or guidelines from Microsoft on this, but have not been able to find anything. I found the Domain Controller Placement guide, but it's only a form: you would use it to document where you will place your DCs, but it doesn't give any guidance on where you should deploy them. Does anyone know where, or if, I can find any such documentation?

    Read the article

  • How does CloudFront work?

    - by Dharmik Bhandari
    I'm planning to implement Amazon's CDN (Content Delivery Network), known as CloudFront, in ASP.NET MVC3 with C#. I've googled about it but am a little confused about a few things, mentioned below:

      1. Is it compulsory to upload all static resources to the CDN network first before we can use them, or can Amazon itself fetch a site's static resources from a predefined folder or directory?
      2. Does Amazon automatically update its copies when we change anything in the static resources, or do we have to upload the updated resources to the CDN network every time?

    Read the article

  • Troubleshooting a slow database server with no load

    - by user1721724
    I'm getting ready to soft-launch my website and I've run into some problems with what I think is my MySQL database running on Fedora. All websites run fine, just as I'd expect, but any page that establishes a database connection hangs until the connection is established, and then, bang, the site loads as it should. For example, my landing page (http://www.thrusong.com) doesn't make a database connection and loads quickly; user profile pages (http://www.thrusong.com/john) make a database connection and load slowly, even though most of the data comes from memcached and the database currently has no load on it. This problem came up yesterday when my router died and I began using my Pace 2Wire modem with built-in router. Before, my old router was set to handle everything. My ISP says the settings in the modem are correct. Any ideas? Thanks in advance.
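
    One assumption worth testing, since the hang appeared right after the router swap: MySQL does a reverse-DNS lookup on every new client connection, and a modem/router that answers DNS slowly produces exactly this connect-slowly-then-load-fast pattern. A sketch of the usual workaround (standard Fedora my.cnf location assumed):

      # /etc/my.cnf
      [mysqld]
      # skip the reverse-DNS lookup on each new connection;
      # note: GRANTs keyed on hostnames stop matching, use IPs instead
      skip-name-resolve

    followed by a mysqld restart.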

    Read the article

  • 301 redirect: Is this good or bad for 2 domains?

    - by Tim
    Since I couldn't find an appropriate answer to my specific question, I wanted to ask you. I've read a lot about 301 redirects for moving pages and so on. A customer of mine registered a new domain last year for better search results (he included his main keyword in the domain; before, he had only a domain with his business name, which said nothing about what he does). I told him he should do a 301 redirect so he wouldn't lose his position in Google, and so that customers coming from the old domain would be redirected to the new one. After about a year in which his site had a good amount of traffic, his Google results for his keywords started getting worse. Since he didn't maintain his website (no new content, bad content on all pages, and so on), I assumed this was the problem. He handed his website to another company that also builds websites. They told him this 301 redirect is very bad for his website. They removed it, and also updated his content and the template, so now he has the same meta keywords on every page (instead of the specific ones I had put there before). They also removed the canonical tag which I had placed to avoid duplicate content. What I am now afraid of is that without the redirect, Google will find duplicate content and kick him out of the index, which would be a nightmare, since most of his customers come via his website. I need verification that the 301 isn't bad but is in fact the correct way of working with two domains, if possible with good sources I can point him to, since he doesn't want to hear anything about this. If someone also has a few words about the keywords and the canonical tag, I would really appreciate it! Thank you very much!
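
    For reference, the domain-level 301 itself is only a few lines of mod_rewrite in the old domain's .htaccess. A sketch with placeholder domains (old-name.example, keyword-domain.example):

      # .htaccess on the old domain: permanent redirect, path preserved
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^(www\.)?old-name\.example$ [NC]
      RewriteRule ^(.*)$ http://keyword-domain.example/$1 [R=301,L]

    As a citable source, Google's own Webmaster documentation on moving a site recommends exactly this: permanent 301 redirects from the old URLs to the new ones.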

    Read the article

  • How to suppress "Not collecting exported resources without storeconfigs"?

    - by Andy Shinn
    I'm getting the following in my Puppet master syslog over and over:

      Sep 27 11:52:05 puppet1 puppet-master: Not collecting exported resources without storeconfigs
      Sep 27 11:52:06 puppet1 puppet-master: Not collecting exported resources without storeconfigs
      Sep 27 11:52:06 puppet1 puppet-master: Not collecting exported resources without storeconfigs

    I'm not actually using storeconfigs:

      [ashinn@puppet1 ~]$ cat /etc/puppet/puppet.conf
      [agent]
      server = puppet.mydomain.com
      environment = production
      report = true

      [main]
      logdir = /var/log/puppet
      vardir = /var/lib/puppet
      ssldir = /var/lib/puppet/ssl
      rundir = /var/run/puppet
      factpath = $vardir/lib/facter
      pluginsync = true
      certname = puppet1.mydomain.com

      [master]
      modulepath = $confdir/environments/$environment/modules
      manifest = $confdir/environments/$environment/manifests/site.pp
      templatedir = $confdir/templates
      autosign = $confdir/autosign.conf
      ssl_client_header = SSL_CLIENT_S_DN
      ssl_client_verify_header = SSL_CLIENT_VERIFY
      report = true
      reports = hipchat

    Is there any way I can suppress these messages? Where do they actually come from?
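
    The message is normally emitted when a manifest contains a collector for exported resources while storeconfigs is off. A sketch of the syntax to grep for (resource names invented for illustration):

      # exporting (note the double @)...
      @@nagios_host { $::fqdn: address => $::ipaddress }

      # ...and collecting, which is the part that needs storeconfigs:
      Nagios_host <<| |>>

    so something like grep -r '<<|' over the manifests and modules should locate the offender.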

    Read the article

  • Blogger homepage won't update!

    - by Sims Siniron
    I am new to blogging and Webmaster Tools. When I add a new post to my blog, my homepage usually gets updated automatically as well. But since last January 14th, my homepage hasn't been refreshed in the Google SERPs, and as a result I am losing my visibility there. Previously, when I posted a new article, 70-80% of them would reach the first page of results; since the problem occurred, none of them reach the top 15 pages of Google results :( On 1/12/12, Google sent me a "Notice of DMCA removal from Google Search" message indicating one of my URLs contained some infringing content, which I deleted after receiving the notice. Not only that, I also checked all of my posts for any additional infringing content. After removing it, I filled out Google's Content Removed Notification form to notify them, and Google sent me feedback that they had received it, suggesting: "In the future, if you have removed the allegedly infringing content from your site (and won't put it back), please use the correct form", which I also filled out. Now my question is: is everything alright with what I did? And although my new posts are indexed in the Google SERPs with "..", why won't Googlebot update my homepage, which previously was updated automatically whenever a new article was published?

    Read the article

  • Linux web server shared hosting file errors

    - by dfilkovi
    I'm using shared hosting for my website and have problems with files from time to time. First, one of my (PHP) files was missing a piece of code (nothing to do with hackers; just a random piece of code was missing). Then, after some time, a value inside a MySQL table was also missing a part; then a whole table column disappeared; after that, a whole file on my site disappeared; and lastly, again, some code disappeared from a file. My hosting service says it has nothing to do with them, but this is absurd: how can this happen? No hacker attack could do such a thing. I believe it's some kind of disk corruption or a bad backup. Does anyone have any ideas?

    Read the article

  • About C# objects and the possibilities it has

    - by user527825
    As a novice programmer, I always wonder about C#'s capabilities. I know it is still early for me to judge, but all I want to know is: can C# do complex stuff, or anything outside the Windows OS? I think C# is a proprietary language (I don't know if I said that right), meaning you can't use it outside Visual Studio or Windows. Also, you can't create your own controls (called objects, right?); you are forced to use the ones available in the toolbox and their properties and methods. Can C# be used with the OpenGL or DirectX APIs? Finally, it always bothers me when I start doing things in Visual Studio. I know it sounds arrogant to say, but sometimes I feel that I don't like being forced to use something, even if it's helpful; I feel (do I have the right to feel this?) that I want to do everything by myself. Don't laugh, I just feel that this would give me a better understanding. Is Visual C# like using MaxScript inside 3ds Max, in that C# is exclusively for Windows, forms, and Windows-related components, while MaxScript is only for 3D editing and manipulation inside that software? If the fourth question is too difficult for a beginner, I hope you don't answer it, as I don't have much motivation and I want to keep the little I have. Note: Sorry for my English; I am self-taught and have never used the language with native speakers, so expect some errors. I have a lot of questions about many things; what do you think is a reasonable daily number of questions that would not bother the admins and members of this site? Thank you for your time.

    Read the article

  • Is it okay to showcase templates/layouts recreated with different technologies in a portfolio?

    - by Souta
    I have several different templates/layouts, both simple and complex, and I have recreated each of them multiple times using different technologies. (Say a complex one was originally made with only HTML and CSS; I recreated it using HTML, JavaScript, and CSS, then again with an HTML and PHP concoction, and so on.) I wanted to showcase my work and skills by doing this, but I don't know if it would be appropriate for a resumé/portfolio, for two reasons. Freelancing: does a potential client really care how their site is made, as long as it looks and functions to their liking? (That is, should I show only one example of each template/layout, and not the multiple recreations?) Potential hire: if a potential employer were to stumble across my resumé/portfolio, would the multiple recreations do any good for my career outlook? (That is, if this employer is a company where I would be working on a team to create and develop sites rather than freelancing, would a lack of skill-shining turn them away because I didn't set myself apart from every other budding web designer?) Those two issues have me wondering whether it is okay to have a combined resumé/portfolio for this purpose, or whether it doesn't matter to potential clients (as a freelancer) because they only care that the result looks and functions to their liking, making it fine to showcase the recreations alongside the originals.

    Read the article

  • What are the risks in putting website files in the "root" folder of a shared web hosting server?

    - by Obay Ouano
    A site I've been asked to manage is hosted (shared) on GoDaddy, with this folder structure:

      /
        public_html
        public_ftp
        mail
        stats
        logs
        etc...

    However, the website files are stored in the / folder, NOT in public_html. I'm not sure if this is how GoDaddy sets up their customers' accounts, or if the old web developer accidentally changed it from public_html to the root. But when we call GoDaddy and ask them to correct this (move the files to public_html), they won't change it, and they insist there is no security risk unless someone gets hold of the FTP password. Is this true? (I have always read that website files should be inside public_html.) If not, where could this setting be changed? The .htaccess is empty.

    Read the article

  • Web Filter For Multiple Networks

    - by Rob
    I have been using a Barracuda Web Filter 310 on our network and I have just had enough of it. It does not support trunking, and we have several networks with users that need to be web-filtered. (I guess if everyone just did their flippin' jobs I would not have this issue, but management wants me to get it resolved.) Does anyone know the top five web filters, better than the Barracuda, that support network trunking, so that I can have multiple domains and subnets going through one box? Thanks in advance; everyone on this site is gold in my book!

    Read the article

  • cPanel: every URL is being redirected to http://:2083

    - by Frank
    On my cPanel server, I restored about 50 accounts from a crashed cPanel server. All of the sites were working fine, but suddenly, without changing anything, every site started to get redirected to the URL "http://:2083/". There is nothing in the logs, no errors. When I do a wget, it says:

      $ wget grinfeld.com.br
      --2012-09-04 13:18:23--  http://grinfeld.com.br/
      Resolving grinfeld.com.br... 198.101.221.254
      Connecting to grinfeld.com.br|198.101.221.254|:80... connected.
      HTTP request sent, awaiting response... 301 Moved
      Location: https://:2083/ [following]
      https://:2083/: Invalid host name.

    Read the article

  • Globacom and mCentric Deploy BDA and NoSQL Database to analyze network traffic 40x faster

    - by Jean-Pierre Dijcks
    In a fast-evolving market, speed is of the essence. mCentric and Globacom leveraged Oracle Big Data Appliance and Oracle NoSQL Database to save over 35,000 call-processing minutes daily and analyze network traffic 40x faster. Here are some highlights from the profile:

    Why Oracle: “Oracle Big Data Appliance works well for very large amounts of structured and unstructured data. It is the most agile events-storage system for our collect-it-now and analyze-it-later set of business requirements. Moreover, choosing a prebuilt solution drastically reduced implementation time. We got the big data benefits without needing to assemble and tune a custom-built system, and without the hidden costs required to maintain a large number of servers in our data center. A single support license covers both the hardware and the integrated software, and we have one central point of contact for support,” said Sanjib Roy, CTO, Globacom.

    Implementation Process: It took Oracle partner mCentric only five days to deploy Oracle Big Data Appliance and perform the software install and configuration, certification, and resiliency testing. The entire process, from site planning to phase-I go-live, was executed in just over ten weeks, well ahead of the four months allocated to the project. mCentric leveraged Oracle Advanced Customer Support Services’ implementation methodology to ensure configurations were tailored for peak performance, all patches were applied, and software and communications were consistently tested using proven methodologies and best practices.

    Read the entire profile here.

    Read the article
