Search Results

Search found 14789 results on 592 pages for 'pro backup'.


  • What do I do if a user uploads child pornography?

    - by Tom Marthenal
    If my website allows uploading images (which are not moderated), what action do I take if a user uploads child pornography? I already make it easy to report images, and have never had this problem before, but am wondering what the appropriate response is. My initial thought is to:
    1. Immediately delete (not just make inaccessible) the image.
    2. File a report with the National Center for Missing and Exploited Children with all the information I have on the user (IP, URL, user-agent, etc.), identifying myself as the website operator and providing contact information.
    3. Check any other images uploaded by that IP/user and prevent them from uploading in the future (truly preventing this is impossible, but I can at least block their account).
    This seems like a responsible way to report, but does it satisfy all of my legal and moral responsibilities? Would it be better not to delete the image and just to make it inaccessible, so that it can be sent to the National Center for Missing and Exploited Children, the police, the FBI, etc.?

    Read the article

  • Website restyle, SEO migration plan?

    - by Goboozo
    I am currently working on a project for one of my biggest clients. We have built a website that will replace the old website. The actual content is largely the same; however, the presentation of the content has changed drastically, and from our point of view it is much more user-friendly (the main reason to update the site). Since the site's presentation has changed, we have some major changes in:
    - HTML & CSS: to change the presentation of the content
    - URLs: to make them easier to understand (301 redirects have been taken care of and are in place)
    - Breadcrumbs: to enhance navigation (we have made the breadcrumbs match the URLs exactly)
    - Pagination: added to enable content browsing
    - Title tags: descriptive title tags added to the major links and buttons
    Basically all user content, including meta tags, has remained the same. Since this company is rather successful and 90% of its clients come from Google's organic results, I am obliged to take all necessary precautions. People tell me I need a migration plan to prevent the site from being hurt in Google, but I have never worked with such a plan. So, based on the above: would you consider a migration plan necessary, and what precautions/actions would you recommend to prevent us from being pushed down in our SERP positions? Many thanks in advance for your answers.

    Read the article

  • How to fix "[Errno 13] Permission denied" in Mailman mailing lists

    - by Michael
    After migrating domains from one plesk server onto another, I got several of those mails every day (the target mailbox does not exist, so I get those as undeliverable mail bounces):

        Return-Path: <[email protected]>
        Received: (qmail 26460 invoked by uid 38); 26 May 2012 12:00:02 +0200
        Date: 26 May 2012 12:00:02 +0200
        Message-ID: <20120526100002.xyzxx.qmail@lvpsxxx-xx-xx-xx.dedicated.hosteurope.de>
        From: [email protected] (Cron Daemon)
        To: [email protected]
        Subject: Cron <list@lvpsxxx-xx-xx-xx> [ -x /usr/lib/mailman/cron/senddigests ] && /usr/lib/mailman/cron/senddigests
        Content-Type: text/plain; charset=ANSI_X3.4-1968
        X-Cron-Env: <SHELL=/bin/sh>
        X-Cron-Env: <HOME=/var/list>
        X-Cron-Env: <PATH=/usr/bin:/bin>
        X-Cron-Env: <LOGNAME=list>

        List: xyzxyz: problem processing /var/lib/mailman/lists/xyzxyz/digest.mbox:
        [Errno 13] Permission denied: '/var/lib/mailman/archives/private/xyzxyz'

    I tried to fix the permissions myself, but the problem still exists.
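    A hedged starting point, assuming the standard GNU Mailman 2.x layout that the cron path above suggests (verify the paths and the list group name on your Plesk install first), is to let Mailman's own permission checker repair the tree:

        # report permission problems in the Mailman tree; -f fixes what it finds
        /usr/lib/mailman/bin/check_perms -f
        # if the private archive is still unreadable, hand it back to Mailman's group
        # ("list" on Debian/Ubuntu packages; confirm the group on your system)
        chown -R list:list /var/lib/mailman/archives/private
        chmod -R g+rwX /var/lib/mailman/archives/private

    check_perms may need to be run repeatedly until it reports no problems.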

    Read the article

  • My blog not even ranking for exact title match [on hold]

    - by Akshay Hallur
    I have original, detailed blog posts related to blogging and SEO. This domain had been dropped (expired) twice before my acquisition; I am the third owner and have held the domain for 143 days. Blog posts are not ranking even for exact titles: Google+ or LinkedIn shares show up instead of my content, and some blog posts are not even indexed. I am hardly getting around 7 organic visits a day.
    Example 1: http://www.infoflame.com/offer-pdf-of-blog-posts-for-likes-and-shares/ (title: "Offer Readers PDF of Blog Posts for Their Likes and Shares") is not indexed at all.
    Example 2: http://www.infoflame.com/anchor-text-for-seo/ is indexed but not coming up for the exact title.
    My suspect is the dropped domain, though it was less likely used for spam: the Wayback Machine shows only 3 captures since 2004 across the 2 drops, and I don't know whether there was email spam. There are no manual actions in Webmaster Tools, so no reconsideration request is possible. What's the reason for this? Should I wait? How can I tell Google that ownership has changed and the domain is now spam-free? Or should I de-index it and start a new blog? Thank you for any advice.

    Read the article

  • is RapidSSL wildcard cert supported by major browsers?

    - by Jorre
    I'm thinking of buying a wildcard SSL certificate from clickSSL: http://www.clickssl.com/rapidssl/rapidsslwildcard.aspx. That would be a RapidSSL certificate, and I was looking in my Firefox options to see if RapidSSL is in the list of recognised authorities. My certificate manager doesn't mention RapidSSL anywhere. Am I looking for the wrong name, i.e. is RapidSSL recognised by browsers under a different name? I want to be sure that this certificate works in all major browsers (including IE6).

    Read the article

  • Why do some bad websites rank well?

    - by BradB
    Consider the following scenario: you are pitching SEO/website optimisation to a prospective client. You explain the importance of great copy and content, how acquiring links (ethically) can increase PageRank, why the quality of the HTML build matters (H1/H2 tags, W3C validation, etc.), and why keyword research is beneficial; you may drop in a few Google Webmaster Guidelines or Matt Cutts references to back up your claims, and for good measure you rubbish the "black hat" approach as no longer effective. Your advice is ethical and, in the eyes of best practices, spot on. Then the client points out some of their long-established competitors on Google, and you see these competitor websites ranking in the top spots (1 to 3) for medium to highly competitive search phrases that your client wants to compete for. These websites totally contradict your ethical approach and pretty much violate every best practice previously noted. They even outperform other "white hat" competitors who comply with the above guidelines. I experienced this today. One of these well-ranking websites had:
    - About six microsites with more or less the same copy and a slightly varied layout
    - Little or no textual content; I would almost say duplicate content across the sites, but there was so little of it that it could barely qualify as duplicate
    - All the content in Flash (with a music track that kicked in on each page load; not so much an SEO issue, but it helps paint the picture)
    - Keyword stuffing behind the Flash file: a bunch of black text on a black background in the style of "keyword 1 keyword 2, keyword 1, keyword 2, keyword 2 keyword 3" and so on
    - The exact same keyword-stuffed combination present on every page of the website
    - A bunch of clearly self-made links from poor-quality forums and directories with little or no PageRank
    - Links exchanged across the microsites
    How do you explain your way out of this when this hard evidence sits in front of you, undermining your great pitch?

    Read the article

  • google custom search gives different result number for same query

    - by santiagozky
    We are using Google Custom Search and we have found that the totalResults value often alternates between two values, even for the same query; sometimes the two values are close, sometimes one is more than double the other. The parameters I am using look like this:

        https://www.googleapis.com/customsearch/v1?
            q=something
            cx=XXXXXXXXXX
            lr=lang_en
            siteSearch=www.mydomain.com
            start=1
            fields=context%2Citems%28fileFormat%2CformattedUrl%2Clink%2Cpagemap%2Csnippet%2Ctitle%29%2Cqueries%2CsearchInformation%28searchTime%2CtotalResults%29%2Cspelling%2FcorrectedQuery
            key=YYYYYYYYYYYYYYY
            filter=0

    This is a problem when calculating the number of result pages. How can I get the same result count for the same query?

    Read the article

  • Geotargeted subfolder questions (Portugal/Brazil and Switzerland)

    - by Lucy
    We are at the beginning of the process of creating multilingual versions of a website. We will be using subfolders off the core domain (e.g. mydomain.com/fr/), setting the geotargeting in Webmaster Tools, and setting the hreflang attribute. I would really appreciate your help with a couple of questions.
    1. Portuguese: we will have a Portuguese-language version of the site. Our intention is to use this to cover users in both Portugal AND Brazil, i.e. we are not going to create separate folders mydomain.com/pt/ and mydomain.com/br/. Can I use two hreflang attributes for this language version to tell Google it covers Brazil AND Portugal? What country code should I use for this subfolder?
    2. Switzerland: does anyone have best-practice advice on how to do this? On one hand the subfolder should be mydomain.com/ch/, but Switzerland covers two language possibilities (French AND German), so what to do?
    thanks
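    Since hreflang accepts a bare language code with no country, one hedged way to annotate a single Portuguese folder for both markets is sketched below, using the question's placeholder domain and an invented /ch/de/ and /ch/fr/ layout for the Swiss case:

        <link rel="alternate" hreflang="pt" href="http://mydomain.com/pt/" />
        <link rel="alternate" hreflang="pt-PT" href="http://mydomain.com/pt/" />
        <link rel="alternate" hreflang="pt-BR" href="http://mydomain.com/pt/" />
        <!-- Switzerland has no single language code: annotate each language version -->
        <link rel="alternate" hreflang="de-CH" href="http://mydomain.com/ch/de/" />
        <link rel="alternate" hreflang="fr-CH" href="http://mydomain.com/ch/fr/" />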

    Read the article

  • Using subdomains or directories for main categories?

    - by Matthieu
    I have a website which references places in the world to travel to. Those places are (of course) grouped by country. Here is an example of an actual URL on my website: http://awespot.org/country/105/iceland. I am wondering if it would be better, in terms of SEO, to have the countries separated into subdomains: http://iceland.awespot.org/. I know subdomains are considered by Google as different websites, so I am weighing the two options: separating would also mean splitting the PageRank and the benefit of links to the website, but separating would enable me to create a web of related websites (all related to travel) that link to and benefit each other. I am only asking about SEO here; I know there are other questions raised by this possibility (user experience, administration, even password completion by web browsers).

    Read the article

  • Webmaster Tools, www and no-www, duplicate content and subdomains

    - by Jay
    I have not come to any conclusive answers after many hours of research on many websites into the specific issue I am trying to figure out. My company has two websites: a main one at www.example.com, and our self-hosted blog at subdomain.example.com, which is a subdomain of the first. Google treats the www version and the bare ("naked" from now on) version of a domain as two different sites, which I completely understand. It is also advised that both be set up in Google Webmaster Tools, which I have done (correct me if I am wrong on that). Now, a preferred domain can be set in Webmaster Tools only at the root-domain level; the subdomain cannot have one, and the setting actually says "Restricted to root level domains only". So it appears that the subdomain should follow what the root domain says, and our preferred setting is to display www.example.com rather than the naked version. That is one issue I have: one site displays one way and the other displays another. Do we have the wrong redirects in place for the subdomain? Another question: does this setup have any effect on SEO with regard to duplicate content?
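    If the intent is that every URL resolves to a single canonical host, a minimal sketch (assuming Apache with mod_rewrite, and using the www host as preferred, per the Webmaster Tools setting described above) is a 301 in the root site's .htaccess:

        RewriteEngine On
        # send naked-domain requests to the www host, keeping path and query string
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

    The subdomain is unaffected by this rule, which matches the fact that the preferred-domain setting applies only at the root level.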

    Read the article

  • Symbolic link not allowed or link target not accessible: /var/www on Ubuntu 11.04

    - by Jamie Hutber
    I am getting a 403 when I access http://mayfieldafc.local/. Looking in the Apache logs, I see:

        [Wed Nov 16 12:32:59 2011] [error] [client 127.0.0.1] Symbolic link not allowed or link target not accessible: /var/www

    I have what I believe to be the correct permissions set on /var/www: hutber (my user) can create and delete files, and can also execute as program on this folder. In Mayfield's vhost it is:

        <Directory /var/www/mayfieldafc/docroot>
            Options +FollowSymLinks
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>

    I am pulling my hair out not being able to work on my sites with my work Ubuntu install, and I know of nothing else that could be affecting this. Any ideas?
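    One thing worth checking, offered as a sketch rather than a confirmed diagnosis: Apache evaluates FollowSymLinks per directory, and this 403 is reported against /var/www itself, so the option may need to be allowed one level above the docroot (Apache 2.2 syntax, matching the vhost above):

        <Directory /var/www>
            Options +FollowSymLinks
            Order allow,deny
            Allow from all
        </Directory>

    The www-data user also needs execute (traversal) permission on every directory along the link target's path, not just on /var/www.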

    Read the article

  • Firefox stalls on rendering when chrome doesn't

    - by amccormack
    I have a webpage that loads quickly 100% of the time in Chrome, but only 10% or so of the time in Firefox. Looking at the Fiddler capture, Firefox loads only 2 of the roughly 100 files being requested before it hangs. The error does not seem to be on the server or network side, however, because Chrome never encounters a problem. How do I find the root of this stall? I suspect Firefox's JavaScript execution is causing the hang, but are there any particular methods to narrow down the search for the bad code?

    Read the article

  • Highly SEO optimised forum posts

    - by Tom Gullen
    Given the following forum post:

        Basics of how internals of Construct work
        I've used GameMaker in the past. And I know some C++ and have used a few 3d engines with it. I have also looked at Unity, though I didn't get too much into it. So I know my way around programming etc... My question is, how does Construct work internally? I know it allows Python scripting, which itself is "technically" interpreted, though Python is pretty fast as far as interpreted languages go. But what about the rest? Is the executable that gets cre...

    the forum software will take the first 150 chars of the first post as the page meta description, and the title will be the thread title. All OK. So in Google it will appear as:

        Basics of how internals of Construct work
        I've used GameMaker in the past. And I know some C++ and have used a few 3d engines with it. I have also looked at Unity, though I didn't get too much...
        http://www.domain.com/forum/basics-of-how-internals-of-construct-work.html

    Now the problem (not so much with this thread, but with other ones) is that the first 150 chars don't always make the best meta description. Is it worth my time to cherry-pick threads and manually set their description/title tags so they read like:

        Internal workings of Construct 2
        Events aren't converted to any other language. The runtime is a standalone compiled EXE application, which is optimised and actually very fast. Your events...
        http://www.domain.com/forum/basics-of-how-internals-of-construct-work.html

    The H1 on the page is still the original title, but we have overridden the title and description to look more friendly in search results. Is this advantageous, ignoring the obvious time cost?

    Read the article

  • Different robots.txt for two different domains point to same folder

    - by Ali
    I have the following two domains: domain.com and test.domain.com. Both point to the same folder, public_html. What I want is a different robots.txt file for each domain, so that when someone browses domain.com/robots.txt one file is shown, and when someone goes to test.domain.com/robots.txt a different file is shown. How can I do this using URL rewriting in .htaccess? Thanks
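    A hedged sketch of one way to do this with mod_rewrite in public_html/.htaccess, assuming a second file named robots_test.txt (a name invented here) sits next to the regular robots.txt:

        RewriteEngine On
        # serve the alternate file when robots.txt is requested on the test subdomain
        RewriteCond %{HTTP_HOST} ^test\.domain\.com$ [NC]
        RewriteRule ^robots\.txt$ robots_test.txt [L]

    Requests for domain.com/robots.txt fall through and receive the normal robots.txt.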

    Read the article

  • How to set up an SPF record?

    - by MeltingDog
    The clients on my VPS are all getting spammed. The spam seemingly comes from their own email addresses; it is clear that something got into my VPS, captured all the email addresses that exist on it, and is now using them to send spam. I was advised to set up an SPF record, but I am unsure what this is or how to go about it. After reading up, I have figured out how to create one in cPanel, but I cannot find what to do with it now. Do I copy it somewhere into my DNS records in Zone Management? Can anyone point me in the right direction?
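    For reference, an SPF policy is just a TXT record in the domain's DNS zone, so the value cPanel generates does belong in Zone Management. A minimal sketch of such a record (the IP below is a documentation-range placeholder, not a recommendation):

        example.com.    14400    IN    TXT    "v=spf1 a mx ip4:203.0.113.10 -all"

    This says mail for the domain may originate from its A record, its MX hosts, and 203.0.113.10, and that everything else should fail. Note that SPF lets receiving servers reject forged senders; it does not by itself stop the forgery attempts.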

    Read the article

  • Mod Rewrite - URL rewriting

    - by modrewriteNewbie
    I am very new to mod_rewrite. I need to redirect any user with a "citizenhawk" parameter in their URL to my URL. For example,

        http://www.mywebsite.com/?sc=CX12N003&cm_mmc=affiliate--citizenhawk--nooffer-_-na&prfc=5&clickid=0004c845fa9a87050a4277221a003262

    should result in

        http://www.mywebsite.com/

    Here are my rewrite conditions:

        RewriteCond %{QUERY_STRING} (&|^)cm_mmc=(.)citizenhawk(.)(&|$)$
        RewriteRule ^/rrs/ [NC,R=302,L]

    Where am I going wrong? Is my RewriteCond wrong?
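    As written, the RewriteRule has no substitution URL ([NC,R=302,L] ends up being parsed as the substitution), and in a per-directory .htaccess the matched path never starts with /, so ^/rrs/ cannot match. A hedged sketch of the intended redirect, assuming it lives in the site root's .htaccess:

        RewriteEngine On
        # match cm_mmc=...citizenhawk... anywhere in the query string
        RewriteCond %{QUERY_STRING} (^|&)cm_mmc=[^&]*citizenhawk [NC]
        # redirect to the bare homepage; the trailing ? drops the query string
        RewriteRule ^.*$ http://www.mywebsite.com/? [R=302,L]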

    Read the article

  • Why does Google Search Engine reject my title tag's change?

    - by Michal P.
    I made a simple webpage, http://pundaquitboat.michaelspages.com/, giving it the title tag "Boat – Pundaquit", and I submitted it to the Google bot via Google Webmaster Tools. Then I decided to change the title of the same page to "Anawangin trip" and submitted the page again in the same way. The result was that the new title of my webpage coexisted with the old title of the same webpage in the SERPs for maybe 2 days. After that the new title was rejected, and if I enter site:pundaquitboat.michaelspages.com/ I can see that Google has the old copy of my webpage with the old title in its database. This problem doesn't occur in Bing, where I enjoy a high position for the "Anawangin trip" phrase. (In Bing I hadn't submitted the old version of the title.)

    Read the article

  • Google Analytics - Showing multiple site stats at once

    - by John
    Is there a way in Google Analytics to add multiple sites and show all their stats together, so the graphs and total visits/unique hits are combined for all the sites added to the Google Analytics account? For example, if I have site1.com, site2.com, and site3.com under one Google Analytics account, is there a way in the Google Analytics tool to merge them together so I can see the sum of all traffic in one report?

    Read the article

  • Increasing traffic for music blog

    - by Wladimir Ivanov
    I own a music blog with some articles and, of course, embedded YouTube iframes and MP3-listening plugins. I get traffic from Google and from one not-very-popular site (I don't mention it because you might think it's spam) where I post links with pictures to my posts. Any ideas for getting more traffic for this kind of blog? I know about Myspace, blog directories, video-sharing sites, same-niche blogs, and relevant forums. Anything I'm missing? Thanks in advance

    Read the article

  • Tracking Unique site Views for 2012 - Not my website

    - by user580950
    I am in trouble. Early in 2012 I placed an advert on a website whose owner said it got 950,000 unique visits each month. The advertising didn't work out, so I checked back 2-3 months later and saw that the site's unique visitors at that time were 8,000. I immediately closed the account. I don't remember which site I used to check the unique visitors. That advertising company has now filed a dispute against me. So, is there any tool that will give me a website's traffic statistics for 2012? I tried Google Trends, but it doesn't show such statistics.

    Read the article

  • Stop Apache serving filetypes

    - by ProfSmiles
    Preferably using .htaccess files (though .conf files are an option), is there any way to stop Apache from serving certain filetypes? For example, .db files shouldn't be served for obvious reasons (privacy and whatnot), so could I make them return a 404 but still have them available to my CGI scripts? Putting these sensitive files in a directory other than public_html is also an option, though I like having them in the same directory as the scripts for ease of use. Cheers
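    Since the requirement is a 404 (a plain deny would give a 403) while the files stay readable by scripts on disk, a minimal sketch using mod_rewrite in .htaccess:

        RewriteEngine On
        # answer HTTP requests for .db files with 404 Not Found;
        # CGI scripts still open the files through the filesystem, unaffected
        RewriteRule \.db$ - [R=404,L]

    If a 403 is acceptable instead, a <FilesMatch "\.db$"> block with a deny rule achieves the same hiding with less machinery.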

    Read the article

  • Is it better to have AWS EC2 and RDS in the same Availability Zone?

    - by Dan
    I run a web app on an AWS EC2 instance, with the app's database on an RDS instance, both in the AWS region us-east-1. However, one of them is in Availability Zone us-east-1a and the other is in us-east-1d. Am I getting all the speed benefits of having both instances in the same "data center" (the region) even though they are in different Availability Zones, or can I optimize by moving them into the same Availability Zone?

    Read the article

  • What tools to use for efficient link building?

    - by Evgeny
    As most SEO experts keep saying, it is not just the content you have but also a hefty number of quality incoming links to your content that is important -- these are the two ways to get to the top of the search results. The question is: where do I find the incoming links? One way I know is Google Blog Search; it can be used to find blogs with information related to your content, and some allow you to leave comments. The comments usually consist of your name, e-mail, and website. If you put your keyword instead of your name, the keyword turns into a link to your website. Unfortunately most blogs put rel="nofollow" on such links, but some blogs don't. What other ways are there to find quality pages on which to place keyword links back to your website? A quality link usually means one that:
    - is located on a page with relevant content
    - does not have rel="nofollow" in the <a> tag
    - has a relevant keyword as the anchor text, as in <a href="website">keyword</a>
    - sits on a page with high PageRank (3+) and TrustRank

    Read the article

  • How to remove HTML code from search result page content

    - by Jack Torris
    I have a music website with 46 album pages, and each page has a different player and files. I entered one of the album URLs into a search engine and found that Google is displaying player code in the search result content. For example, enter this URL in Google and check the results: each result displays an .mp3 file in the content section. I see this:

        This page contains a demo of and documentation for the new jPlayer Playlist add-on, ... mp3:"http://www.jplayer.org/audio/mp3/Miaow-01-Tempered-song.mp3", ...

    I don't want Google to show the player code and MP3 files in the search results. How can I hide the audio files and player code from the search engine? What would be the best solution for it?

    Read the article

  • Strange Google Analytics result when new site launched

    - by Howard
    I have a website which mainly contains a few pages, and we have now launched a revamped site which contains several hundred pages. We are seeing strange Google Analytics results, as follows:

        Before: Traffic sources (all traffic): 674 | Content (all pages, unique PV): 674
        After:  Traffic sources (all traffic): 291 | Content (all pages, unique PV): 1235

    As you can see, the unique pageviews have increased as expected (we have more pages now and the site is better), but why are the traffic sources lower, with such a large gap? Any ideas?

    Read the article
