Search Results

  • Moving large amounts of data between shared hosts

    - by Bryan M.
    I recently acquired a client who is a photographer and was interested in moving web hosts, since his current host had threatened to throw him off due to CPU spiking. The migration went fairly easily, with about 350 MB of website and media files. Then I discovered about 60 GB of client galleries he had failed to mention. I am unable to move this much data myself, since I'm capping out at about 20 kb/s on the FTP connection. Has anyone encountered a situation where they needed to migrate this much data between cheap hosting providers? Should we contact the hosting companies about this (he is moving from Westhost to MediaTemple)?
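
    One possible way around the slow local FTP link, assuming the new host allows running a script (via SSH or cron), is to pull the galleries host-to-host instead of routing them through a home connection. Below is a minimal sketch using Python's ftplib; the hostnames, credentials, and paths are placeholders, not details from the question.

        import os
        from ftplib import FTP, error_perm

        def mirror(ftp, remote_dir, local_dir):
            """Recursively copy remote_dir from the old host into local_dir on the new host."""
            os.makedirs(local_dir, exist_ok=True)
            ftp.cwd(remote_dir)
            for name in ftp.nlst():
                if name in (".", ".."):
                    continue
                remote_path = f"{remote_dir}/{name}"
                local_path = os.path.join(local_dir, name)
                try:
                    ftp.cwd(remote_path)                 # only succeeds for directories
                    mirror(ftp, remote_path, local_path)
                    ftp.cwd(remote_dir)                  # return before the next entry
                except error_perm:                       # not a directory: download it
                    with open(local_path, "wb") as f:
                        ftp.retrbinary(f"RETR {remote_path}", f.write)

        ftp = FTP("ftp.old-host.example")                # placeholder: the old shared host
        ftp.login("user", "password")                    # placeholder credentials
        mirror(ftp, "/public_html/galleries", "/home/newuser/public_html/galleries")
        ftp.quit()

    Run from the destination server, the transfer speed is limited by the hosts' own bandwidth rather than the 20 kb/s home link.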

  • Hacked by our own hosting company!

    - by dazhall
    OK, so our hosting company decided to clone our site and database onto a new server. Without our knowledge or permission they then edited our code to point to the new database. The old server was left running, still pointing at the original database. The DNS was changed to reflect the new IP address of the server. Obviously during propagation customers were hitting both the new and old servers, resulting in orders coming in to both databases, sometimes being split between the two. We're now attempting to reconcile the two databases. The question I have is: is it still hacking if it was done by your own hosting company?! I'm fairly sure they shouldn't have edited our code! If they had left it as it was, the site would have stayed pointed at the original database and we wouldn't be in this mess! I'm thinking that legal advice is needed, but I just wanted to know if anyone had ever come across this situation before?!

  • Opinions on .gr (Greek) registrars?

    - by Marc Bollinger
    None of the previous questions tackle the one-off (or more obscure) country registries beyond .co.uk, .it, et al., or else I'd have found an answer myself. I'm just looking for information for a vanity domain, so obviously I'm alright without an answer, but it's an unasked (or at least unanswered) question, and I'm not exactly in a hurry to give my credit card information across country lines, sight unseen.

  • Managing products on an ecommerce site [closed]

    - by John
    I've had a site that sells widgets for many years. I do not inventory my widgets, but the cost of adding them to the site and making sure the site is current is becoming prohibitive. Here are the facts: I sell a single class of widget. I have about 50,000 widgets on my site. I have about 100 vendors that create and drop-ship the products when they get an order from me via email. Each vendor carries from 50 to 5,000 types of widgets. Vendors all have websites with images and descriptions of their products. Each widget is produced in limited supply and usually sells out in 1-5 years. Prices of the widgets often go up, sometimes by more than 50%, before they sell out. My vendors aren't very tech-sophisticated. They have websites with their products, but most can't supply an API or database dump. Their websites usually display retail prices to the public, but I log in or refer to a price list (usually Excel) for wholesale prices. As it stands now, I hire local people to add and describe each widget on our website. It usually takes a person 4 minutes to add a widget to the site. This doesn't include moving to a new vendor. I feel like the upload/edit process is as good as it can get via a form/website. The problem is that it is getting very expensive to upload and keep the widget inventory current. I often get orders for something after it has sold out at the vendor, or the price is wrong. This seems like it would be a problem in many industries. Can anyone suggest the cheapest way to upload inventory and ensure prices are current from my vendors? I'm assuming it will involve outsourcing, but I would like ideas on how to set up the compensation model.
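
    Even without vendor APIs, part of the price and stock checking can be scripted. A minimal sketch, under the assumption that a vendor's Excel price list is exported to CSV and that both files share a SKU column; the file names and column names are made up for illustration.

        import csv

        def load_prices(path, sku_col, price_col):
            """Return a {sku: price} dict from a CSV export."""
            with open(path, newline="") as f:
                return {row[sku_col]: float(row[price_col]) for row in csv.DictReader(f)}

        site = load_prices("site_catalogue.csv", "sku", "price")          # current site data
        vendor = load_prices("vendor_pricelist.csv", "SKU", "Wholesale")  # vendor's exported sheet

        for sku, old_price in site.items():
            if sku not in vendor:
                print(f"{sku}: no longer on the vendor list -- probably sold out")
            elif abs(vendor[sku] - old_price) > 0.005:
                print(f"{sku}: wholesale price changed {old_price:.2f} -> {vendor[sku]:.2f}")

    A report like this at least tells the hired helpers which of the 50,000 listings actually need attention, instead of re-checking everything by hand.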

  • My blog which gets 300+ daily impressions has stopped appearing on the 1st page of Google

    - by Sangram
    I have had a blog about placement papers since December 2010. My monthly impressions are around 4,000. For the last two days, my blog has disappeared from Google's search engine result pages, and impressions have dropped drastically, as my stats reports show. My blog is still in the index, because when I search site:mydomain.com on Google I can see all my pages indexed there. But the pages which used to appear on the first or second page of Google do not appear any more. Example: if I search with the query "GE round 2 code writing test" on bing.com or Yahoo search, the first link on the result page is my blog, but if you do the same on Google, my URLs do not appear even in the first three result pages. I used to get a lot of visitors from these search queries.

  • Is it unwise to blacklist an IP address?

    - by hawbsl
    We have a form on a commercial website which has been abused (but only once or twice) by someone from a particular IP address. A colleague wants to blacklist that IP address from the website. Seems to me that's overkill, and that there's a risk that genuine customers sharing that same IP address would be blacklisted too. I suppose a big part of my question is how many people might be sharing that same IP address and could be affected by our blacklist. I suspect that's a "how long's a piece of string" question but some ballpark answer would be really helpful. We're in the UK if that's significant.
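
    If the block does go ahead, one way to limit collateral damage to genuine customers behind the same shared IP is to restrict only the abused form endpoint rather than the whole site. A minimal sketch using Flask; the route, the documentation-range IP address, and the block list itself are illustrative assumptions.

        from flask import Flask, abort, request

        app = Flask(__name__)
        BLOCKED_IPS = {"203.0.113.7"}   # placeholder address from the documentation range

        @app.before_request
        def limit_form_abuse():
            # Guard only the form submission, not regular browsing.
            if (request.path == "/contact" and request.method == "POST"
                    and request.remote_addr in BLOCKED_IPS):
                abort(403)

        @app.route("/contact", methods=["GET", "POST"])
        def contact():
            return "form goes here"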

  • What are the hard and fast rules for Cache Control?

    - by Metalshark
    Confession: the sites I maintain have different rules for Cache-Control, mostly based on the default configuration of the server followed up with recommendations from the Page Speed and YSlow Firefox plug-ins and the Network Resources view in Google's Speed Tracer. Cache-Control is set to private/public depending on what they say to do, ETag/Last-Modified headers are only tinkered with if YSlow suggests there is something wrong, and Vary: Accept-Encoding seems necessary when manually gzipping files for Amazon CloudFront. When reading through the material on the different options and what they do, there seems to be conflicting information, rules for broken proxies, and cargo-cult configurations. The official information provided by the analysis tools mentioned above is quite inaccessible, as it deals with each topic individually instead of as a unified strategy (so there is no cross-referencing of techniques). For example, it seems to make no sense that the speed analysis tools rate a site with ETags the same as a site without them if they are meant to help with caching. What are the hard and fast rules for a platform-agnostic Cache-Control strategy?
    EDIT: Jeff Atwood's article on caching explains it in superb depth. For the record, though, here are the hard and fast rules:
    - If the file is compressed using gzip, etc., use "Cache-Control: private", as a proxy may return the compressed version to a client that does not support it (the browser cache will still hold files marked this way). Also remember to include "Vary: Accept-Encoding" to say that it is compressible.
    - Use Last-Modified in conjunction with ETag - belt-and-braces usage provides both validators, and since ETag is based on file contents instead of modification time alone, using both covers all bases. NOTE: AOL's PageTest has a carte blanche approach against ETags for some reason.
    - If you are using Apache on more than one server to host the same content, remove the implicitly declared inode from ETags by excluding it from the FileETag directive (i.e. "FileETag MTime Size"), unless you are genuinely using the same live filesystem.
    - Use "Cache-Control: public" wherever you can - this means that proxy servers (and the browser cache) will return your content even if the rest of the page needs HTTP authentication, etc.
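
    As a concrete illustration of those rules, here is a small sketch of a response that sends the recommended header combination (Cache-Control, Vary: Accept-Encoding, Last-Modified plus a content-based ETag). It is a bare WSGI example with made-up values, not taken from any particular site.

        import hashlib
        from email.utils import formatdate
        from wsgiref.simple_server import make_server

        BODY = b"body { color: #333; }"                      # stand-in static asset
        LAST_MODIFIED = formatdate(1309146802, usegmt=True)  # any fixed mtime, RFC 1123 format
        ETAG = '"%s"' % hashlib.md5(BODY).hexdigest()        # validator based on content, not mtime

        def app(environ, start_response):
            start_response("200 OK", [
                ("Content-Type", "text/css"),
                ("Cache-Control", "public, max-age=604800"),  # one week, shared caches allowed
                ("Vary", "Accept-Encoding"),                  # compressed/uncompressed variants differ
                ("Last-Modified", LAST_MODIFIED),
                ("ETag", ETAG),
            ])
            return [BODY]

        if __name__ == "__main__":
            make_server("", 8000, app).serve_forever()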

  • When acquiring a domain name for product xyz, is it still important to buy .net and .org versions too?

    - by Borek
    I am buying a domain name for service xyz and obviously I have bought .com in the first place. In the past it was automatic to also buy the .net and .org versions. However, I've been asking myself, why would I do that? To serve customers who mistakenly enter a different TLD? (Would someone accidentally do that these days?) To avoid a chance that competition will acquire those TLDs and play some dirty game on my customers? If there is a good reason, or a few, to buy the .net and .org versions these days I'd like to see those listed. Thanks.

  • Using an old penalized domain for a new website

    - by MiladSafaei
    I had a website with two domains: firstdomain.com and first-domain.com. The main domain was first-domain.com and the other one was 301-redirected to the first one. The main domain got a Google Penguin penalty some months ago. I uploaded the site to a new domain and removed the old domain from Google's index by using the URL removal tool in Webmaster Tools. Now I want to use firstdomain.com (which was redirected to the penalized domain) for a new and fresh website with new and perfect content. Is it probable that the history of this domain will affect the new website and harm its ranking?

  • Repeating keywords in inbound links

    - by JJ_Jason
    Hi. I have a service similar to bit.ly. The link generation method is similar, but the site is not. A user uses my site just like the mentioned bit.ly, but I offer a different kind of service which I would want to rank for (on Google). If I were to generate links such as mysite.com/my-keywords/1Asdf34, would it be considered spammy or black hat? The same for bit.ly would be: bit.ly/url-shortening-services/3k1dS4sd. For bit.ly it would defeat the purpose, but URL length in my case does not have to be short.

  • Why doesn't Firefox cache my images and CSS

    - by Richard A
    I am using IIS7 and have already set up the following, but when I run Firefox it seems not to cache any of my images, even with "remember history" set.

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <system.webServer>
            <staticContent>
              <clientCache cacheControlCustom="public"
                           cacheControlMode="UseMaxAge"
                           cacheControlMaxAge="7.00:00:00" />
            </staticContent>
          </system.webServer>
        </configuration>

    However, when I use Firebug it still points to Firefox not caching images and CSS. Response headers:

        Cache-Control: public,max-age=604800
        Content-Type: text/css
        Content-Encoding: gzip
        Last-Modified: Mon, 27 Jun 2011 03:53:22 GMT
        Accept-Ranges: bytes
        Etag: "507968c27d34cc1:0"
        Vary: Accept-Encoding
        Server: Microsoft-IIS/7.5
        X-Powered-By: ASP.NET
        Date: Mon, 27 Jun 2011 13:06:41 GMT
        Content-Length: 5067

    Request headers:

        Host: www.xx.com
        User-Agent: Mozilla/5.0 (Windows NT 6.1; rv:2.0.1) Gecko/20100101 Firefox/4.0.1
        Accept: text/css,*/*;q=0.1
        Accept-Language: en-us,en;q=0.5
        Accept-Encoding: gzip, deflate
        Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive: 115
        Connection: keep-alive
        Referer: http://www.xx.com/
        Cookie: __utma=62996397.135679654.1309106351.1309159743.1309164158.8; __utmz=62996397.1309106351.1.1.utmcsr=(direct)|utmccn=(direct)|utmcmd=(none); __utmc=62996397

  • wordpress is truncating and replacing category name and description in version 2.7.1 [migrated]

    - by Jayapal Chandran
    Version: WordPress 2.7.1. I recently created a new category in a WordPress blog. I gave it a long category name, like "win32 api programming", and a description of "windows api programming and win23api programming". But after saving it, the name changed to "win32 api" instead of "win32 api programming", and the description changed to "Win32 api snippets". I don't want to upgrade my WordPress installation because I don't like some of the new features in the newer releases. What should I do to keep my actual strings (names) intact?

  • Transition to new site

    - by James Hill
    I'm almost finished rewriting the website for a non-profit organization. The existing site receives ~5,000 a month. The new site is being written in ASP.NET and the existing site is in PHP. The current hosting provider does not support .NET hosting, so I'll be switching providers. My question revolves around the transition from the old site to the new one. I would really like to get the new site up at the new hosting provider and do thorough testing before changing the DNS records for the domain. Question: how can I put the new site up, test it, and make any changes or additions necessary before updating the domain's DNS to point to the new IP, without Google indexing the content? Also, what SEO repercussions should I be aware of when making such a drastic change to the content that exists under the domain name?
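
    For the indexing half of the question, one common approach is to have the test copy send a noindex signal until the DNS switch, then remove it at launch. The real site is ASP.NET, where the equivalent would be a custom response header configured in IIS; the sketch below only illustrates the idea as a generic WSGI middleware, and every name in it is an assumption.

        class NoIndexMiddleware:
            """Adds "X-Robots-Tag: noindex, nofollow" to every response from the staging copy."""

            def __init__(self, app):
                self.app = app

            def __call__(self, environ, start_response):
                def staging_start_response(status, headers, exc_info=None):
                    headers = list(headers) + [("X-Robots-Tag", "noindex, nofollow")]
                    return start_response(status, headers, exc_info)
                return self.app(environ, staging_start_response)

        # Usage: wrap the application object only in the staging deployment,
        # e.g. application = NoIndexMiddleware(application), and drop the wrapper at launch.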

  • HTML background-size:cover with floating objects

    - by Mikhail
    I have a trivial page with body having an image background, with background-size:cover. I set html { height:100% } to fill up the entire page regardless of the content amount. Up to this point everything worked as expected. I've added a div and set position:absolute; right:0; width:200px; This, again, worked as expected, until I added content. When this div is populated so much that the contents take up more space than the height of the page, the scroll bar appears. Scrolling down reveals that the background image does not actually cover the entire page. This is due to the fact that my div is taller than 100% of the HTML height. How can I address this?

  • How to target just one search engine and optimise for that

    - by mickburkejnr
    I've been dabbling with SEO a lot in the last 6 months, and one thing that has surprised me is the disparity between Google and Bing in the way they deliver results. A website ranked for a specific keyword/phrase on Google may rank 3rd on the first page, but the same keyword/phrase on Bing will show the same website ranked 15th. I came up with the idea of increasing traffic to my website by targeting Bing instead of Google, for several reasons. The biggest one is that while it's not the biggest search provider, people still use it, and I feel that if other websites have been "neglected" and not optimised for Bing, my website would stand a better chance of getting near the top of its search rankings. The question, though, is how would I do this? A lot of the SEO advice on the internet is generic, but I can't help feeling it's Google-orientated for obvious reasons. How could I optimise my website to be Bing-friendly rather than Google-friendly? I know it sounds like suicide as I'm taking myself out of the Google mindset, but I feel it could work wonders for traffic to the site.

  • Difference between mail. and pop. & smtp.?

    - by Lea Hayes
    When hosting a website I often notice that all of the following are defined under DNS: POP = mail.example.com SMTP = mail.example.com versus POP = pop.example.com SMTP = smtp.example.com Is it wise to use "mail.example.com" for both POP and SMTP when configuring a mail client? What is the difference between each of the two approaches? It seems to work fine (sends and receives mail as expected).
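
    In practice the two naming schemes usually point at the same machine; the labels are just conventions the mail client uses for its POP and SMTP settings. A quick way to check, sketched here with placeholder hostnames:

        import socket

        for name in ("mail.example.com", "pop.example.com", "smtp.example.com"):
            try:
                print(name, "->", socket.gethostbyname(name))   # often all resolve to the same address
            except socket.gaierror:
                print(name, "-> does not resolve")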

  • check when a page was first seen by google

    - by sam
    Is there a way to see when a page was first published by checking when it was first cached by Google? (Obviously it's not 100% foolproof, because there is a couple of days' delay in some cases, but it would give you a good idea.) The only other way I could think of to check the published date is if the page or post had a publicly visible timestamp on it, but in the case I'm looking at, it doesn't have one.

  • Remove third/nth level domains from google Index

    - by drakythe
    Somehow Google has indexed some third- (and fourth-!) level domains that I had attached to my server temporarily, e.g. my.domain.root.com. I now have these redirected properly to where I would like them to go; however, with a carefully crafted search one can still find them, and I'd rather they not be exposed. My Google-fu has failed me in finding an answer, so I come to you wonderful folks: is there a way to remove sub-level domains from Google search results, and how do I do it? I have the site in Google Webmaster Tools and verified, but all the URL removal requests I can perform append the URL to the base URL rather than treating it as a subdomain prefix. And finally, how can I prevent this in the future?

  • Can other domain registrars view non-public whois information?

    - by user3188544
    If my domains are hosted at a registrar (let's take Gandi, for example) and it has privacy protection on the whois information, can another ICANN-accredited registrar (GoDaddy, for example) still view my actual information that is behind the privacy guard? I.e., I don't have a GoDaddy account, but since they are ICANN-accredited, could they access the real whois info without the privacy protection?

  • What is the replacement for the Web Intents HTML standard?

    - by Tom
    "Web Intents" were deprecated in Chrome 24 (November/2011) and are no longer supported in any browser: We also gathered a lot of valuable data and feedback from our experimental support for Web Intents and decided to disable the feature in today's Beta release. Is there an HTML5 standard that I can look into as an alternative to what Web Intents intended? I'm interested in how web services can be stitched together. For example, imagine a website that can import a image from any number of web-services, modify the image in some way, then push the file back to any number of other web-services, all via HTML5 standards.

  • Panda 4: Reducing #indexed pages. How much is enough?

    - by Noam
    I've been hit by Panda 4 (a 40% decrease). I didn't see any change during Panda 1-3. From what I've read, and when compared to my site, the change is probably due to the fact that I have over 30M pages indexed on Google, and they've started seeing that as some sort of bad indication. Although I feel all of the pages have a unique value that Google should crawl, it seems I should make some tough calls and reduce the indexed pages according to some prioritization I will conduct. The question is what my target should be, or what factors should help me figure out a relevant target. How many pages should I try to reduce to: 25M, 15M, 1M, 2,000? Is it enough to add noindex to low-priority pages, or should I also remove all internal linking to them?

  • Do PHP-FPM (and other PHP handlers) need execute permissions on the PHP files they're serving?

    - by Andrew Cheong
    I read in a post at Server Fault that PHP-FPM needs execute permissions. However, the answer in When creating a website, what permissions and directory structure? only grants read and write permissions to PHP-FPM. Maybe I don't quite understand how PHP handlers (or CGI in general) work, but the two claims seem contradictory to me. As I understand, when Apache / Nginx gets a request for foobar.php, it "passes" the file to an appropriate handler. That is, I imagine it's as if www-root (or apache or whomever the webserver's running as) were to run some command, /usr/sbin/php-fpm foobar.php Actually, no, that's naive, I just realized. PHP-FPM must be a running instance (if it's to be performant, and cache, etc.), so probably PHP-FPM is just being told, "Hey, quick, process this file for me!" In either case, I don't see why execute permissions are necessary. It's not like the webserver needs to literally execute the file, i.e. ./foobar.php Is the Server Fault answer simply mistaken?
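
    A hedged illustration of the distinction the question draws, using Python as a stand-in interpreter: an interpreter only needs read access to a script it processes, while the execute bit only matters for running the file directly (the ./foobar.php style). The paths below are throwaway examples.

        import os, stat, subprocess, sys, tempfile

        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
            f.write("print('processed without an execute bit')\n")
            path = f.name

        os.chmod(path, 0o644)                                   # rw for owner, read-only for others, no x bit
        print(oct(stat.S_IMODE(os.stat(path).st_mode)))         # 0o644
        subprocess.run([sys.executable, path], check=True)      # works: the interpreter just reads the file
        # subprocess.run([path]) would fail with PermissionError: direct execution needs +x
        os.unlink(path)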

  • Can 302 redirect on wordpress homepage affect SEO?

    - by user2402116
    I manage a website. In order to lead the visitor to the latest post when entering the website, I set a 302 redirect from the homepage URL to the latest post URL. There are one or two posts a month, so the redirect is updated manually each time a new post gets published. Despite registering the website with Google, it doesn't get indexed at all. After searching for a solution on blogs and forums, I understand this issue might be related to the 302 redirect. This is the first time I've used redirects on a website; can someone help me?

  • Custom ecommerce solution (similar to newegg.com) price [closed]

    - by Andrew
    I am wondering how much it would cost me to hire a team of developers who could build an improved clone of newegg.com's ecommerce solution, together with the back end. Please give comments based on your experience and realistic measures. Comments like "hire a team of freelancers from India - they will do it for $1,000" are not welcome. I am talking about using realistic, quality, proven developers. Maybe good advice on how to approach ecommerce development? Thank you.

  • CDN virtual subdomain causes duplicated content

    - by user3474818
    I have created a subdomain and a CNAME record which points to the domain root. The subdomain www.static.example.com is actually a copy of the entire website www.example.com, and it is supposed to act as a CDN and serve static content in order to improve speed. However, all of my content can be accessed via the subdomain as well, so Google has indexed it all and now I am dealing with duplicate content. How could I deny access to crawlers for the subdomain, bearing in mind that I do not have a different subfolder for the subdomain, so I can't create a separate robots.txt file?
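
    Because both hostnames share the same document root, one workaround is to serve a different robots.txt depending on the Host header, so the CDN hostname disallows crawling while the main site stays open. A minimal sketch of the idea as a tiny WSGI handler; the hostnames are placeholders patterned after the question, and in practice a rewrite rule in the web server would normally do this job.

        def robots_app(environ, start_response):
            """Serve a host-dependent /robots.txt for a site whose CDN alias shares the docroot."""
            host = environ.get("HTTP_HOST", "").lower()
            if host.startswith("www.static."):                    # the CDN alias, e.g. www.static.example.com
                body = b"User-agent: *\nDisallow: /\n"             # keep crawlers off the duplicate copy
            else:
                body = b"User-agent: *\nAllow: /\n"                # the normal site remains crawlable
            start_response("200 OK", [("Content-Type", "text/plain"),
                                      ("Content-Length", str(len(body)))])
            return [body]

    Adding rel="canonical" tags pointing at the www.example.com URLs is a common complement, so copies that are already indexed get consolidated back to the main hostname.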
