Search Results

Search found 5380 results on 216 pages for 'webmasters'.

Page 96 of 216

  • Google SMTP relay sending limits

    - by Gavin
    I'm considering using Google Apps for email with my company domain and for sending emails to customers from my website using SMTP. On Google's website it says the following:

        Limits for registered Google Apps users: A registered Google Apps user cannot relay messages to more than 2,000 recipients per day.

        Limits per domain: Per-domain sending limits are determined by the number of users in your Google Apps account. There are two per-domain limits:
        - The maximum number of recipients allowed per domain per day is approximately 130 times the number of users in your Google Apps account.
        - The maximum number of recipients allowed per domain in a 10-minute window is approximately 9 times the number of users in your Google Apps account.
        Additionally, the maximum number of recipients allowed per domain per day for accounts not yet paid for during the first month of service is 100.

    If I'm a single user with a single domain, does that mean I can only email 130 people a day using SMTP? That limit seems low.
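
    A quick back-of-the-envelope reading of those numbers for a one-user account, as a sketch using only the multipliers quoted above:

        # Illustration only: plugging users = 1 into the quoted per-domain multipliers.
        users = 1
        per_user_relay_per_day = 2000          # per-user relay limit from the quote
        per_domain_per_day     = 130 * users   # ~130 recipients/day for a single-user domain
        per_domain_per_10_min  = 9 * users     # ~9 recipients per 10-minute window

        effective_daily_cap = min(per_user_relay_per_day, per_domain_per_day)
        print(effective_daily_cap)  # 130 -- the domain cap, not the 2,000 relay cap, is what binds

    So on that reading, a single-user account is indeed limited to roughly 130 external recipients per day over SMTP relay.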

    Read the article

  • How do I remove a LOT of indexed pages from Google?

    - by Thierry
    A few weeks ago we discovered that Google has indexed some information we would rather keep confidential, in the form of individual PDF files. Our assumption was that this was a problem with our robots.txt that we had overlooked. Even though we are not sure whether that is the case, we are certain that the robots.txt file is in a valid format and, according to Google's Webmaster Tools, is blocking the files. However, even though that adjustment was made weeks ago, Google still has the PDF files indexed, although it does say that further information cannot be provided because of the robots.txt file. As you can hopefully understand, this is unwanted behaviour given the nature of the documents. I am aware that Google provides a removal request page for this purpose, but there are a lot of files. Is there an easier way to get Google to remove all of the files from its index? If not, is there anything else you could advise us to do besides manually requesting that Google remove every single page? Thanks in advance.
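
    One widely documented route, sketched here under the assumption of an Apache server with mod_headers enabled: stop disallowing the PDFs in robots.txt (a robots.txt block only stops crawling, so already-indexed URLs stay listed) and instead send a noindex directive in an X-Robots-Tag response header, which Google can only see once it is allowed to fetch the files again:

        # .htaccess sketch -- adjust the pattern to where the PDFs actually live
        <FilesMatch "\.pdf$">
            Header set X-Robots-Tag "noindex, noarchive"
        </FilesMatch>

    The documents should then drop out as Google recrawls them, leaving the manual URL removal tool for anything urgent.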

    Read the article

  • Issues with web hosting at home

    - by hari
    I want to host a small personal website at home. One basic problem I am hitting: from inside my home network I cannot access the site by its domain name; I have to use the local IP (something like 192.168.1.4) of the desktop that is hosting the website. Because of this, I also have issues setting up a simple WordPress blog on it. How do I get past this issue? Edit: when I try to access www.example.com (my domain) from within my home network, I get redirected to my router's login page. PS: 1) I am using the DynDNS service to map my non-static IP to my domain name. 2) My port forwarding works fine.
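
    Being sent to the router's login page from inside the LAN is the classic symptom of a router that does not support NAT loopback (hairpin NAT). A minimal workaround sketch, assuming the LAN machines' hosts files can be edited: map the public name straight to the LAN address so the router is bypassed, and WordPress then sees the same site URL from both inside and outside:

        # /etc/hosts on Linux/macOS, C:\Windows\System32\drivers\etc\hosts on Windows
        # (example.com stands in for the real domain, as in the question)
        192.168.1.4    example.com www.example.com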

    Read the article

  • How can I convince IE to honor my explicit instructions to make a table column X pixels wide? [migrated]

    - by AnthonyWJones
    Please consider this small but complete chunk of HTML:

        <!DOCTYPE html>
        <html>
        <head>
          <title>Test</title>
          <style type="text/css">
            span { overflow:hidden; white-space:nowrap; }
            td   { overflow:hidden; text-overflow:ellipsis; }
          </style>
        </head>
        <body>
          <table cellspacing="0">
            <tbody>
              <tr>
                <td nowrap="nowrap" style="max-width:30px; width:30px; white-space:nowrap;"><span>column 1</span></td>
                <td nowrap="nowrap" style="max-width:30px; width:30px; white-space:nowrap;"><span>column 2</span></td>
                <td nowrap="nowrap" style="max-width:30px; width:30px; white-space:nowrap;"><span>column 3</span></td>
              </tr>
            </tbody>
          </table>
        </body>
        </html>

    If you render the above in Chrome you'll see the effect I'm looking for. However, render it in IE8 or 9 and the width and/or max-width is ignored. So my question is: how do I get IE to simply let me specify the width of a cell explicitly? BTW, I've tried various combinations of table-layout:fixed and colgroup with cols and all sorts; nothing I've tried convinces IE to do what I'm clearly and explicitly asking it to do. If I had any hair before starting this I wouldn't have any left by now.
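
    For what it's worth, a sketch of the combination that is usually needed before IE8/9 will honor explicit column widths, offered tentatively since the question says variations of table-layout:fixed were already tried: fixed table layout, an explicit width on the table itself, and per-column widths declared on col elements rather than as max-width on the cells:

        <style type="text/css">
          table { table-layout:fixed; width:90px; border-collapse:collapse; }
          td    { overflow:hidden; text-overflow:ellipsis; white-space:nowrap; }
        </style>
        <table cellspacing="0">
          <colgroup>
            <col style="width:30px"><col style="width:30px"><col style="width:30px">
          </colgroup>
          <tbody>
            <tr><td>column 1</td><td>column 2</td><td>column 3</td></tr>
          </tbody>
        </table>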

    Read the article

  • How can I prevent Google mistakenly offering to translate a page?

    - by DisgruntledGoat
    Several of my site's pages are appearing in search results with [Translate this page] next to them. When I click that link it takes me to Google Translate and translates my page "from Catalan to English". The pages are in English but contain a couple of foreign words (actually Japanese romanisations, not Catalan) that appear to be tripping Google up. A few weeks ago I set the html tag to <html lang="en">, which from my research appears to be the best method of specifying the language of a document. Google has cached the pages with this attribute but is still offering to translate them. More research led me to a "notranslate" class which prevents translation entirely: <html lang="en" class="notranslate">. The problem now is that users cannot translate from English into their desired language! Are there any other solutions that force Google to parse my site as English only?
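
    A narrower sketch of the same idea: keep the page-level language declaration, drop the page-wide notranslate, and mark only the Japanese romanisations themselves as untranslatable (translate="no" is the standard HTML attribute, class="notranslate" the Google Translate equivalent; the word shown is just a placeholder). The rest of the page then remains translatable for users:

        <!-- keep <html lang="en"> as-is; only the foreign word is flagged (placeholder word shown) -->
        <p>
          The dish is usually listed as
          <span translate="no" class="notranslate">tonkatsu</span>
          on the menu.
        </p>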

    Read the article

  • Suggested ways of collecting thousands of links to mainstream media (MSM) articles

    - by Matt
    I'm currently running a modified WordPress site that is designed simply to publish links to other sites, similar to The Drudge Report. Right now I have a few dozen Google Alerts set up; I go through each result manually and, if it matches the few niche keywords I'm working with, I add a link to the article to my site. I do the manual checking because Google Alerts sometimes finds links to sites that belong to service providers, organizations, or products, whereas all I want are mainstream news articles. So my question: is there a more efficient - and ideally automated - way to perform this kind of carefully filtered search and aggregate the resulting links?
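
    A rough sketch of one automated pipeline, assuming the third-party feedparser package and Google News RSS search feeds (the query and keyword list below are placeholders): it pulls recent results for a query and keeps only items whose titles match the niche terms, which mirrors the manual filtering step:

        # pip install feedparser
        import feedparser

        FEED = "https://news.google.com/rss/search?q=%22offshore+wind%22"  # placeholder query
        KEYWORDS = ("offshore wind", "turbine")                            # placeholder niche terms

        def matching_links(feed_url, keywords):
            """Return (title, link) pairs whose title mentions one of the keywords."""
            parsed = feedparser.parse(feed_url)
            hits = []
            for entry in parsed.entries:
                title = entry.get("title", "")
                if any(k.lower() in title.lower() for k in keywords):
                    hits.append((title, entry.get("link", "")))
            return hits

        for title, link in matching_links(FEED, KEYWORDS):
            print(title, "->", link)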

    Read the article

  • How does one block unsupported web browsers?

    - by Sn3akyP3t3
    Web browsers that have reached end of life no longer receive security updates, which not only makes them vulnerable for the end user, but I imagine is not safe for the servers that receive visits from them either. Is it practical to block such browsers, or to notify the end user that their browser is unsafe and unsupported? If so, how would one achieve that? I don't know of any official or crowd-sourced listing of that information that could be parsed and kept up to date. I'm aware that this can be custom-built with User-Agent parsing and feature detection for HTML5-capable browsers.
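
    In the absence of a maintained list, the User-Agent approach can at least be sketched at the web-server level. A hedged example assuming Apache with mod_rewrite (the browser pattern and notice page are placeholders, and User-Agent strings can of course be spoofed):

        # .htaccess sketch: send very old Internet Explorer versions to a static notice page
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} "MSIE [2-6]\." [NC]
        RewriteCond %{REQUEST_URI} !^/unsupported\.html$
        RewriteRule ^ /unsupported.html [R=302,L]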

    Read the article

  • Sharing unique links on social media vs SEO

    - by MJWadmin
    We're currently implementing a voucher system on our site which will allow our users to obtain a 25+% discount on certain products, provided they donate 10% of the purchase price to charity. We will offer the ability to share the discounts via social media, in return for larger discounts for the sharer for each person who clicks through the link and buys an item. I understand that social links have SEO benefits, but this appears to be based on lots of people sharing the same link. If our voucher users share a unique link, i.e. http://ourdomain.com/sipsfesdf, rather than a fixed link like http://ourdomain.com/product-name, will we still receive the same benefits? Should we instead share something like http://ourdomain.com/product-name/sipsfesdf? Thanks in advance.
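
    One common way to have both - unique tracking URLs for sharers and a single URL that accumulates whatever link signals exist - is to put a canonical tag on each unique share URL pointing at the product page. A minimal sketch using the URLs from the question:

        <!-- served on http://ourdomain.com/sipsfesdf (and every other unique share URL) -->
        <link rel="canonical" href="http://ourdomain.com/product-name">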

    Read the article

  • Dreamweaver CS5 Test server works but cannot connect to host server through files window

    - by Toni
    I've been managing this site for a long time and update coupons on it approximately every 60 days. For some reason, I'm now having problems: I opened DW CS5 today and made the changes necessary to update the coupons. I was able to connect to the host server with no problem, but most of my coupon images were not showing up. DW tells me I have 70 broken links, which can't be the case because I've reviewed them; some links work and are the same as the "broken" ones apart from the file name. Unable to figure it out, I thought maybe restarting my Mac would help. However, upon logging back into DW, I am now unable to connect to the host server at all. I get an FTP error saying the file doesn't exist or there is a permissions problem. The funny thing is, I can connect successfully if I test the connection through the Site Management window. I have also connected to my host server through FileZilla and can see all the files there; unfortunately, I still can't get the web pages to display the coupons. Has anyone else had this issue and, if so, what is the solution? I feel like this is probably a simple fix, but I cannot for the life of me determine what it is! If anyone knows a solution, I'd really appreciate the help! -Toni

    Read the article

  • How to handle URLs with diacritic characters

    - by user359650
    I am wondering how to handle URLs that correspond to strings containing diacritics (á, ü, å, ...). I believe what we mostly see are URLs where diacritic characters have been converted to their closest ASCII equivalent, for instance Rånades på Skyttis i Ö-vik converted to ranades-pa-skyttis-i-o-vik. However, depending on the language, such a conversion may be incorrect. For instance in German, ü should be converted to ue and not just u, as seen in the URL below, which represents the string Bayern München as bayern-muenchen: http://www.bundesliga.de/en/liga/clubs/fc-bayern-muenchen/index.php However, I have also noticed that browsers can render non-ASCII characters when they are percent-encoded in the URL, which is the approach Wikipedia has chosen, for instance http://de.wikipedia.org/wiki/FC_Bayern_M%C3%BCnchen, which the browser renders with the ü visible in the address bar. Therefore I'm considering the following approach for creating URL slugs:
        (1) convert strings to ASCII, replacing non-ASCII characters with their recommended ASCII representation: Bayern München - bayern-muenchen
        (2) also convert strings to a percent-encoded form: Bayern München - bayern-m%C3%BCnchen
        (3) create a 301 redirect from version (1) to version (2)
    Version (1) URLs could be used for marketing purposes (e.g. mywebsite.com/bayern-muenchen), but the URLs that would end up being displayed in the browser bar would be version (2) URLs (e.g. mywebsite.com/bayern-münchen). Can you foresee particular problems with this approach? (Wikipedia is not doing it and I wonder why, apart from the fact that they don't need to market their URLs.)
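
    A minimal sketch of both slug variants using only the Python standard library (the per-language transliteration table is an assumption and would need to be far more complete in practice):

        from urllib.parse import quote

        GERMAN_MAP = {"ä": "ae", "ö": "oe", "ü": "ue", "ß": "ss"}  # assumed per-language table

        def ascii_slug(text, mapping):
            """Version (1): closest-ASCII slug, language-aware where the table says so."""
            out = []
            for ch in text.lower():
                if ch in mapping:
                    out.append(mapping[ch])
                elif ch.isascii() and ch.isalnum():
                    out.append(ch)
                else:
                    out.append("-")
            return "-".join(part for part in "".join(out).split("-") if part)

        def encoded_slug(text):
            """Version (2): keep the non-ASCII letters and percent-encode them."""
            return quote(text.lower().replace(" ", "-"))

        print(ascii_slug("Bayern München", GERMAN_MAP))  # bayern-muenchen
        print(encoded_slug("Bayern München"))            # bayern-m%C3%BCnchen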

    Read the article

  • Hosting multiple client websites on a single hosting account

    - by Bhavesh Gangani
    I'm a web designer and currently have only a few clients I build websites for. I have an unlimited hosting account and I want to host their websites in my account without a reseller account (it is not really needed). My clients' only requirement is FTP access to their personal directory. Is it also possible to give them separate phpMyAdmin access with this setup? As far as I know, this is done by pointing an "addon" domain at a directory of my hosting account in cPanel - am I right? Or is there another solution for it besides a reseller account?

    Read the article

  • Keep search engine from indexing specific content on your site

    - by Jimmy Chopps
    I've got a pretty weird scenario that I was hoping someone could help me out with. I recently created a blog and noticed that search engines have been including the content of my footer in the result description. This is a problem because my footer is basically a brief legal statement saying that the views are my own and don't represent the company I work for (and yada yada yada). So I need a way to prevent search engines from indexing the content in my footer, or even the footer altogether. I've been looking back through some of my SEO books and searching forums, but this doesn't seem possible. Is it possible to keep search engines from indexing only certain content on a page? If it isn't, what alternatives are there to ensure this legal mumbo jumbo doesn't show up in the results?
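
    Search engines don't offer a way to de-index part of a page, but the visible problem here is the snippet, and that can be steered. A small sketch combining a meta description with Google's data-nosnippet attribute (which Google documents for span, div and section elements; it affects snippets only, not indexing, and other search engines may ignore it):

        <!-- in the <head>: give Google an explicit summary to prefer for the snippet -->
        <meta name="description" content="A short summary you actually want shown in results.">

        <!-- in the page footer: mark the legal text as not snippet material -->
        <footer>
          <div data-nosnippet>
            The views expressed here are my own and do not represent my employer.
          </div>
        </footer>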

    Read the article

  • Does the Adblock Plus extension prevent malicious code from downloading/executing? [closed]

    - by nctrnl
    Firefox and Chrome are my favourite browsers. The main reason is an extension called Adblock Plus. Basically, it blocks all the ad networks if you subscribe to one of the lists, like EasyList. Does it also protect against malicious ads on completely legitimate websites? For instance, several news websites use ad services that may allow a malicious user to insert "evil code". This makes the web very unsafe, especially for those who lack a serious antivirus product.

    Read the article

  • Multiple domain names with pages linking to one website

    - by Mark Ravenhill
    Hello, I work for a company that has been redesigning its company website. I have been asked to register lots of domain names containing the keywords they want to target on the original site. Each of these domains will host a one-page website with a description of what the company offers and a link along the lines of "click here for more information", which then takes you to the main site. The idea is that the main site will then receive a lot of inbound links and hopefully rise in the Google rankings, not to mention bring in more customers who arrive via the other domain names and wouldn't normally have reached the website because it wasn't ranked on the first page. Is this a good idea, or will Google see it as spam and penalise the main site for having loads of links to it from one-page websites hosted on the same nameserver? Any advice would be greatly appreciated. Thanks, Mark.

    Read the article

  • Moving from one DNS provider to another

    - by Senthil Kumaran
    I registered a domain with a particular DNS provider X and have been unhappy with their services, so when the time for renewal came I did not renew and let it expire. I am hoping that once it has expired with this provider, I will be able to register the same domain name with an alternative provider which I have tested and am satisfied with. What kind of precautions should I take? The domain name is not a critical one - it belongs to an NGO - but we would prefer to own it again without any change in the name. The expiry notice says: "Domains can be renewed between 90 days before and 14 days after the expiry date. If domains are not renewed they will be removed from the account and set for deletion." Should I wait until it gets deleted at their end so that I can sign up for the same domain with another provider?

    Read the article

  • Standard ratio of cookies to "visitors"?

    - by Jeff Atwood
    As noted in a recent blog post, We see a large discrepancy between Google Analytics "visitors" and Quantcast "visitors". Also, for reasons we have never figured out, Google Analytics just gets larger numbers than Quantcast. Right now GA is showing more visitors (15 million) on stackoverflow.com alone than Quantcast sees on the whole network (14 million): Why? I don’t know. Either Google Analytics loses cookies sometimes, or Quantcast misses visitors. Counting is an inexact science. We think this is because Quantcast uses a more conservative ratio of cookies-to-visitors. Whereas Google Analytics might consider every cookie a "visitor", Quantcast will only consider every 1.24 cookies a "visitor". This makes sense to me, as people may access our sites from multiple computers, multiple browsers, etcetera. I have two closely related questions: Is there an accepted standard ratio of cookies to visitors? This is obviously an inexact science, but is there any emerging rule of thumb? Is there any more accurate way to count "visitors" to a website other than relying on browser cookies? Or is this just always going to be kind of a best-effort estimation crapshoot no matter how you measure it?
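
    To make the ratio idea concrete, a back-of-the-envelope calculation under the assumption that Quantcast really does fold roughly 1.24 cookies into one visitor while Google Analytics counts each cookie:

        ga_cookies = 15_000_000        # "visitors" as reported by Google Analytics
        cookies_per_visitor = 1.24     # assumed Quantcast-style ratio
        estimated_people = ga_cookies / cookies_per_visitor
        print(round(estimated_people)) # ~12,096,774 -- much closer to the Quantcast figure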

    Read the article

  • Does Google AdWords care about duplicate content?

    - by Yarin
    Our site offers several families of products, all of which share a common set of configurations. For simplicity's sake, say we offer products A, B and C, each available in configurations 1, 2 and 3. We want to create landing page / ad group combinations that reflect each possible pairing of product and configuration. Each product and each configuration has its own copy, so each landing page would include both the product content and the configuration content:
        ourproducts.com/A-1 (contains copy for A and 1)
        ourproducts.com/A-2
        ourproducts.com/A-3
        ourproducts.com/B-1
        ... etc ...
    As you can see, this leads to duplicate content across our product pages, though in different combinations. My question is: does this matter from an AdWords point of view? Will there be any negative consequence to repeating portions of content this way?

    Read the article

  • How can I make a browser trust my SSL certificate when I request resources from an external server?

    - by William David Edwards
    I have installed an SSL certificate on one of my domains and it works perfectly, but on some pages I include a Google Font, which causes the browser to show a mixed-content warning instead of the normal padlock icon. The reason, according to Google Chrome (translated with Google Translate): "Your connection to xxxxxx is encrypted with 128-bit encryption. This page includes other resources which are not secure. These resources can be viewed by others while in transit and can be modified to fit." So how can I make the browser "trust" my SSL certificate even though I request an external resource from Google Fonts? And also, does it matter that I use links like

        <link rel='stylesheet' id='et-shortcodes-css-css' href='https://xxxxxx/wp-content/themes/Divi/epanel/shortcodes/css/shortcodes.css?ver=3.0' type='text/css' media='all' />

    instead of

        <link rel='stylesheet' id='et-shortcodes-css-css' href='wp-content/themes/Divi/epanel/shortcodes/css/shortcodes.css?ver=3.0' type='text/css' media='all' />

    Thanks!
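
    The warning comes from an asset (here the font) being requested over plain http on an https page, not from the certificate itself. A minimal sketch of the usual fix: request every external asset over https; relative hrefs like the second example above inherit the page's scheme, so they are already safe:

        <!-- illustrative family name; the point is the https:// scheme on the external resource -->
        <link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Open+Sans" type="text/css">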

    Read the article

  • yahoo media player not working

    - by luca590
    I have a Yahoo Media Player embedded in my webpage. I am currently using Ruby on Rails to create/edit the page. When I click the play button next to a track, the YMP waits a while and then skips to the next track without playing the first one. I then get a warning on my second (and last) track that its file could not be found. Does anyone have a better recommendation for an audio player, or a way to fix this one?
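
    If swapping players is an option, a minimal alternative sketch is the native HTML5 audio element, which needs no external player script (the file paths are placeholders; in a Rails view the src would typically come from an asset helper):

        <audio controls preload="none">
          <source src="/audio/track-1.mp3" type="audio/mpeg">
          <source src="/audio/track-1.ogg" type="audio/ogg">
          Your browser does not support the audio element.
        </audio>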

    Read the article

  • Fixing Google Chrome text antialias for .ttf fonts

    - by 71GA
    I have found a topic that presents a solution for getting antialiasing to work in Google Chrome on Windows, but it uses the .svg format. I have .ttf files and at the moment I import all of my fonts like this:

        @font-face {font-family: "t1"; src: url(../fonts/title/circle.ttf);}
        @font-face {font-family: "t2"; src: url(../fonts/title/sanserifing.ttf);}
        @font-face {font-family: "t3"; src: url(../fonts/title/serveroff.ttf);}
        @font-face {font-family: "t4"; src: url(../fonts/title/pupcat.ttf);}

    How can I achieve proper antialiasing in Google Chrome on Windows?
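
    A hedged sketch of how that SVG-based workaround could be adapted here: convert each .ttf into the other web-font formats (for example with a webfont generator; the .eot/.woff/.svg files below are assumed conversions, not existing files) and list the SVG source where Chrome on Windows will pick it up, shown for the first family only:

        @font-face {
          font-family: "t1";
          src: url("../fonts/title/circle.eot");                                   /* IE compatibility */
          src: url("../fonts/title/circle.eot?#iefix") format("embedded-opentype"),
               url("../fonts/title/circle.svg#circle") format("svg"),              /* listed early so Chrome on Windows uses it */
               url("../fonts/title/circle.woff") format("woff"),
               url("../fonts/title/circle.ttf") format("truetype");
        }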

    Read the article

  • Website cookie scanner

    - by user359650
    I'm in charge of a relatively big corporate website (circa 95K pages) and need to perform a cookie audit. I can see the cookies issued on a per-page basis with the Chrome or Firefox console, but given the number of pages I need a tool to automate the process. I tried googling for a website cookie scanner, but my search was unfruitful: I found either online tools which only scan the home page, or paid services (ex1, ex2). Does any of you know of a tool that can scan an entire website and generate a report showing which cookies are being used and which pages set them?
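
    A rough starting point that can be scripted in-house, assuming the third-party requests package and a list of URLs (fed by hand below, in practice from the sitemap): it records which cookies each page sets via HTTP headers. The big caveat is that it only sees Set-Cookie headers; cookies dropped by JavaScript or third-party ad/analytics tags need a headless browser instead:

        # pip install requests
        import requests

        PAGES = [
            "https://www.example.com/",
            "https://www.example.com/products/",
        ]  # placeholder URLs -- in practice, parse the sitemap

        report = {}
        for url in PAGES:
            response = requests.get(url, timeout=10)
            for cookie in response.cookies:
                report.setdefault(cookie.name, set()).add(url)

        for name, urls in sorted(report.items()):
            print(name, "->", ", ".join(sorted(urls)))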

    Read the article

  • 250 k 404 & 410 errors in Webmaster Tools. Bad backlinks?

    - by Natália
    Our Webmaster Tools account is showing 250,000 errors related to weird links from other sites. These URLs come mostly from non-existent sites or are being generated directly by our website. Here are some examples:

        oursite.com/&q=videos+caseros+sexo+pornos+gratis&sa=X&ei=R638T8eTO8WphAfF2vG8Bg&ved=0CCAQFjAC%2F%2Fpage%2F2%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F4%2Fpage%2F5%2Fpage%2F4/page/3

    Our site is a popular Spanish adult site, yet we don't target the keywords mentioned in this URL. Apparently this link comes from our site. Some more examples:

        oursite.com/&q=losmejoresvideosporno&sa=X&ei=U__8T-BnqK7RBdjmhYsH&ved=0CBUQFjAA%2F%2Fpage%2F2%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3/page/4

    Once again: not our queries, not our URLs.

        oursite/tag/tetonas

    We think another site may be running an extremely bad SEO policy based on other sites' branding and keyword usage:

        thirdsite/buscador/tetonas-oursite

    The question is: if other sites are generating these URLs, how can we prevent it? Why is the tag page being generated if no link was added on our site? What should we do with these errors - 301? 410 Gone? I have read all the similar Q&As here but none of them seems to solve our problem. It is not likely to be a bad ad (I inspected them all). Maybe some old content that Google suddenly decided to recrawl? Maybe a third party's bad SEO policy? Maybe all of them? Any help will be highly appreciated.
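
    If the decision is to answer these patterns with 410 Gone (which tells Google unambiguously that the URLs are dead), it can be done at the web-server level. A hedged sketch assuming Apache with mod_rewrite, matching only the obviously machine-generated "&q=" paths shown above:

        # .htaccess sketch: reply "410 Gone" to the scraped-search-style URLs
        RewriteEngine On
        RewriteCond %{REQUEST_URI} ^/&q=
        RewriteRule ^ - [G,L]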

    Read the article

  • Bootstrap 3.0.0: How to use data-slide-to outside of indicators?

    - by Griffin
    I am attempting to make a small image gallery - a large main image with a row of smaller thumbnails below it - of the kind you have probably seen, since they are fairly common. While trying to build one with Bootstrap I ran into a major problem: I can't seem to link the smaller bottom images to the larger top one, so that clicking one of the thumbnails changes the selected image. I am attempting to use data-slide-to, however it does not seem to work outside of the carousel-indicators list, and I can't put the thumbnails into the carousel-indicators list because that moves the images up into the gallery (it may be possible to fix this with CSS, but my attempts have been worthless). Does anyone know the problem? I've tried anchor tags around each image and that didn't seem to work; then I tried divs - still nothing. Things to note: I am using 3.0.0; all images are generated (if you haven't guessed already); the smaller images are separate files from the larger one (not auto-scaled down).
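
    For reference, a sketch of how Bootstrap 3's data API is commonly wired up for this kind of thumbnail strip: data-slide-to is handled by a document-level click handler, so it can sit on any element as long as data-target points at the carousel - the thumbnails do not have to live inside .carousel-indicators (file names are placeholders):

        <div id="gallery" class="carousel slide" data-ride="carousel" data-interval="false">
          <div class="carousel-inner">
            <div class="item active"><img src="large-1.jpg" alt=""></div>
            <div class="item"><img src="large-2.jpg" alt=""></div>
          </div>
        </div>

        <div class="gallery-thumbs">
          <img src="thumb-1.jpg" data-target="#gallery" data-slide-to="0" alt="">
          <img src="thumb-2.jpg" data-target="#gallery" data-slide-to="1" alt="">
        </div>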

    Read the article

  • Pagination for product listing, what to use? "canonical" or "rel-prev-next" or do nothing?

    - by Jayapal Chandran
    I want to make sure my product listing shows 10 products per page, which are not in a series (link). They explain how to use canonical or rel prev/next for pagination when a long page has been divided into multiple pages and those pages become a series, whereas my situation is not that: they are unique listings which are not related to one another, and all the listing links lead to a product profile page. So let's say my site is all about cars and I have a Used Audi page with 1,000 Audis for sale. There are 10 used Audi cars on each page, so there are 100 pages in the series. If I start to use rel="prev" and rel="next", should I set page 2 onwards to index,follow or noindex,follow? The content from page 2 all the way to 100 only changes ever so slightly, as different cars are for sale on different pages, but from a "Panda" point of view the pages are incredibly similar: they'd hold the same meta data as page 1 in the series, along with duplicate reviews, news, etc. I want page 1 in the series to be the main page for Google to send users to, and I don't see the point in Google indexing pages 2 to 100. What's everyone's view on this? Lastly, with the rel="canonical" tag, should pages 2 to 100 all point back to page 1 in the series or to the individual page itself? E.g. /used-audi/page-3/.
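
    For what it's worth, the widely cited pattern for a paginated series is rel="prev"/"next" plus a self-referencing canonical on each page, rather than pointing every page's canonical back at page 1 (which would ask Google to ignore pages 2 to 100 entirely). A sketch for page 3, using the URL pattern from the question with a placeholder domain:

        <!-- on http://example.com/used-audi/page-3/ -->
        <link rel="canonical" href="http://example.com/used-audi/page-3/">
        <link rel="prev" href="http://example.com/used-audi/page-2/">
        <link rel="next" href="http://example.com/used-audi/page-4/">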

    Read the article
