Search Results

  • Could multiple uses of the same keywords in image alt attributes hurt SEO?

    - by saratogahiker
    Let's say that on an e-commerce site selling unique pens, the image on a particular pen's product page has an alt attribute value of "unique red-striped pen", another product has "unique blue-spotted pen", and so on. The keywords shared across all products are "unique" and "pen", which would also be helpful when it comes to SEO. However, if a person goes to the general "unique pens" category page and sees a list of thumbnail images, each with the words "unique" and "pen" in the alt attribute, could that have a negative impact on SEO by repeating the same keywords too many times?

  • Get exact time of each click - AdWords

    - by almo
    I have an AdWords campaign running and want to get a list of all clicks with the exact time each click occurred, and I want to do this without having a tracking code on my homepage. So it needs to be via the API (AdWords/Analytics). You might now think that this is not possible. But big bid-management tools (e.g. Kenshoo) know all the details of every click, and they don't have tracking code on my website (only on the success page of my contact form). How is that possible? The dimensions in AdWords only let me group the clicks at the hour level.
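
    One way to approximate per-click timestamps without any on-page tracking code is to mine the web server's access logs, since every AdWords click arrives with a gclid query parameter attached. A minimal sketch, assuming a combined-format Apache log at a hypothetical path (this is a workaround, not an AdWords API feature):

        import re

        # Hypothetical path to a combined-format Apache access log.
        LOG_PATH = "/var/log/apache2/access.log"

        # Combined log format: IP ... [timestamp] "METHOD /path?query HTTP/1.x" ...
        LINE_RE = re.compile(r'\[(?P<ts>[^\]]+)\] "(?:GET|POST) (?P<path>\S+)')

        clicks = []
        with open(LOG_PATH) as log:
            for line in log:
                m = LINE_RE.search(line)
                # Requests carrying a gclid parameter came from an AdWords click.
                if m and "gclid=" in m.group("path"):
                    clicks.append((m.group("ts"), m.group("path")))

        for ts, path in clicks:
            print(ts, path)

    This is plausibly how tools that only tag the destination URL reconstruct click times: the timestamp comes from the HTTP request itself, not from any JavaScript on the page.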

  • MySQL can only log in as root, even after creating new users with their own database

    - by ionFish
    Problem: I just set up a Debian Wheezy installation for testing and installed the LAMP packages and phpMyAdmin. I can log in as root with my pre-defined password and create/edit/delete both databases and users. The problem comes when I create a new user 'something', set a password for it, and grant it all privileges on a database 'something' (same as the username). Upon connecting, access is denied for that user. Details: the host is localhost, running MySQL 5.5.24-8. Creating the user:

        CREATE USER 'something'@'%' IDENTIFIED BY '***';
        GRANT USAGE ON *.* TO 'something'@'%' IDENTIFIED BY '***'
            WITH MAX_QUERIES_PER_HOUR 0 MAX_CONNECTIONS_PER_HOUR 0
                 MAX_UPDATES_PER_HOUR 0 MAX_USER_CONNECTIONS 0;
        CREATE DATABASE IF NOT EXISTS `something`;
        GRANT ALL PRIVILEGES ON `something`.* TO 'something'@'%';

    Checking privileges:

        GRANT USAGE ON *.* TO 'something'@'%' IDENTIFIED BY PASSWORD '*92F9DAF5F5129554509489FDB6A433510223C799';

    Result: Access denied for user 'something'@'localhost' (using password: YES). More info: I used this exact procedure on the Squeeze distribution and it worked perfectly. Is there a chance it's because of Wheezy, or something else? I need to continue using Wheezy because of the updated packages (for this test server -- the others work fine), so "just use Squeeze" is not an option. Note: I HAVE tried FLUSH PRIVILEGES; to no avail.
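
    A common cause on Debian is the anonymous account ''@'localhost', which MySQL matches ahead of the wildcard 'something'@'%' for local connections. This diagnosis is an assumption, not something confirmed in the question, but here is a sketch of checks and fixes to try:

        -- See which account MySQL actually authenticated the connection as.
        SELECT USER(), CURRENT_USER();

        -- List accounts; an empty User value on localhost is the anonymous account.
        SELECT User, Host FROM mysql.user;

        -- Either drop the anonymous account...
        DROP USER ''@'localhost';
        -- ...or create a localhost-specific account so it wins the host match.
        CREATE USER 'something'@'localhost' IDENTIFIED BY '***';
        GRANT ALL PRIVILEGES ON `something`.* TO 'something'@'localhost';
        FLUSH PRIVILEGES;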

  • backlinks: Two domains with same IP

    - by Michal
    I run several different web pages on different servers (with different IP addresses). These pages link to each other in order to boost the number of backlinks pointing to my pages. I would like to move all those projects to a single virtual host (with a single IP address). My question is: how does Google handle links between different domain names on the same IP address? Is there some penalty for it? Could this lead to a lower PageRank?

  • I think there is a problem with my url encoding

    - by TheGateKeeper
    I took someone's advice and started encoding my image sources, but Google doesn't seem to be able to decode them. I probably did something wrong, because basically Google is taking the full path as the image's name. See this page as an example: if you go to the topmost thumbnail and do "Save as", you will see the path is not being decoded. Should I stop encoding, or am I doing it wrong? Should I encode only the image name itself? Thanks!
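
    For what it's worth, the usual practice is to percent-encode only the individual path segments (such as the file name), never the "/" separators. A minimal sketch with Python's standard library, using a hypothetical image name:

        from urllib.parse import quote

        # Hypothetical image name containing characters that need escaping.
        name = "red striped pen #2.jpg"

        # Wrong: encoding the whole path also escapes the "/" separators.
        print(quote("/images/products/" + name, safe=""))
        # -> %2Fimages%2Fproducts%2Fred%20striped%20pen%20%232.jpg

        # Better: encode only the file name and keep the path structure intact.
        print("/images/products/" + quote(name))
        # -> /images/products/red%20striped%20pen%20%232.jpg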

  • Comparisons of Javascript 'data grids'?

    - by Joe
    I've found plenty of questions between here and Stack Exchange of people asking for the "best" data grid / data table, or one that has a particular feature, and plenty of lists out there (of various ages) listing the various data grid implementations ... but is anyone aware of a matrix of which features the various solutions implement? (E.g., allow shift-click to select multiple rows; support checkboxes for selection; can update a regular table in place; allow editing of cells; support WebSQL or IndexedDB for local caching; which browsers they support; infinite scroll; etc.) There's a generic JavaScript framework comparison on Wikipedia, which is the sort of thing I'm looking for, but it doesn't go into detail on data grids. (Which makes sense, as many grids are extensions, not core features of those frameworks, and in the case of jQuery there are lots of them.)

  • Improving FAQ SEO with multiple pages?

    - by asdfasdf
    I have a client who has over 200 question/answer style content blocks. Neither the questions nor the answers are very long, and most of the questions are almost identical, with only a word or two differentiating them. Would SEO be helped or hurt if I were to put each Q&A on its own page, with the page title being the question asked, etc.? Or would that be considered "farming"? If not, what would be the best way (in the SEO world) to present all these Q&As? Thanks for any advice.

  • Making a calendar tool with Google API. Should I use OAuth Service Account?

    - by Goluxas
    I'm creating a calendar tool for a client. He has his work calendars in a Google account, which I have access to. I'll be using the Google Calendar API for the tool, but I'm new to Google APIs and the OAuth stuff. I'll be using PHP. Do I need to use an OAuth2 Service Account, or should I make a regular OAuth2 Web Application Client ID from that Google account? In case this information is helpful, the tool I will be making will do this: 1) Allow my client to fill out a form which will create a new calendar and populate it with several events. 2) Send out an email to a mailing list twice a week listing changes that have been made to these calendars. I'll also be making a page that displays the iframes for each calendar, so his clients may see them even if they do not have a Google account.
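
    For the unattended parts (creating calendars from a form submission, emailing a twice-weekly digest), a service account avoids any user-consent prompt. A minimal sketch of that flow, shown in Python for brevity since the PHP client follows the same pattern; the key-file path, calendar name, and client address are hypothetical:

        from google.oauth2 import service_account
        from googleapiclient.discovery import build

        SCOPES = ["https://www.googleapis.com/auth/calendar"]

        # Hypothetical key file downloaded when creating the service account.
        creds = service_account.Credentials.from_service_account_file(
            "service-account-key.json", scopes=SCOPES
        )
        service = build("calendar", "v3", credentials=creds)

        # Create a new calendar; it is owned by the service account itself...
        cal = service.calendars().insert(
            body={"summary": "Client work calendar"}
        ).execute()

        # ...so share it with the client's own Google account to make it visible there.
        rule = {"scope": {"type": "user", "value": "client@example.com"}, "role": "writer"}
        service.acl().insert(calendarId=cal["id"], body=rule).execute()

    One caveat worth knowing: calendars created this way live under the service account, not under the client's Google account, so they must be shared (as above), or the tool must act as the client via the regular web-application OAuth flow.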

  • Good site building for little kids [closed]

    - by guy mograbi
    I am teaching kids to write 3D games with Unity. Now I want to publish their games online along with some other stuff. I don't want to teach them HTML, CSS, etc. I don't mind buying the domain (after a bad experience with .TK domains I concluded buying one is better), so all I need is hosting, ideally with a site builder that has a nice interface. I couldn't find one that seems to be the right fit. Can you recommend one? Static HTML hosting will do, but I'd prefer PHP support and a database, in case we need to implement a login mechanism.

  • htaccess url rewrite

    - by user761396
    I used to have these rewrite rules:

        RewriteRule ^([^/]+)\.htm$ index.php?c=$1 [NC]
        RewriteRule ^([^/]+)\.htm/([0-9.]+)$ index.php?c=$1&amt=$2 [NC]

    Now I have to change them to:

        RewriteRule ^1/([^/]+)\.htm$ index.php?c=$1 [NC]
        RewriteRule ^1/([^/]+)\.htm/([0-9.]+)$ index.php?c=$1&amt=$2 [NC]
        RewriteRule ^2/([^/]+)\.htm$ index2.php?c=$1 [NC]
        RewriteRule ^2/([^/]+)\.htm/([0-9.]+)$ index2.php?c=$1&amt=$2 [NC]

    (The difference is the added subdirectory.) My question is how to redirect my old URLs to the 1/ subdirectory. Thank you.
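
    A hedged sketch of one way to do it with an external 301, placed alongside the rules above (this assumes the old flat URLs live at the document root and nothing else matches the bare .htm pattern):

        # 301 any old flat URL (foo.htm or foo.htm/1.23) to the new 1/ prefix.
        # The redirected URL starts with "1/", so it cannot match this pattern
        # again and loop; it falls through to the internal 1/ rules instead.
        RewriteRule ^([^/]+\.htm(/[0-9.]+)?)$ /1/$1 [R=301,L,NC]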

  • Why are new pages not being indexed and old pages stay in the index?

    - by ZakGottlieb
    I currently have a site that was recently restructured, causing much of its content to be reposted and creating new URLs for each page. To avoid duplicates, all of the existing pages were added to the robots.txt file. That said, it has now been over a week -- I know Google has recrawled the site -- and when I search for term X, it is still the old page that ranks, with the new one nowhere to be seen. I'm assuming it's a cached version, but why are so many of the old pages still appearing in the index? Furthermore, all "tags" pages (it's a Q&A site, like this one) were also added to robots.txt a few months ago, yet I think they are all still appearing in the index. Does anyone have any ideas about why this is happening, and how I can get my new pages indexed?

  • Making HTML5 videos stored on AWS S3 **difficult** to download (because I can't make it impossible)

    - by Jimmery
    I am building a website that hosts videos stored on AWS's S3 service. The videos are played through an HTML5 player we have built. I've just been asked to make sure "nobody can steal our videos". Now, I know that if you really don't want something stolen, you don't put it up on the internet. However, I need to secure these videos as well as possible; at the very least they need to resist someone going through the source code and trying to download them manually. One option available to me is to completely rebuild the video player in Flash. This is not ideal for several reasons, notably because I would then also have to build an app for mobile devices to view this site. So I am looking for other options. I have heard about using a token to make the file available only during certain times. I have heard of using a separate file to serve the videos that sits between the HTML5 page and the video file. I am also having a look at IAM, the secure AWS access control, in the hope that AWS can solve this problem for me. Can anyone here recommend any of these options? Or perhaps suggest other options available to me? Any help would be greatly appreciated.
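
    The "token" approach mentioned above maps to S3 pre-signed URLs: the bucket stays private and the player is handed a short-lived link instead. A minimal sketch with boto3; the bucket and key names are hypothetical:

        import boto3

        s3 = boto3.client("s3")

        # Generate a link that is valid for only five minutes; the bucket
        # itself stays private, so the raw object URL returns Access Denied.
        url = s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": "my-video-bucket", "Key": "videos/clip.mp4"},
            ExpiresIn=300,
        )
        print(url)  # embed this as the <video> src when rendering the page

    This only raises the bar, of course: anyone who can play the video can still capture it during the validity window, which matches the "difficult, not impossible" goal.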

  • How to edit thousands of html pages at once? [on hold]

    - by Johnsy Omniscient
    I need to edit thousands of pages for a website whose content was added manually by the owner over three years. It has thousands of pages, and I'm sure there is a better way to edit them than spending hours opening each one. I know it would be easy to just edit styles.css, but page elements like the positions of the Google ad boxes are edited individually inside the HTML of each page, so there is no way to solve this through CSS. Is there some sort of script or macro that can edit all the pages at once?
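
    If the pages exist as files on disk and the ad-box markup is consistent, a script can sweep the whole tree. A minimal sketch, assuming a hypothetical site root and a hypothetical snippet to swap (run it on a backup copy first):

        import os

        SITE_ROOT = "/var/www/site"          # hypothetical document root
        OLD = '<div class="ad-box-top">'     # hypothetical markup to change
        NEW = '<div class="ad-box-side">'

        for dirpath, _dirs, files in os.walk(SITE_ROOT):
            for name in files:
                if not name.endswith((".html", ".htm")):
                    continue
                path = os.path.join(dirpath, name)
                with open(path, encoding="utf-8", errors="replace") as f:
                    text = f.read()
                if OLD in text:
                    # Rewrite the file only when the target markup is present.
                    with open(path, "w", encoding="utf-8") as f:
                        f.write(text.replace(OLD, NEW))
                    print("updated", path)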

  • How do I forward/redirect a website from a folder in a subdomain to another server?

    - by dozza
    I have a client with a site at subdomain.theirdomain.com/folder. It's a 50 GB gallery site that I've now cloned and have locally in MAMP. Once I've made some changes to it, I need to host it on an alternative physical server (I intend to use a dedicated server I have access to, which currently has a technical domain name). However, the client would ideally like to keep the existing URL, as it has been used extensively in marketing. I've done HTTP redirects, forwards, and 301 redirects in the past, but I'm not sure how, or even if, I can do what the client wants. How can I achieve this, possibly using .htaccess and DNS entries? Caveats: I can't have the site at a third-party domain, and the client isn't able/allowed to register any additional domains.
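
    Since the visible URL has to stay the same, a plain 301 to the new server won't satisfy the client; a reverse proxy on the machine that serves subdomain.theirdomain.com can fetch the content from the new host transparently instead. A hedged sketch for Apache with mod_proxy, assuming you keep control of the original server's config (the new hostname is hypothetical):

        # In the vhost for subdomain.theirdomain.com; requires mod_proxy.
        # Requests under /folder are answered by the new box, but the
        # visitor's address bar keeps the original URL.
        ProxyPass        /folder/ https://newhost.example.com/folder/
        ProxyPassReverse /folder/ https://newhost.example.com/folder/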

  • Can I create a private shop with a public gallery with Magento?

    - by Sylario
    I want to build a shop with a public gallery listing all my products, with text and images. The gallery is not a shop; you cannot buy from it. Behind it, I want a private section with a shop, where the same products are listed in tables with all the options necessary to buy. Each customer of the shop will have a separate account. In both the shop and the gallery, products appear on different pages according to their categories. Is Magento able to do that?

  • Why is my site cached 2 times per day?

    - by clarawood
    I have read the FAQs and checked for similar issues: yes. My site's URL (web address) is www.adultxdating.com. Description (including timeline of any changes made): I have lost my top search listings over the last 4 months. I am still working on this but not getting proper guidance. The site is being cached 2 times in 24 hours. Sometimes the site gets back into the top 10 listings for hundreds of keywords; sometimes it drops out beyond 1000. Can anybody help me understand why this is happening? I have more than 200K incoming links and update the site regularly. Please help. Thanks, clara wood

  • Why am I getting this message "Some important page is blocked by robots.txt"?

    - by Rounak
    My site's URL is www.hackinguniverse.org. For some days, a message has been showing in Google Webmaster Tools that says "Some important page is blocked by robots.txt". My robots.txt is:

        User-agent: Mediapartners-Google
        Disallow:

        User-agent: *
        Disallow: /search
        Allow: /

        Sitemap: http://www.hackinguniverse.org/feeds/posts/default?orderby=updated

    For other information: I host this website on Blogger. I have some other sites whose robots.txt files also only include "/search" in Disallow, like this one, but those sites are fine -- I mean, no such message is showing for them. So why am I getting a message saying that I have blocked some important page via robots.txt?

  • Site returning 404 header to google, not sure why

    - by Damon
    A Drupal site that works fine for regular users returns a 404 Not Found error when I try to use the W3C validator on it; it is also not being indexed by Google at all (which is the main issue, but I suspect there is a connection). It is an https:// site with an .htaccess rule to redirect any http:// request to https://. I had it running in Google Webmaster Tools and thought it was fine, but it turns out I had not added the https domain. After adding the https domain, it is also returning this header:

        HTTP/1.1 404 Not Found
        Date: Mon, 15 Oct 2012 19:37:43 GMT
        Server: Apache
        Expires: Sun, 19 Nov 1978 05:00:00 GMT
        Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0

    robots.txt just has:

        User-agent: *
        Crawl-delay: 10
        # Files
        Disallow: /cron.php

    How can I check what the issue is here?
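
    One way to narrow this down is to request the page with different User-Agent strings and compare the status codes, since Drupal modules or server rules sometimes serve 404s only to non-browser agents. A minimal sketch; the URL is a placeholder:

        import requests

        URL = "https://example.com/"  # placeholder for the affected site

        AGENTS = {
            "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
            "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
            "validator": "W3C_Validator/1.3",
        }

        for name, ua in AGENTS.items():
            r = requests.get(URL, headers={"User-Agent": ua}, allow_redirects=True)
            # A 200 for "browser" but 404 for the rest points at UA-based blocking.
            print(f"{name:10s} -> {r.status_code} (final URL: {r.url})")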

  • How to set up a local webDAV server for use with Goodreader on the ipad? [migrated]

    - by confused-about-webdav
    I need to know how to set up a local WebDAV server on my PC so that GoodReader on the iPad can automatically sync with it over the local Wi-Fi network. I am really a rookie when it comes to setting up a web server and have tried various guides on the internet. I tried setting up a WebDAV server using IIS, forwarded the required ports, and enabled WebDAV publishing, but GoodReader can neither find it automatically on the local Wi-Fi network nor connect even after I manually enter the credentials. I'd be really grateful if someone who has successfully set up a WebDAV server for use with GoodReader could point me to how to do it.

  • Problem in domain name of site in google search [on hold]

    - by Jayadratha Mondal
    My domain provider doesn't support advanced DNS at the moment, so I have used an iframe to forward to my web server. Suppose my domain abc.com serves a simple HTML page which opens xyz.com via an iframe. Google used to show abc.com in the search results, but for the last two days it has been showing xyz.com; other things like the description and the name of the site are fine, only the domain has changed to xyz.com, and I want it to show abc.com. When I type the full domain, Google shows the domain as I want, but if I type a part of it or any keyword, it shows xyz.com. Does anyone have any idea how to make it show abc.com again? EDIT: I don't know who the domain name provider is because the domain is not mine. When I try to set a new A record it says "you need advanced DNS to do this; currently you are using simple DNS". I'm doing this from cPanel. Normally, as far as I know, there are three sections: 1. domain name, 2. A record, 3. CNAME. But I don't have any domain name section.

  • Redirect subdomain to local pc

    - by user1188570
    I have a home web server which is constantly running. Is it possible to create a subdomain that redirects traffic to another local PC? For example, I have one server and one notebook (with a web server installed for development). At the moment I can access the notebook only from the local network, by IP. The server is also hosting the domain example.com. I would like to visit laptop.example.com and reach my laptop.
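
    If the always-on server runs Apache, one approach is an A record pointing laptop.example.com at the same public IP as example.com, plus a name-based virtual host that reverse-proxies to the notebook's LAN address. A hedged sketch (the LAN IP is hypothetical, and mod_proxy must be enabled):

        # DNS: laptop.example.com -> same public IP as example.com.
        # Apache vhost on the always-on home server:
        <VirtualHost *:80>
            ServerName laptop.example.com
            # Forward everything to the notebook on the local network.
            ProxyPass        / http://192.168.1.50/
            ProxyPassReverse / http://192.168.1.50/
        </VirtualHost>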

  • Google analytics campaign advice

    - by Drewsdesign
    I am buying traffic from a broker, not from one source, and sending it to various landing pages. I would like to know the best way to structure a campaign so I can find out which referring site/URL is performing best (time on site, bounce rate, etc.). Should utm_campaign be the broker name and utm_source be the landing-page name, or should it be the other way around? Also, what would be the best way to create a custom report that shows all the referrer metrics for each landing page? Thanks guys, I really appreciate any help on this.
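
    For reference, the usual convention is that utm_source identifies where the traffic comes from (the broker or referring site) and utm_campaign identifies the push it belongs to; the landing page is already captured by the URL path, so it doesn't need its own parameter. A hypothetical tagged URL under that convention:

        http://www.example.com/landing-a/?utm_source=broker-name&utm_medium=cpc&utm_campaign=spring-promo

    With that scheme, a custom report using Landing Page as the primary dimension and Source as the secondary one (or the reverse) breaks the engagement metrics down both ways.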

  • A drop in SERP after following webmaster guidelines [on hold]

    - by digiwig
    So here's a puzzle for all you SEO gurus out there. I recently launched my own site. I had target keywords which ranked very well for about a month, within the top five and even appearing in first place. In an attempt to maintain good positioning, I followed the guidelines: I added a robots.txt and an XML sitemap, redirected non-www to www, redirected index.php to the root domain, added htaccess 301 redirects for old pages, added rich snippets, and created a Google+ account with a verified picture so it appears in results. I went through each of the Webmaster Tools issues with duplicate titles and meta descriptions, improved the header-tag document outlines, and even created a few more blog posts to keep the content fresh and moving. So now my website appears on page 2 for my target keywords -- and all because I followed the guidelines. What is happening? I see competitors with stagnant content superglued to position 1.

  • What to use for an event listing site? [on hold]

    - by Vykintas
    I have a site that lets users buy and sell tickets, but it's built on WordPress and BuddyPress, so it's very heavy and messy. I would like to redo the whole site on something lighter, cleaner, and more solid. The main functionality for users would have to be as follows: register or log in via Facebook. Create events and sell tickets to them. See ticket sales statistics. Upload photos and associate them with events. Buy event tickets and print a PDF ticket. Comment on, favourite, and like events. What would be your suggestions? A PHP framework? A CMS? A CMF? I must say that I'm a front-end dev, so building a system from scratch on my own would take a while. I'd be more interested in a "skeleton" app solution or something similar.

  • Somehow Google considers a properly 301'd URL as 200 and is still indexing the new content in old page?

    - by user2178914
    We redirected all the old URLs to the new ones properly using .htaccess. The problem is that Google somehow is still finding content at the old page (which it shouldn't) and stores it in its cache under the old URL rather than the new one. For example:

        Old page: http://www.natures-energies.com/iching.htm
        New page: http://www.natures-energies.com/index.php?option=com_content&view=article&id=760

    If you type the old URL into the browser, it redirects. If you fetch the old URL as Googlebot in Webmaster Tools, the header says 301/permanently redirected. If I try to crawl as any other bot, it still says 301 redirected. Even if you click the old link in Google, it redirects to the new URL. Only in its cache does Google show the old URL -- and moreover, it shows the new content there! I am stumped as to how Google manages to grab the new content and put it under the old URL instead of the new one. One more interesting thing: if I request a cache of the new page, it shows a cache of the new content under the old URL! Any help would be appreciated; I am at the end of my wits and think I have tried almost everything. Is there anything I'm missing? You can use this search to find the old URLs -- maybe you'll see some patterns that I missed:

        site:www.natures-energies.com inurl:htm -inurl:https|index
