Search Results

Search found 9717 results on 389 pages for 'pro'.


  • What effect does using itemprop="significantLinks" on anchors have for SEO?

    - by hdavis84
    As I described in a previous post about span tags within head tags, I'm practicing applying microdata from http://schema.org. Anyone who has browsed the documentation there knows that the intended use of each property could be explained much more clearly. My question here is about the "significantLinks" property and how it affects SEO for in-content anchor text. Does anyone have more information on whether it is good to use for link optimization? I understand that schema.org says it is to be used on "non-navigational links" and that those links should be relevant to the current page's meaning, but will using this property hurt or help each page's SEO? Thanks in advance; an accurate answer helps not just me, but anyone trying to improve their customers' rankings for relevant keywords and bring them more search engine traffic.

    Read the article

  • How to deploy ASP.NET application with MS SQL server database

    - by Maddy
    I want to deploy my website with a MS SQL Server database. This is my first time; I have never done it before. From my googling I have learned that I need a domain (.com/.net/.co) and a host for my web pages (.aspx and .cs files), but I'm confused about whether that host can also hold my database. What I don't understand is where I have to deploy the database: do I have to buy a separate SQL Server database, or can I get a host that includes everything, so I can deploy both my ASP.NET application and the database together?

    Read the article

  • Disable outbound links without letting others know

    - by tadoman
    Is there a way I can tell Google not to follow external links (pointing to other sites) without letting others know? I know you can disable outbound links by putting rel=nofollow on them or by blocking them in robots.txt, but that's something others can see as well. I'm just wondering if there's a way to tell Google not to follow those links without anyone else knowing, like a setting in Webmaster Tools or something similar. (There's definitely one way: I could set an exception in my server's conf file that checks for the "googlebot" user agent and serves it a different version of robots.txt, so that anyone else checking that URL would get a different robots.txt than the one served to Googlebot. However, I'm not too sure Google would be happy about that.) Thank you
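
    For reference, a minimal sketch of the user-agent approach described above, assuming an .htaccess file at the site root with mod_rewrite enabled; the file name robots-googlebot.txt is a hypothetical placeholder, and as noted, serving Googlebot different content than other visitors is cloaking and can be penalized:

        # Hypothetical sketch: serve a separate robots file to Googlebot only.
        # "robots-googlebot.txt" is a placeholder name, not a real convention.
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
        RewriteRule ^robots\.txt$ /robots-googlebot.txt [L]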

    Read the article

  • Can I show a table of one custom variable against another?

    - by Simon
    We have a number of custom variables set up in google analytics. We'd like to show a table of event occurrences broken down by two custom variables, e.g. if variable one can be A, B, or C and variable two can be J, K or L:

        Events |  A  |  B  |  C  |
        -------+-----+-----+-----+
           J   | 345 |  65 |  12 |
           K   | 234 |  43 |   7 |
           L   | 123 |  21 |   4 |
        -------+-----+-----+-----+

    How do I get the information in that format?

    Read the article

  • Google Analytics: How can I track traffic and referrals from iPad applications?

    - by kayaker243
    In Google Analytics, there is extensive information on the mobile device, its version, and the browser version. However, this doesn't seem to go beyond the mobile browser. I would like to determine which application is responsible for visits to my site; specifically, I want to know how many visits are coming from Zite. http://www.handsetdetection.com/properties/vendormodel/Apple/iPad/page:4 seems to indicate this information is probably available. Does Google Analytics expose it, and if so, where?

    Read the article

  • Is Azure Compatible with JPEG XR?

    - by Shawn Eary
    I just put an F#/MVC app into a Windows Azure solution as a Web Role. Before migration, my JPEG XR (*.WDP) files were displayed in IE9 on the client without issue, via both my local and hosted sites. Now, after migration into Windows Azure, my JPEG XR files are not displayed in my local Windows Azure compute emulator, nor are they displayed when deployed to http://*.cloudapp.net. Is there some sort of conflict between Windows Azure and (JPEG XR) *.wdp files? If so, what is the accepted best practice for overcoming it?

    Read the article

  • Why do my websites have a first page rank on Bing and Yahoo but not Google? [closed]

    - by Linda Cullum
    I have three websites suffering from a drop in ranking on Google and hence a huge drop in traffic. The drop occurred suddenly in September and I have not been able to remedy it. For the past 6-10 years my main website http://LearnToSail.Net has ranked from #3 to #1 on the first page of Google and all the other engines for the search term "learn to sail". Now it shows on the first page of Bing and Yahoo but does not show up on ANY page of Google. The only way it does come up is if I add "cd" to the "learn to sail" phrase; we sell a sailing CD on that website. The other websites are http://LearnToSailOnLine.com (search terms "learn to sail online" or "learntosailonline") and historyofthepilgrims.com (search terms "history of the pilgrims" or "historyofthepilgrims"). I get the same result: gone on Google but first page on Bing and Yahoo. I have researched, edited, updated blogs, made sitemaps, prayed to the universe, and used Google Webmaster Tools, but nothing is changing and I have lost a lot of business. I host with 1and1.com and have been back and forth with them, but to no avail and no change in traffic. I thought maybe some DNS mapping was off. I used to have a lot of traffic; now I have hardly any. Any advice would be greatly appreciated. I am still working on the issue, of course! Thank you, LS Cullum, Little Pines Multimedia

    Read the article

  • What to use for "localhost" that includes PHP/SQL functionality?

    - by Jack
    I really do not want to install Linux at the moment: I would have to borrow a USB key, move files, format it, flash it, format it again, move files back, and give it back... What would you recommend that is lightweight, easily and cleanly uninstallable afterwards (I will install Linux when I get a new DVD-ROM, in about two weeks), and also supports PHP and SQL? To be precise, I want to install a WordPress blog and a few plugins, and develop a theme. If there is no such thing for Windows (7, x64 if that matters), let me know too, and I will borrow the USB key after all (even though it's a pain).

    Read the article

  • Why are the stats for HTML Improvements in Google Webmaster Tools not decreasing?

    - by Kookoriko
    I have read that resolved HTML Improvements in Google Webmaster Tools can take as long as 6 weeks to show up, but those numbers only seem to increase, never decrease, even though I've been fixing almost everything that Google points out. I have checked some pages with the Fetch as Google tool and the issues are resolved ("short meta descriptions", for example). Any idea why this is happening?

    Read the article

  • Redirect subdomain (weblog) to new domain without access to .htaccess

    - by fafa
    I have a problem that I can't find a solution for on the web. I have a blog with PR 1 at the subdomain "aaaa.domain.com", where "domain.com" is a blog hosting service. Now I want to buy a domain, "newdomain.com", and I want to tell Google Webmaster Tools to redirect the old subdomain to this new domain and send its traffic there. I can't access .htaccess to use a 301 redirect; the only thing I can do is put code in the HTML. How can I do this? When I use "Change of Address" in Google Webmaster Tools it says: "Restricted to root level domains only".

    Read the article

  • Build My Own Advertising Network

    - by clifgray
    I have a few ideas that I think would be pretty game-changing for online advertising, and I would like to build my own network, but I don't know where to start. I know it will take a lot of time to get major publishers on board, but I am more curious about the technical side. What language, database model, and framework are modern ad networks built on? Basically, I want to build an advertising network that registers views per page, allows publishers to manage the look of their own ads, and lets users interact with the ads. Is there any good information on doing something like this, or any framework you can suggest building on? I know this would get complicated pretty fast, so if you have suggestions for existing ad networks that can be customized heavily, I would be glad to hear them.

    Read the article

  • Rewrite img and link paths with htaccess and serve the file from rewritten path?

    - by frequent
    I have a static mockup page which I want to "customize" by switching a variable used in image src and link href attributes. Paths will look like this:

        <img src="/some/where/VARIABLE/img/1.jpg" alt="" />
        <link rel="some" href="/some/where/VARIABLE/stuff/foo.bar" />

    I'm setting a cookie with the VARIABLE value on the preceding page, and now want to modify the paths accordingly by replacing VARIABLE with the cookie value. I'm an htaccess newbie. This is what I have (it doesn't work):

        <IfModule mod_rewrite.c>
            # get cookie value
            RewriteCond %{HTTP_COOKIE} client=([^;]*)
            # rewrite/redirect to correct file
            RewriteRule ^/VARIABLE/(.+)$ /%1/$1 [L]
        </IfModule>

    My thinking was that the first line gets the cookie value and stores it in %1, and the second line matches VARIABLE, replacing it with the cookie value and keeping whatever comes after VARIABLE in $1. Thanks for shedding some light on what I'm doing, what I'm doing wrong, and whether I can do this at all using htaccess.

    EDIT: I'm sort of halfway through, but it's still not working... Maybe someone can apply the finishing touches:

        <IfModule mod_rewrite.c>
            # check for client cookie
            RewriteCond %{HTTP_COOKIE} (?:^|;\s*)client=([^;]*)
            # check if an image was requested
            RewriteCond %{REQUEST_FILENAME} \.(jpe?g|gif|bmp|png)$
            # exclude these folders
            RewriteCond %{REQUEST_URI} !some/members/logos
            # grab everything before the variable folder and everything afterwards,
            # replace this with first bracket/cookie_value/second bracket
            RewriteRule (^.+)/VARIABLE/(.+)$ $1/%1/$2 [L]
        </IfModule>

    Still can't get it to work, but I think this is the correct way of doing it. Thanks for any help!
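
    A hedged sketch of one way the second attempt above might be made to work, assuming an .htaccess file at the document root and a cookie actually named client. The key detail is that %1 in a RewriteRule refers to the most recently matched RewriteCond, so the cookie condition has to come last in the list; the VARIABLE folder name and the excluded path are kept from the question:

        <IfModule mod_rewrite.c>
            RewriteEngine On
            # only touch image requests, and skip the excluded folder
            RewriteCond %{REQUEST_URI} \.(jpe?g|gif|bmp|png)$ [NC]
            RewriteCond %{REQUEST_URI} !some/members/logos
            # capture the cookie value last, so %1 below refers to this group
            RewriteCond %{HTTP_COOKIE} (?:^|;\s*)client=([^;]+)
            # swap the VARIABLE segment for the cookie value
            RewriteRule ^(.*)/VARIABLE/(.+)$ /$1/%1/$2 [L]
        </IfModule>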

    Read the article

  • Free, specific Ip2Location Database

    - by Andresch Serj
    I am searching for a free database (like an updated XML or CSV file) that maps IP addresses to specific locations. I want more information than just the country: some sort of region or city reference, even if that ends up being a number that makes no sense to me. It doesn't have to be perfectly accurate or always up to date, either. It is just to distinguish between user groups, not to monitor or spy on them.

    Read the article

  • ecommerce options for 5-6 products [closed]

    - by user5252
    Possible Duplicate: Which Ecommerce Script Should I Use?

    We're looking to develop a simple e-commerce solution to sell 5-6 products. We'd rather not use PayPal's buttons (buy it now!) if there's an existing alternative, but because of budget/time constraints we also don't want to roll our own. Are there any small, basic e-commerce solutions available that would allow this? I did look at FoxyCart, but the monthly fee was a bit of a turn-off. (I must sound extremely fussy, I'm aware!) Something like Zen would just be overkill for our needs. Thanks for any suggestions.

    Read the article

  • htaccess 301 redirect for payment page

    - by Chris Robinson
    I have a client who runs a venue and currently has ticket purchases handled by a third party. The way the site works now, there is a standard link in the nav menu to the ticket purchasing site:

        <a href='http://example.com/events'>Events</a>
        <a href='http://example.com/about'>About</a>
        <a href='https://someticketvendor.com/myclient?blah'>Tickets</a>

    They claim that they want to improve their SEO by appearing to integrate the ticket pages into their site. Having spoken to the ticket vendor, they only offer integration through iframes, which is just horrible. I don't really know much about SEO, but I'm wondering if I can create an htaccess rule to have http://example.com/tickets forward to https://someticketvendor.com/myclient?blah. Are there any negative SEO implications to doing this? Is there a better way this could be done?
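
    For what it's worth, a minimal sketch of the kind of .htaccess rule being asked about, assuming Apache with mod_alias and using the placeholder vendor URL from the question; note that a redirect still sends visitors (and search engines) off to the vendor's domain, so it is unlikely to make the ticket pages look like part of example.com:

        # Sketch: redirect example.com/tickets to the third-party ticket vendor
        Redirect 301 /tickets https://someticketvendor.com/myclient?blah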

    Read the article

  • Browser window size statistics?

    - by Litso
    I was wondering: are there any statistics available on what size users have their browser windows set to nowadays? I know the screen resolutions (we have analytics, which shows those as well), but I doubt many people with 1280*xxx and higher resolutions still browse full-screen. My boss is determined to keep our website 900px wide, because that way people with 1800*xxx resolutions can have two browser windows next to each other without having to scroll horizontally. I have never seen anyone browse with two adjacent browser windows like that except here at my current job, so I'm doubting whether this is the best decision or just his personal preference. Can anyone help out here?

    Read the article

  • Significant number of non-HTTP requests hitting my site

    - by Mark Westling
    I'm seeing a significant number of non-HTTP requests hitting a site I just launched. They show up in the server (nginx) logs as non-ASCII and get rejected (correctly) with a 400 status. Here are some lines from the log:

        95.132.198.189 - - [09/Jan/2011:13:53:30 -0500] "œ$A\x10õœ²É9J" 400 173 "-" "-"
        79.100.145.126 - - [09/Jan/2011:13:57:42 -0500] "#§i²¸oYi á¹„\x13VJ—x·—œ\x04N \x1DÔvbÛè½\x10§¬\x1E0œ_^¼+\x09ÜÅ\x08DÌÃiJeT€¿æ]œr\x1EëîyIÐ/ßýúê5Ǹ" 400 173 "-" "-"
        79.100.145.126 - - [09/Jan/2011:13:58:33 -0500] "¯Ú%ø=Œ›D@\x12¼\x1C†ÄÀe\x015mˆàd˜Û%pÛÿ" 400 173 "-" "-"

    What should I make of this? Is this some sort of scripted attack? Or could these be correct requests that have somehow been garbled? They're not affecting the performance of the site and I'm not seeing any other signs of attacks (e.g., no strange POSTs), so at this point I'm more curious than afraid.

    Read the article

  • Visit counts in advanced segments not consistent

    - by user671201
    My organization has recently noticed an issue when applying advanced segments to visit counts during different time ranges. With no advanced segments turned on, here are the visit counts for Oct 1st - Oct 4th during the time range Sept 8th - Oct 8th:

        Oct 1 - 7
        Oct 2 - 7
        Oct 3 - 8
        Oct 4 - 5

    Again, with no advanced segments turned on, here are the visit counts for Oct 1st - Oct 4th, but I've changed the time range to Oct 1st - Oct 4th. As expected, the numbers are the exact same as above:

        Oct 1 - 7
        Oct 2 - 7
        Oct 3 - 8
        Oct 4 - 5

    Now, I turn on the "Non paid search traffic" advanced segment. Here are the visit counts for Oct 1st - Oct 4th during the time range Sept 8th - Oct 8th:

        Oct 1 - 0
        Oct 2 - 0
        Oct 3 - 0
        Oct 4 - 2

    Here is where it gets weird. I keep the advanced segment on, and change the time range to Oct 1st - Oct 4th. This is what I get for the exact same dates as above:

        Oct 1 - 4
        Oct 2 - 2
        Oct 3 - 6
        Oct 4 - 5

    We've found the same inconsistency in our other GA profiles that get much more traffic (the above numbers come from one of our specialized topic blogs), but the inconsistency is less pronounced where there are more visits. My question is: why are the visit counts different for different time ranges when advanced segments are turned on, but exactly the same when no advanced segments are applied? Is this a GA bug or am I missing something about how the advanced segments work?

    Read the article

  • Should I have link rel=next & prev on URLs which have query variables?

    - by user21100
    For example, I have link rel prev & next set up on these product pages:

        site.com?page=2
        site.com?page=3

    (This is my preferred structure, by the way, and I'm trying to get all the ugly URLs littered with query variables deindexed, as they are causing duplicate content.) The above URLs are fine, but once a filter to narrow product results is selected, like "price", the URL looks like this:

        site.com?price[1000-1499]=on
        site.com?page=2&price[1000-1499]=on

    Right now the link rel prev & next tags are added dynamically to the header of these pages, but since I am working on getting these query-variable URLs deindexed, I am wondering if I should get rid of the tags on those pages. Any thoughts?

    Read the article

  • Does Google Analytics exclude Campaign traffic from Facebook in the Social reports?

    - by user1612223
    For a while we have used campaign tags when putting posts on Facebook, so that we can run campaign reports in Google Analytics on those links. However, it appears that traffic from those links is being excluded from Google's Social reports. For example, between 7/20 and 8/19 I'm seeing 123 visits where Facebook is the source in my Campaigns report, but only 29 visits where Facebook is the source in my Social Sources report. Main questions: Does Google exclude campaign traffic from its Social reports? If it does, is there any way to reconcile that, so the traffic shows up in both reports? If it doesn't, what could be causing the vast discrepancy? One observer noted that we are setting the medium to "Post" when passing the campaign parameters, and that Google may only allow "Referral" traffic in its Social reports (just speculation). In that case we could potentially change the medium to "Referral", but that would undermine some of our strategy of being able to set different mediums. I have also considered that maybe the campaign traffic came to the site several times and the Social report counts the same user as fewer visits, but over 70% of the Facebook campaign traffic is new traffic, so at a minimum there would need to be over 85 visits on the Social side for that argument to be valid. I've done several searches on this topic and haven't come across much of anything. I did post the same question on Google's product forum (titled 'Facebook Campaign Traffic Not Showing in Social Reports') and have not gotten a response. The inability to pass campaign data on Facebook posts would make evaluating the performance of those specific posts very difficult, so I'm hoping there is a solution.

    Read the article

  • Dedicated Servers: Is one better than two for a LAMP pseudo-HA setup? [closed]

    - by bikedorkseattle
    Possible Duplicate: How to find web hosting that meets my requirements?

    I know there is endless commentary about hosting out there, but I haven't read much about this. Our current, well-known host is having too many problems, the hardware we are on is subpar, and I'm ready to leave. A day of downtime can cost as much as our monthly hosting bill, and a month of bad performance is just killing us right now, user- and Google-wise. I'm wondering about running two dedicated boxes for LAMP: one as the primary Nginx/Apache box (proxy pass), and the other as the MySQL box. Running a single box scares the bejesus out of me, because who knows how long it will take anyone to fix a RAID card or whatever. The idea is to set this up with some sort of failover system using Pacemaker and Heartbeat: if one server goes down, the other can take over, running both web and db. There are some good articles over at Linode about this. I have a few DBs that are 1GB+ and would like to load them into memory; because of this, I'm shying away from a Linode HA setup, since for the price I could do it with two dedicated servers as described. Am I mad or an idiot? What are people out there doing for pseudo-high-availability, good-performance setups under $400/month? I'm a webmaster; I do a lot of things, none of them that well :)

    Read the article

  • auth_mysql and php [migrated]

    - by user1052448
    I have a directory password protected with auth_mysql in a virtual host file, using a MySQL user/pass combo. The problem I have is that one file inside that directory needs to be accessed without a user/pass. Is there a way I can pass the user/pass within the PHP file, or exclude that one file? What would I put inside the block below?

        <Location /password-protected>
            ...mysql password protection
            require valid-user
        </Location>
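
    A hedged sketch of one common way to exempt a single file, assuming Apache 2.2-style access control and a hypothetical file name public.php (the auth_mysql directives stay in the existing block, untouched):

        # Hypothetical sketch: open up just one file inside the protected location.
        # "public.php" is a placeholder name.
        <Location /password-protected/public.php>
            Order allow,deny
            Allow from all
            # grant access if either host-based access or authentication succeeds
            Satisfy Any
        </Location>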

    Read the article

  • Blocking path scanning

    - by clinisbut
    I'm seeing a number of very suspicious requests in my access log:

        /i
        /im
        /imaa
        /imag
        /image
        /images
        /images/d
        /images/di
        /images/dis

    They build up toward a known resource (in the above example, /images/disrupt.jpg), and all come from the same IP. The request rate varies from 1/sec to 10/sec and seems somewhat random. They are obviously trying to find something, and it seems they are using a script. How do I block this kind of behaviour? I thought of blocking the IP, at least for a given time, keeping in mind that: the request intervals seem legitimate (at least I think so), and I don't want to end up blocking a search engine bot, which may hit 404 URLs too (and that's a different problem, I know). Do they always use the same IP?
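
    If blocking the address turns out to be the way to go, here is a minimal sketch of an .htaccess block, assuming Apache 2.2-style directives and a placeholder IP (not one taken from the logs above); time-limited blocking isn't something plain .htaccess can do by itself, so a log-watching tool such as fail2ban would be needed for that part:

        # Hypothetical sketch: deny a single scanning IP (placeholder address).
        Order allow,deny
        Allow from all
        Deny from 203.0.113.45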

    Read the article

  • Layout Columns - Equal Height

    - by Kyle
    I remember first starting out using tables for layouts and learning that I should not do that. I am working on a new site and cannot seem to get equal-height columns without using tables. Here is an example of the attempt with div tags:

        <div class="row">
            <div class="column">column1</div>
            <div class="column">column2</div>
            <div class="column">column3</div>
            <div style="clear:both"></div>
        </div>

    What I tried was floating the columns left and setting their widths to 33%, which works fine; I use the clear:both div so that the row takes the height of the tallest column. But the columns themselves end up different heights depending on how much content they have. I have found many fixes, which mostly involve CSS hacks and just making it look right rather than being right, and that's not what I want. I thought of doing it in JavaScript, but then it would look different for visitors who disable JavaScript. The only true way of doing it that I can think of is using tables, since the cells in a row all share the same height, but I know it's bad to use tables. After searching forever I came across this: http://intangiblestyle.com/lab/equal-height-columns-with-css/ What it seems to do is exactly the same as tables, since it just sets the display properties to behave like a table. Would using that be just as bad as using tables? I honestly can't find anything else that I could do.

    edit @Su' I have looked into "faux columns" and do not think that is what I want. I think I would be able to implement better designs for my site using the display:table method. I posted this question because I just wasn't sure if I should, since I have always heard it's bad to use tables in website layouts.

    Read the article

  • Monitor offline adwords conversions

    - by Frank Meulenaar
    I'm trying to evaluate the usefulness of Google AdWords for a friend's site. I'm trying to count the number of sales per month and see how many of those customers found her page because of the AdWords campaign. Her site has an online order system, but she also gets customers who buy just via email contact and never use the online order system. There aren't many conversions per month (usually only one to three), so I don't want to miss any when gauging the effectiveness of a campaign. Is there a good way to also include those offline conversions?

    Read the article
