Search Results

Search found 9728 results on 390 pages for 'zee pro'.

Page 168/390 | < Previous Page | 164 165 166 167 168 169 170 171 172 173 174 175  | Next Page >

  • One-to-many problem with implementing 301 redirects after changed URLs

    - by user16136
    I had an old dynamic URL scheme which I have now split into multiple static URLs, e.g. www.mydomain.com/product.php?type=1&id=2, www.mydomain.com/product.php?type=2&id=3, www.mydomain.com/product.php?type=2&id=4, etc., which I have changed to something like www.mydomain.com/electronics/radio, www.mydomain.com/electronics/television, www.mydomain.com/mobile/smartphone, etc. Google has indexed the dynamic URLs, and search results still show the old URLs; I want search to point to the new URLs. I have kept the old URLs active, so both forms work. How can I set up a 301 redirect in this case? I run IIS, and it only allows a page to be redirected to one URL. Should I deactivate the old dynamic URLs? In that case I lose all the previous SEO rankings. (One per-URL rewrite approach is sketched below.)

    Read the article
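
    A minimal sketch of the one-to-many case above, assuming IIS 7+ with the URL Rewrite module installed; the rule name and target paths are illustrative assumptions, not part of the original question. Since the distinction lives in the query string, each old type/id pair gets its own rule:

        <!-- web.config fragment: one permanent redirect per old dynamic URL -->
        <system.webServer>
          <rewrite>
            <rules>
              <rule name="OldRadioUrl" stopProcessing="true">
                <match url="^product\.php$" />
                <conditions>
                  <add input="{QUERY_STRING}" pattern="^type=1&amp;id=2$" />
                </conditions>
                <!-- appendQueryString="false" drops ?type=1&id=2 from the target -->
                <action type="Redirect" url="/electronics/radio"
                        redirectType="Permanent" appendQueryString="false" />
              </rule>
              <!-- repeat one rule per old type/id combination -->
            </rules>
          </rewrite>
        </system.webServer>

    With 301s in place the old URLs can stay active; a permanent redirect is generally understood to pass most of the accumulated ranking signal to the new static URLs, so deactivating the old ones outright should not be necessary.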

  • Tracking first visit date in Google Analytics

    - by dusan
    I want to track a site's visitor retention using Google Analytics, to see whether unregistered users return to it within a time frame of 2+ months from now. This blog post seems to be on the right track, but I want to track unregistered users, so I don't have a "join date" or similar variable at my disposal. This other blog post suggests using all 5 GA custom variables: the first variable slot on week 1, slot 2 on week 2, and so on. That method only allows me to track 5 weeks of visitors. I want to track more than 5 weeks, so I was thinking of using two custom variables in GA: the visitor's first visit date and the visitor's last visit date. How can I save the first visit date? If I save another value in the same slot, the old value is overwritten, and I don't know how reliable it is to save the variable conditionally (reading the __utmv cookie to check whether the "first visit date" is set, and saving the current date only if it isn't). (A sketch of the conditional approach follows below.)

    Read the article
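
    A minimal sketch of the conditional-write idea from the question, assuming the classic ga.js _gaq queue; the slot number, variable name and cookie-sniffing regex are assumptions. _setCustomVar with scope 1 is the documented visitor-level scope, and it must be pushed before _trackPageview:

        // Set a visitor-scoped custom variable in slot 1 only on the first visit.
        // The __utmv cookie stores previously set visitor-level custom variables.
        var alreadySet = /__utmv=[^;]*first_visit/.test(document.cookie);
        if (!alreadySet) {
          var d = new Date();
          var today = d.getFullYear() + '-' + (d.getMonth() + 1) + '-' + d.getDate();
          // slot 1, name, value, scope 1 = visitor (persists across sessions)
          _gaq.push(['_setCustomVar', 1, 'first_visit', today, 1]);
        }
        _gaq.push(['_trackPageview']);

    The caveat from the question stands: if the visitor clears cookies, __utmv disappears and the "first visit" date gets set again, so this is only as reliable as the visitor's cookies.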

  • URL hex characters in .htaccess

    - by Steve
    There is an old page with a space in the filename, and this page is no longer found on the website, so I need to redirect it to another page using a 301 redirect in .htaccess. If I place the filename directly into .htaccess (Bouquets%20%26%20Loose.html), the redirect does not work. If I escape the % signs (Bouquets\%20\%26\%20Loose.html), the redirect still does not work. How do I get this redirect to work in .htaccess? Thanks. (A decoded-path approach is sketched below.)

    Read the article
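
    A minimal sketch of one likely fix, assuming Apache 2.x with mod_alias (the target URL is an assumption). Apache matches Redirect directives against the already-decoded URL path, so the rule has to contain a literal space and ampersand rather than %20 and %26:

        # mod_alias: quote the path so the literal space and & survive parsing
        Redirect 301 "/Bouquets & Loose.html" /bouquets-and-loose.html

        # mod_rewrite equivalent; the pattern is also matched against the decoded path
        RewriteEngine On
        RewriteRule "^Bouquets\ &\ Loose\.html$" /bouquets-and-loose.html [R=301,L]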

  • Password protect an aliased virtual directory

    - by Jason
    I have a main domain hosted through cPanel. I also have a subdomain that I would like to appear as a path under the main domain instead of as a subdomain. So I have http://example.com/ pointing to the main hosted files, and http://example.com/mydir pointing to the subdomain files. This is achieved by an httpd.conf include in the main domain's section that sets an alias:

        Alias /mydir /path/to/subdomain/files/

    Now, that works fine so far. The problem is that if a .htaccess file under /path/to/subdomain/files/ contains an error, the alias is skipped entirely, and /mydir is served from the main host's files instead. That is kind of surprising to me; I would expect an error to return an error. Now the killer: if I try to password protect /path/to/subdomain/files/, then trying to access http://example.com/mydir will again deliver from under the main hosted files and not from /path/to/subdomain/files/. I am not seeing any errors reported for the .htaccess file in the Apache error log, so I am assuming the .htaccess is valid:

        AuthUserFile /path/to/valid/readable/.htpasswd
        AuthName "Secure Access"
        AuthType Basic
        Require valid-user

    This kind of behaviour does not seem right to me. Is there something obvious that could be causing it? Or is this just the way it works? Perhaps using an alias is the wrong way to go? (A server-config alternative is sketched below.)

    Read the article
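
    A minimal sketch of one workaround, assuming the same httpd.conf include that defines the alias is editable (paths are taken from the question). Moving the auth into a Directory block in the server config sidesteps whatever causes the per-directory .htaccess to be bypassed:

        Alias /mydir /path/to/subdomain/files/

        <Directory "/path/to/subdomain/files">
            # Auth lives in the server config, so no .htaccess is required
            AuthType Basic
            AuthName "Secure Access"
            AuthUserFile /path/to/valid/readable/.htpasswd
            Require valid-user

            # Optionally ignore .htaccess entirely for this tree
            AllowOverride None
        </Directory>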

  • How to get a new site indexed by Alexa

    - by JohnMerlino
    When I search for my site on Alexa, it says "Alexa Traffic Rank: No Data". So I googled the issue and came across this page: http://www.loudable.com/my-website-data-not-showing-in-alexa-get-your-website-crawled-by-alexasolution.html It says that to get a site indexed, you click "Crawl my site" on the webmasters page. However, there is no longer a link that says "Crawl my site". So, as of now, does anyone know how to get a site indexed by Alexa so that my traffic rank will display?

    Read the article

  • Hosting advice for a write-heavy dynamic website

    - by Rahul Rawat
    I have built a website using PHP and MySQL and am now looking for a hosting service. Within a week's time I am expecting about 1,000 registered users and about 5-10k pageviews/day. So which host should I opt for? The site lets users submit content stored as blobs, plus around 10 pictures per user. I hope that traffic will increase, so can JustHost's or BlueHost's shared hosting serve that purpose, or should I go for something more dedicated? Basically the site is write-heavy, averages 2-3 MySQL queries per page, and is quite dynamic. Depending on these requirements, which web hosting would be optimal for me?

    Read the article

  • IE8 HTTPS Download Issue

    - by Jon Egerton
    I have a problem with a system I develop, related to IE8 downloads over SSL (i.e. on sites using https://...), as described in this MS KB article: http://support.microsoft.com/kb/323308 We use the HttpCacheability.NoCache option, as the data being downloaded is sensitive and is downloaded from a secured site; I don't want that data to be cached on any of the proxies the response passes through on its way back to the client. The article describing the issue details a fix to the client-side registry, changing a BypassSSLNoCacheCheck setting. I don't want to loosen security at the client end just for IE8, as the system works fine on anything more up to date, and getting all the clients to apply the hotfix is difficult at best and impossible at worst. We need to support IE8 in the system, at least for now. So: 1. Does the detailed hotfix have any implications for security at the browser end in IE8? Will the file be cached (in a place other than where the user saves the file)? 2. Is there some way I can make these files downloadable with a change at the server end that doesn't break the security side of things? (A server-side header sketch follows below.)

    Read the article
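
    A minimal sketch of the commonly cited server-side workaround, assuming an ASP.NET download handler; the filename, content type and fileBytes variable are assumptions. KB 323308 describes IE failing HTTPS downloads when Cache-Control: no-cache (or Pragma: no-cache) is present; sending Cache-Control: private instead still forbids shared proxy caches while letting IE8 write the temporary file it needs to complete the download:

        // ASP.NET handler: avoid no-cache/Pragma headers on HTTPS downloads
        Response.ClearHeaders();
        Response.ContentType = "application/octet-stream";
        Response.AppendHeader("Content-Disposition", "attachment; filename=report.pdf");

        // "private" blocks shared (proxy) caches but allows the browser's own
        // temporary storage, which IE8 requires for SSL downloads.
        Response.Cache.SetCacheability(HttpCacheability.Private);
        Response.Cache.SetMaxAge(TimeSpan.Zero);

        Response.BinaryWrite(fileBytes); // fileBytes: the sensitive payload (assumption)
        Response.End();

    The trade-off: "private" means the file may briefly exist in the browser cache on the client machine, so this protects proxies and the wire, not a shared workstation.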

  • AdSense ads are not a good fit for my site

    - by Ryan Grush
    I run an academic network for college students to communicate at particular universities, and we run Google AdSense. The site pulls in a decent amount for a side project, but our CTR is horrible (<0.2%) and our RPM is equally low. The problem lies in the fact that Google pegs us as an education site (which we are) but shows our users ads for U of Phoenix, DeVry U and other for-profit universities. All of our users are students of higher-caliber institutions and therefore have no use for these ads. I've known about this problem for some time, but I don't know what to do to show more relevant ads instead (i.e. spring break, school apparel, poker, sports, etc.). What would be the best way to change this?

    Read the article

  • Can I include a robots meta tag outside of the head in HTML snippets intended to be SSIed?

    - by Dan
    I have a number of files on my site which are not intended for independent viewing, but rather to be AJAXed into content within the site. As independent entities they obviously don't meet HTML standards (no body, head, etc.). I would like to prevent search engines from indexing these pages, but I do not have access to /robots.txt (which would be much more ideal). My question is: could I include the following at the top of these partial HTML files and get the desired results? <meta name="robots" content="noindex, noarchive"> I guess there are two parts to this question. Will this cause any rendering issues in any browsers? Will search engines (at least Google and Bing) interpret this as intended? (An HTTP-header alternative is sketched below.)

    Read the article
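
    If the web server's per-directory config is reachable even though /robots.txt is not, a minimal sketch of an alternative that keeps markup out of the fragments entirely, assuming Apache with mod_headers; the filename pattern is an assumption. Google and Bing both document honoring the X-Robots-Tag response header the same way as the meta tag:

        # .htaccess in the directory holding the HTML fragments
        <FilesMatch "\.fragment\.html$">
            Header set X-Robots-Tag "noindex, noarchive"
        </FilesMatch>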

  • Which is better for search engines, repeated phrases or different phrases with the same meaning?

    - by George Botros
    When I'm designing an ads website I have two options: let the advertiser choose from predefined lists to create the new ad, for example a product list (T-Shirt, Shorts, Suit, ...) and a color list (Black, Red, ...); or let the advertiser write his own descriptive content for the product, for example "Amazing suit with a good price". I like the first scenario, but which is better for search engine optimization (SEO): repeated phrases, or different phrases with the same meaning? Note: assume each page will contain one or more ads.

    Read the article

  • How to filter traffic coming to a particular page from another page?

    - by BishKopt
    I've got page A linking to page B. There are also other pages linking to B. How can I see traffic that is coming to page B ONLY from page A? I can somehow do it via the Behavior flow report: Behavior > Behavior flow > [right-click on anything] > Explore traffic through here > [click the edit icon] > Define a page group > [right-click] > Group details > [dropdown] Incoming traffic. But how do I do it in normal reports? Is there any way to filter out only the traffic coming from a particular page?

    Read the article

  • Creating date-based back entries for a blog and its site registration

    - by open_sourse
    So I am showing a blog to a colleague and telling him how the author has been blogging regularly for over ten years now. My colleague tells me that anyone can register a domain name and backdate entries to, say, circa 2000. When I argued that the site registration date can easily show that the registration was done recently, he put forward two arguments: the author can claim that he moved from an old domain name which was registered many years ago and lapsed, taking the data and rebuilding it on the new site; or the author can buy an expired domain which was on the internet for many years. I am not sure whether these tactics can con someone into believing you have been a blogger for over a decade, but I do not have enough expertise in the topic to refute him. So I thought I would ask the wise community here at StackExchange. Can anyone give me some insight?

    Read the article

  • Mod Rewrite not working on my addon domain

    - by Ogugua Belonwu
    I have a WordPress website on my main domain. For the WordPress website I have this in my .htaccess file:

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php
        </IfModule>
        # END WordPress

    I just created an addon domain and wanted to use new rules for it, so I created a .htaccess file inside the addon folder, e.g. /newaddon. In that .htaccess file I have:

        Options -Indexes
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^readjob/(.*)/(.*)/(.*)/$ readjob.php?id=$1&amp;cat=$2&amp;title=$3
        </IfModule>

    The URL structure I have is this: http://www.website.com/readjob/3/jobs/web-designers-potech-integrated-services/ But it keeps telling me the link is broken. I don't know what to do; please assist (I just learnt mod rewriting today, so clarity will be highly appreciated). Thanks. (A corrected rule is sketched below.)

    Read the article
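
    A minimal sketch of a corrected addon-domain rule, with two assumptions flagged: the &amp; entities in the substitution belong in HTML output, not in .htaccess (they produce a literal "amp;" in the query string, which is the most likely reason readjob.php never sees its parameters), and per-segment patterns are safer than (.*) spanning across slashes:

        Options -Indexes
        <IfModule mod_rewrite.c>
            RewriteEngine On
            RewriteBase /
            # [^/]+ keeps each capture within one path segment;
            # use a plain & between query parameters, never &amp;
            RewriteRule ^readjob/([^/]+)/([^/]+)/([^/]+)/$ readjob.php?id=$1&cat=$2&title=$3 [L,QSA]
        </IfModule>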

  • SEO tool is telling me title, description and keywords don't exist, but they do. Where is the problem?

    - by DaveDev
    I'm using the following tool to analyse how 'optimal' a site that I'm working on is for search engines: http://tools.seobook.com/general/spider-test/ I enter the URL for the site (http://ftmsuat.moneymate.com) into the search bar, and it returns a breakdown of the contents of the page. I'm a little confused by what I see, though. According to the results, the page doesn't have a title, description or keywords, but if you check the source of the page, those elements are definitely there. So I'm wondering: which is wrong, seobook.com or my page?

    Read the article

  • Web Safe Area (optimal resolution) for web app design?

    - by M.A.X
    I'm in the process of designing a new web app, and I'm wondering for what 'Web Safe Area' I should optimize the app layout and design. By Web Safe Area I mean the actual area available to display the website in the browser, which is influenced by monitor resolution as well as by the space taken up by the browser and OS. I did some investigation and thinking on my own, but wanted to share it to see what the general opinion is. Here is what I found.

    Optimal display resolution, by source:
    - w3schools web stats seems to be the most referenced source (however, they state that these are results from their own site and are biased towards tech-savvy users)
    - http://www.w3counter.com/globalstats.php (aggregate data from something like 15,000 different sites that use their tracking services)
    - StatCounter Global Stats Display Resolution (based on aggregate data collected by StatCounter on a sample exceeding 15 billion pageviews per month, from across the StatCounter network of more than 3 million websites)
    - NetMarketShare Screen Resolutions (marketshare.hitslink.com) (a web analytics consulting firm; they get data from browsers of site visitors to their on-demand network of live-stats customers, compiled from approximately 160 million visitors per month)

    Display resolution summary: There is a bit of variation between the above sources, but in general, as of Jan 2011, 1024x768 accounts for about 20%, while ~85% of users have a higher resolution of at least 1280x768 (1280x800 is the most common of these, with 15-20% of the total web depending on the source; 1280x1024 and 1366x768 follow with 9-14% of the share). My guess would be that the higher resolutions are even more common if we filter on North America, and higher still if we filter on N. American corporate users (unfortunately I couldn't find any free geographically filtered statistics). Another point to note is that the 1024x768 desktop user population is likely lower than the aforementioned 20%, seeing as the iPad (1024x768 native display) is likely propping up those numbers (the app I'm designing is Flash-based, and Apple mobile devices don't support Flash, so iPad support isn't a concern). My recommendation would be to optimize around the 1280x768 constraint (note: 1280x768 is actually a relatively rare resolution, but I think it's a valid constraint range considering that 1366x768 is relatively common and 1280 is the most common horizontal resolution).

    Browser + OS constraints: To further add to the constraints, we have to subtract the space taken up by the browser (assuming IE, which is the most space-consuming) and the OS (assuming WinXP-Win7). Win7 has the biggest taskbar footprint, at a height of 40px (XP's and Vista's is 30px). The default IE8 view uses up 25px at the bottom of the screen for the status bar, and a further 120px at the top of the screen for the window title bar and the browser UI (assuming the default favorites toolbar is present; it would be 91px without it). Assuming no scrollbar, we also lose a total of 4px horizontally for the window outline. This means we are left with 583px of vertical space (768 - 120 - 40 - 25) and 1276px of horizontal (1280 - 4); in other words, a Web Safe Area of 1276 x 583.

    Is this a correct line of thinking? I'm really surprised that I couldn't find this type of investigation anywhere on the web. Lots of websites talk about designing for 1024x768, but that's only half the equation: there is no mention of browser/OS influences on the actual area you have to display the site/app. Any help on this would be greatly appreciated! Thanks.

    EDIT: Another caveat to my line of thinking above is that different browsers take up different amounts of pixels based on the OS they're running on. For example, under WinXP, IE8 takes up 142px at the top of the screen (instead of the aforementioned 120px for Win7) because the file menu shows up by default on XP, while in Win7 it is hidden by default. So it looks like on WinXP + IE8 the Web Safe Area would be a mere 572px tall (768 - 142 - 30 - 24 = 572).

    Read the article

  • How do I get categories for a product in Magento [closed]

    - by Joost de Valk
    I'm trying to add product_type to my Magento Google Base output based on the product's categories, but I seem to be unable to. I have the following code:

        // Get categories from product to include as product_type
        $categoryIds = $object->getCategoryIds();
        foreach ($categoryIds as $categoryId) {
            $category = Mage::getModel('catalog/category')->load($categoryId);
            $this->_setAttribute('product_type', $category->getName(), 'text');
        }

    The issue is that it returns all of the categories, not just the ones the product is in. Anyone have a solution? (A collection-based sketch follows below.)

    Read the article
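
    A minimal sketch of an alternative that fetches only the product's categories in a single collection query, assuming $object is (or wraps) a catalog/product model; if getCategoryIds() truly returns every category in the store, it is worth confirming that $object really is the product. The method names are standard Magento 1.x category-collection calls:

        // Load just the categories whose IDs are attached to the product
        $categoryIds = $object->getCategoryIds();
        if (!empty($categoryIds)) {
            $categories = Mage::getModel('catalog/category')->getCollection()
                ->addAttributeToSelect('name')   // we only need the name attribute
                ->addIdFilter($categoryIds)      // restrict to the product's own IDs
                ->addIsActiveFilter();           // skip disabled categories
            foreach ($categories as $category) {
                $this->_setAttribute('product_type', $category->getName(), 'text');
            }
        }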

  • Gaming Community CMS, with forum integration [closed]

    - by Tillman32
    Possible Duplicate: Which Content Management System (CMS) should I use? I've had a simple website that I coded myself for a while now; the site is a gaming community, and it's very forum- and news-driven. It was a HORRIBLE idea to take on coding this thing myself. Although we've used it for about a year now, we're just getting too big, and I need to streamline our work. I need writers to be able to post news, etc.; I've been doing it through code (a year ago I thought that would be a cool idea). Anyway, I've been messing with just about every CMS out there, and I'm struggling to find something that I really like. The main issues I'm facing are a good news system and good forum integration. I'm sort of picky when it comes to looks; it's a curse. Reading on here, I see a lot of people saying Drupal is best for the things I need: community interaction and forums. I think the main issues I ran into with Drupal were ease of use and themes. I am not a web designer, and I need a good theme. For an idea of what I'm looking for, go check out http://www.clgaming.net; they have forums integrated, a nice news area on the home page/news section, and nice user accounts. It looks very professional, and I doubt I'll get close to that with a free theme, but their functionality is exactly what I need. Any ideas would be greatly appreciated.

    Read the article

  • Why is Google Analytics displaying the wrong landing pages?

    - by Salman
    I see all of my pages as landing pages in Google Analytics, which cannot be true, as I did not post those pages anywhere and I don't see any traffic hitting them directly. Also, I am using virtual pageviews on a few buttons, and I see those virtual pages as landing pages too. For example: /click/request-a-quote 35000 views. 35,000 is too big a number to be ignored. Even if I ignore the virtual pageviews, I see a lot of pages reported as landing pages that I am 100% sure visitors (at least not so many) are NOT hitting directly. Any advice on how to debug this? PS: I'm using the following code:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', '<']);
        _gaq.push(['_setDomainName', 'none']);
        _gaq.push(['setLocalGifPath', '/images/_utm.gif']);
        _gaq.push(['_setAllowLinker', true]);
        _gaq.push(['_trackPageview','account/phase1']);

    (An event-based alternative for the buttons is sketched below.)

    Read the article
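
    One pattern worth checking, sketched under the assumption that the snippet above runs on every page: passing a virtual path like 'account/phase1' to _trackPageview replaces the real URL, so any session that starts with such a hit "lands" on the virtual page. (Also note that ga.js method names start with an underscore, so 'setLocalGifPath' would normally be '_setLocalGifPath'.) Recording the button clicks as events keeps them out of landing-page reports; the category/action/label names here are assumptions:

        // Real pageview for the actual URL, once per page load
        _gaq.push(['_trackPageview']);

        // Button clicks tracked as events never show up as landing pages
        document.getElementById('request-a-quote').onclick = function () {
          _gaq.push(['_trackEvent', 'button', 'click', 'request-a-quote']);
        };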

  • Where can a list of desktop web browsers be found?

    - by Sn3akyP3t3
    I have another question posted regarding the practicality of whitelisting. In this question I'm simply looking for a frequently updated list of the most-used desktop web browsers, to use as part of my whitelist. I'm not trying to target any specific OS, so please show one, show all. The list of browsers for desktops isn't exploding, but it does grow. I've only recently been made aware of browsers that have multiple rendering engines, and I'm not always on top of the text-based browsers out there either. I'm aware of the mobile browser platform, and there is an active list, used with regular expressions for identification purposes, that I will use, as well as whatever I can find for the desktop platforms.

    Read the article

  • Navigation menus and SEO

    - by Rodolfo
    I've always had my doubts about the effect of navigation menus on SEO. You know, the menus at the top that show on every page of the site, linking to the main sections and subsections. My issue is that, if not done dynamically (i.e. built after the page is loaded or something), from a search engine's point of view it probably looks like a whole bunch of links at the beginning of the page, links that probably have nothing to do with the page being analyzed; so it's probably not only confusing the engine, but also giving link 'juice' to the wrong pages or reducing its value. When I've asked SEO people about this, I usually get a "Google is smart, they'll recognize it as a menu and ignore it" response, but I'm not convinced (and the 'Google is smart' argument sounds almost like a religious discussion to me). So does it affect SEO negatively or not? Are there any official posts on this topic?

    Read the article

  • Use a subdomain on a different host

    - by Roy
    I want to accomplish something that I thought was simple. My wish is as follows: I have a domain name with hosting and a WordPress multisite (with subfolder setup) installed and running: gangleri.nl. I have another domain at another host, without hosting: monas.nl. I created a subdomain on gangleri.nl (monas.gangleri.nl), and the domain redirects to that subdomain. Now what I want is for monas.nl to act like a website in its own right, not a website in a subdomain; I would like post URLs of the form monas.nl/posttitle. I first thought to do this with the DNS settings of monas.nl. I now have a URL forward (cURL is not what I want), and I did not manage to get A records or CNAMEs to work. I also tried the .htaccess file of the WP installation in monas.gangleri.nl: 301s, rewrites and whatnot, but without success. Meanwhile, I have been reading so much that I no longer have a clue what to do. An A record doesn't sound probable, since I have no IP for the subdomain, so an A record would point to gangleri.nl rather than the subdomain. I also have no idea whether I should change something in the DNS settings of gangleri.nl, of monas.nl, of both, or somewhere else entirely. I have the feeling I've tried everything, but the more I try and read about it, the less I can get my head around it: people talk about A records to subdomains while I can only use IPs, or about CNAME settings that my host doesn't support. Could somebody tell me whether what I want is possible and, if so, take me by the hand and guide me through it? (A DNS sketch follows below.)

    Read the article
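
    A minimal sketch of the usual DNS-plus-virtual-host approach, with the assumptions flagged loudly: 203.0.113.10 is a placeholder for the real IP that gangleri.nl resolves to (a subdomain has no IP of its own; it resolves to the server's address), and the WordPress multisite must be told to answer for monas.nl, which typically means a domain-mapping feature or plugin:

        ; DNS zone for monas.nl, set at its registrar
        monas.nl.      IN A      203.0.113.10   ; placeholder: gangleri.nl's server IP
        www.monas.nl.  IN CNAME  monas.nl.

        # Apache virtual host on the gangleri.nl server
        <VirtualHost *:80>
            ServerName monas.nl
            ServerAlias www.monas.nl
            DocumentRoot /path/to/gangleri/wordpress
        </VirtualHost>

    With that in place, requests for monas.nl/posttitle arrive at the multisite directly, so no redirect to monas.gangleri.nl is involved and the URLs stay on monas.nl.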

  • Is it illegal to use content in such a way?

    - by MHZ
    I have a couple of questions about the legality of the content of some websites. I am currently working on two websites, and I would like to make sure I am not breaking any laws by using some content like this: Do I need to get a license to use images from the Internet (such as google.images.com) on my site, assuming they aren't a company logo belonging to another company? If not, am I allowed to use an image after I modify it with image-editing software? If content such as phone numbers, e-mail addresses, website addresses and text from websites can be found for free online, and I gather this information for a search-engine-based site that I am working on, offering the information on a paid basis (similar to Google, but more specialized), is that legal? Note: I am not 'copying' or redirecting business from anywhere to my site; the exact opposite, the site I am working on actually helps advertise businesses and makes it easier for customers to find them.

    Read the article

  • Multiple TOCs with MediaWiki using section headings on a single page

    - by user1704043
    I'm running my own installation of MediaWiki, which has been great! I haven't been able to find the answer to this small problem in any post, how-to, etc. Here's the setup:

        Article TOC (limited to showing only H1 and H2)
        == H1 ==
        === H2 ===
        ==== H3 ====
        ==== H3 ====

    I don't want the H3s to show up in the main table of contents, because that would make the list very long. Instead, under each H2 I would like to display another TOC listing all the H3s beneath it. From my understanding, you cannot have multiple tables of contents on a single page. I've thought about making a template for each H2 that holds the H3 links, but that seems to duplicate a lot of work and create loads of pages. I'd love a template that sucks in all the subsection names and spits them out, but I don't see how to do that. Alternatively, is there a way to enable multiple TOCs in a custom MediaWiki install that I'm missing? Even that would get closer to what I'm trying to do. (A wikitext sketch follows below.)

    Read the article
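
    A minimal sketch of a two-part workaround, hedged heavily. Limiting the depth of the automatic TOC is a core setting ($wgMaxTocLevel in LocalSettings.php); the per-section mini-TOC is then plain wikitext anchor links, since stock MediaWiki renders only one auto-generated TOC per page. The section names below are placeholders:

        # LocalSettings.php: only the first two heading levels appear in the auto TOC
        $wgMaxTocLevel = 3;

        == H1 section ==
        === H2 section ===
        ''Subsections:'' [[#First H3|First H3]] - [[#Second H3|Second H3]]
        ==== First H3 ====
        ==== Second H3 ====

    The [[#Anchor|label]] links rely on MediaWiki generating an anchor for every heading, so they stay valid as long as the H3 titles don't change; the duplication is confined to one line under each H2 rather than a separate template page per section.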

  • Does having a Google "stop word" in a domain name have less SEO benefit than not having it?

    - by Dan
    Let me explain. Let's say the keyword I want to optimize for is "green giraffes", but the domain greengiraffes.com (singular, plural, no hyphen, hyphen, etc.) is not available. I know that the search results for "green giraffes" and "about green giraffes" are essentially the same, because "about" is a stop word. Does that therefore also mean that the domain name aboutgreengiraffes.com is as good as greengiraffes.com in terms of SEO value? And are all stop words equal in that regard, or is a shorter one (such as "e" or "z") better?

    Read the article

  • Can I use a list of blog ping services for a portal?

    - by Ivanhoe123
    I'm setting up a list of ping services for a portal. It has a blog, forum, articles, restaurants, hotels and much other information, so it goes far beyond a blog. I have a list of standard ping services for WP blogs, but I do not know whether these should be used literally only for blogs. My questions are: Is it recommended to ping blog services, such as http://blogsearch.google.com/ping/RPC2, from a portal? Are there any penalties for sites that are not recognized as blogs? Is there a list of ping services for regular websites, not only blogs? Thanks!

    Read the article
