Search Results

Search found 9728 results on 390 pages for 'zee pro'.


  • Multiple URLs going to the same page - Kosher for Google?

    - by Ashoka15
    I hear conflicting answers from people about this; I'm a developer by trade, and my SEO knowledge is not what it should be. Here's my situation: I run a website that lists hotels, restaurants, bars, shops, etc. for a small Asian beach town. Lots of establishments here are hotels with a restaurant and bar, as well as restaurants that are also bars. As an example, a Mexican restaurant that also functions as a full cocktail bar. I first set it up so each establishment has one page, but can create multiple pages based on their other areas of business. This forces people to create TWO listings under the same name, and most just add the exact same information onto each page, making things redundant. I am re-arranging the database so that an establishment has only ONE listing (one unique page referenced by the unique code '12345ABCDEF') that is accessible from browsing under "Restaurants" and "Bars", and has the URL structures:

        site.com/dining/mexican/12345ABCDEF/business-name.html
        site.com/bars/cocktail_bars/12345ABCDEF/business-name.html

    I could easily simplify the URL to just the unique code and name:

        site.com/12345ABCDEF/business-name.html

    But I found that Google has parsed my URL structure and lists it like this on the SERP: Home > Dining > Mexican, with each link pointing to the default page for the homepage, restaurants and Mexican restaurants. If I simplify the URL structure, will I lose these associations? Could Google also be picking up this structure from my breadcrumb trail at the top of the page? What is the best way to set up URLs on these pages so I am not penalized by Google for having identical information on two URLs, while still being able to have places show up as they did with the old system?
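
    One common approach to the two-URLs-one-listing situation described above is a canonical hint, so both browse paths stay usable while Google is told which one to index. A minimal sketch, assuming the dining URL is chosen as the primary one (the paths are the ones from the question):

        <link rel="canonical"
              href="http://site.com/dining/mexican/12345ABCDEF/business-name.html">

    Placed in the <head> of both pages, this consolidates the duplicate-content signals onto a single URL without removing the second browse path.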

    Read the article

  • Google Chrome: HTTP authentication issue on iframe

    - by Daniel Dzussa
    I have an HTML file with two links (http://versionplus.in/pass/new.html); both links load their content inside an iframe. I have two password-protected directories: one on the same server and one on another server. Clicking either link pops up the login box, and that works for both links in all browsers except Google Chrome. Chrome doesn't show the login box for the protected folder on the other server. How can I fix this?

    Read the article

  • Suggestions for a Self-serv advertising service

    - by Mystere Man
    I am seeking a self-serve advertising service for my websites, but I have a few restrictions that seem to make what I'm looking for hard to find. Specifically, I want to place "advertise here" links on my pages and allow end users to purchase advertising on that site, page, and location. These ads will not be part of a national network. Requirements:

    - Supports multi-tenancy. I have a number of domains using the same "web application" but with customized content per domain. When a customer wants to advertise on a given domain, the ads will only appear on that domain and on that page of the domain (even though the page name may be the same across multiple domains).
    - Supports fixed ad prices, not just CPC. I need monthly and quarterly pricing regardless of performance.
    - Integrates with OpenX and other ad networks, so that if there is no self-serve ad in a given zone, it falls back to national or direct advertising.

    Shiny Ads has much of this, but I'm looking for alternatives, as their prices are a bit crazy (20%) and they can only do PayPal.

    Read the article

  • How do I stop Google indexing my main page as https [duplicate]

    - by user2897488
    This question already has an answer here: "https:// search results appearing on Google for purely http:// site" (2 answers). Due to historic reasons, we have things set up so that "www.mydomain.com" redirects to "store.mydomain.com". This has worked perfectly fine until recently, when Google appears to be sending visitors to "https://www.mydomain.com", which doesn't have an SSL certificate (and never has). Strangely, it's only the first link that goes to "https://www.mydomain.com"; all other links point correctly to "http://store.mydomain.com". Because there is no certificate on the "www" version, users are getting an error message. How do I make Google revert to pointing the main link at "http://store.mydomain.com" (or even "http://www.mydomain.com")? If I remove "https://www.mydomain.com" from Google Webmaster Tools, will this also remove the redirected page ("http://store.mydomain.com")? Thanks.
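
    For reference, the host-level 301 described above usually looks something like the sketch below (a guess at the existing rule, assuming Apache with mod_rewrite; the hostnames are the ones from the question). Note that a rule like this can never fire for the https:// variant, because without a valid certificate the TLS handshake fails before Apache gets to evaluate any rewrites.

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^www\.mydomain\.com$ [NC]
        RewriteRule ^(.*)$ http://store.mydomain.com/$1 [R=301,L]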

    Read the article

  • Google Analytics: How long does it take users to trigger an event

    - by Stephen Ostermiller
    I implemented Google Analytics event tracking on my currency conversion website. The typical user flow is:

    1. The user lands on a page about two currencies.
    2. The user enters an amount to be converted.
    3. The site shows the user the value in the other currency.
    4. The JavaScript sends Google Analytics a "converted" event when the currency conversion is done.

    Because most of the sessions on my site are single-page, the event tracking is very important for knowing whether users find my page useful. I'm looking for a way to figure out how long it typically takes users to enter a value in the form. I expect that this data would form a bell curve peaking at some specific amount of time after page load. If I can't get a graph, I could make do with a median value. I would like to use this as a core metric for usability testing. Is there a way to get this information out of Google Analytics?
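
    One way to get that distribution is to measure the delay in the same handler that fires the event and report it as a user timing. A minimal sketch, assuming analytics.js; the category and variable names are illustrative:

        var pageLoaded = Date.now();

        function onConverted() {
          var elapsedMs = Date.now() - pageLoaded;
          // the existing conversion event
          ga('send', 'event', 'Currency', 'converted');
          // the elapsed time shows up under Behavior > Site Speed > User Timings,
          // which reports a distribution rather than just an average
          ga('send', 'timing', 'Currency', 'time-to-convert', elapsedMs);
        }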

    Read the article

  • Getting the masked URL values in MediaWiki

    - by Kalai
    I have successfully masked the URL in MediaWiki using the following rules in the .htaccess and LocalSettings.php files:

    .htaccess:

        Options +FollowSymLinks
        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)/(.*)$ /mediawiki/index.php?title=$1&actions=$2 [L]

    LocalSettings.php:

        $wgScriptPath = "/lib/mediawiki";
        $wgArticlePath = "/lib/mediawiki/$1/$2";

    It works fine with the required URL. My problem is that I want to treat the second parameter as a query string for my pages, but I cannot get at it in my code. $wgRequest only gives me the first parameter, as the title. I tried $_REQUEST as well; it sometimes gives the value of $_REQUEST['actions'], but often not. I can't understand what the problem is.
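
    For what it's worth, a sketch of reading the extra parameter through MediaWiki's own request object rather than $_REQUEST (assuming the rewrite above really does pass it as 'actions'; whether it arrives depends on the rule only firing when no real file or directory matches):

        // inside a hook or extension function, after MediaWiki has set up its globals
        global $wgRequest;
        $action = $wgRequest->getVal( 'actions', '' ); // '' when the parameter is absent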

    Read the article

  • Make Apache encode or replace quotes instead of escaping them?

    - by mplungjan
    In the documentation I read:

        Format Notes
        For security reasons, starting with version 2.0.46, non-printable and other special characters in %r, %i and %o are escaped using \xhh sequences, where hh stands for the hexadecimal representation of the raw byte. Exceptions from this rule are " and \, which are escaped by prepending a backslash, and all whitespace characters, which are written in their C-style notation (\n, \t, etc). In versions prior to 2.0.46, no escaping was performed on these strings so you had to be quite careful when dealing with raw log files.

    This is a problem for Analog, which is still the handiest analyser I use. I get

        .... "GET /somerequest?q=\"quoted string\"&someparm=bla"

    in the logfile, and it is of course flagged as corrupt, since Analog expects

        .... "GET /somerequest?q=%22quoted string%22&someparm=bla"

    or similar. I realise I can pre-process using something like

        perl -p -i.bak -e 's/\\"/%22/g' logfile

    but I'd rather not have to add this step for these files, which are 50-90 MB zipped per day. Thanks for any pointers.
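
    If the pre-processing step turns out to be unavoidable, it can at least be run as a read-only pass over the compressed files, leaving the originals untouched. A hypothetical sketch built around the same perl substitution (the log path and naming pattern are placeholders):

        for f in /var/log/apache/access_log-*.gz; do
            zcat "$f" | perl -pe 's/\\"/%22/g' > "${f%.gz}.analog"
        done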

    Read the article

  • Working with different URL structures

    - by Dane411
    As I'm quite a newbie to this field I have some doubts, including some I couldn't resolve with Google. For example:

    1. If I'm not wrong, index.html makes it possible to avoid adding the filename to the URL, so www.example.com/ is equal to www.example.com/index.html. And that works for the following subdirectories too, right? e.g. www.example.com/music/
    2. Is there any other way to achieve this without using an index.html file? (I've read something about converting dynamic URLs to static: ./?var1=value1&varN=valueN -> ./value1/valueN)
    3. How can I convert www.example.com/music/ to music.example.com/, and why should it be used?

    Thanks in advance!
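
    On the first and third points, both behaviours usually come from the web server configuration rather than from the pages themselves. A minimal Apache sketch, assuming access to the server config or a vhost file and DNS pointing music.example.com at the same machine (the paths are placeholders):

        # serves /music/index.html (or index.php) when /music/ is requested
        DirectoryIndex index.html index.php

        # gives the music section its own host name
        <VirtualHost *:80>
            ServerName music.example.com
            DocumentRoot /var/www/example.com/music
        </VirtualHost>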

    Read the article

  • Keeping files private on the internet (.htaccess password or software/php/wordpress password)

    - by jiewmeng
    I was asked a while ago to set up a server such that only authenticated users can access files. It was like a test server for clients to view WIP sites. More recently, I want to do something similar for some of my files. Though they are not very confidential, I wish to be the only one viewing them. I thought of doing the same thing:

    Create a robots.txt:

        User-agent: *
        Disallow: /

    Set up some password protection. .htpasswd seems like a very ugly way to do it; it will prompt me even when I log into FTP. I wonder if a software method like password-protected posts in WordPress would do the trick of locking out the public and hiding content from search engines? Or would some self-made PHP script do the trick?
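
    A self-made PHP gate can be as small as a session check in front of whatever renders the files. A rough sketch, assuming the protected pages are served through this script; the password is a placeholder:

        <?php
        session_start();

        $secret = 'change-me'; // placeholder

        if (isset($_POST['pass']) && hash_equals($secret, $_POST['pass'])) {
            $_SESSION['authed'] = true;
        }

        if (empty($_SESSION['authed'])) {
            // not logged in yet: show a bare login form and stop
            echo '<form method="post"><input type="password" name="pass"><button>View</button></form>';
            exit;
        }

        // ... private content below this point ...

    Unlike .htpasswd, this only challenges HTTP visitors, and combined with the robots.txt above it keeps the pages out of search engines that honour it.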

    Read the article

  • Lakeside Sunset in the Mountains [Wallpaper]

    - by Asian Angel
    Note: Original image size is 5808*3786 pixels and 13.76 MB when downloaded. Sunset-HDR [DesktopNexus]

    Read the article

  • What does Enable/Disable mean in Bing's URL Normalization feature?

    - by DisgruntledGoat
    I'm in Bing Webmaster Tools, under Index URL Normalization. Many parameters are listed in the table with 3 other columns: Status, Source, Date. The "Source" column says "Webmaster" where I have added parameters, and "Bing" where I assume the parameter has been auto-detected. "Date" is probably the last date it detected the parameter. I've tried searching the help files but I can't find what the Status column means. The top of the page says: This feature allows you to specify query parameters for Bing’s crawler to ignore. But it's not clear whether "Enable" or "Disable" is related to this, and if so what happens in each case. Does anyone know?

    Read the article

  • Are there any web-sites out there that block IE altogether?

    - by Šime Vidas
    Since IE8 is such a backward browser, I was wondering if there are any web-sites on the Internet that just don't support IE at all (and block it via conditional comments, for instance)? I remember stumbling upon web-sites that blocked Firefox in the past (around 2004). The justification for blocking IE is (obviously): you don't want to deal with IE bugs, and you don't want to have to maintain IE-specific hacks and workarounds.
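
    For reference, the conditional-comment approach mentioned above only needs a few lines in the page head, since IE (up to version 9) is the only browser that parses them. A sketch; the target URL is a placeholder:

        <!--[if IE]>
          <meta http-equiv="refresh" content="0; url=/ie-not-supported.html">
        <![endif]-->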

    Read the article

  • Search engine friendly, SEO blog software

    - by Steve
    Is there a comparison of the SEO capabilities of different blogging software/blogging plugins? I'd like things to be as optimised as possible. I have a basic grasp of SEO principles, probably 12-24 months old. I'm about to start a blog, after having a few previously. Also, I'm not up to speed on what pings are in the blogging world. What are they, and how do they work? I assume it is best to have blogging software that automatically pings.
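
    On the ping question: a "ping" is a small XML-RPC call that blog software sends to update services (Ping-O-Matic, blog indexes, etc.) whenever a post is published, so they can fetch the new content quickly. The standard weblogUpdates.ping payload looks roughly like this (the blog name and URL are placeholders):

        <?xml version="1.0"?>
        <methodCall>
          <methodName>weblogUpdates.ping</methodName>
          <params>
            <param><value>My Blog</value></param>
            <param><value>http://blog.example.com/</value></param>
          </params>
        </methodCall>

    Most blogging software (WordPress included) sends these automatically to a configurable list of ping servers.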

    Read the article

  • How can I access profile fields with a % variable in Drupal Actions?

    - by Rob Mosher
    I have an action set up in Drupal to e-mail me when a new user registers for the site. Right now it only tells me their username (%username). Is there a variable that can access added fields so I can get their real name (First Last), or another way to add this info to the action message? So instead of my new-user action having a message like "%username created an account" ("jschmoe created an account"), I could have "%first_name %last_name (%username) created an account" ("Joe Schmoe (jschmoe) created an account"). I'm using the Content Profile module for the first and last name fields, though I have few enough users at the moment that I could switch to Profile module fields.

    Read the article

  • How should I track multi-valued page attributes (e.g. tags) using custom variables?

    - by Simon
    Our pages can each have many tags, e.g. 'football', 'sms', 'nsfw', etc., which we would like to track in Google Analytics. We're already tracking things like category using Google Analytics custom variables; we've used three of the five available slots so far. How can we track tags the same way? If we just mush them all together - e.g. 'football, sms, nsfw' - can we still report on the pages tagged 'football'? What's the right way to track multi-valued page attributes using custom variables?
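
    One workable pattern with classic ga.js is to join the tags with a delimiter into a single page-scoped custom variable and then use "contains" filters in reports. A minimal sketch, assuming slot 4 is still free (scope 3 means page-level); the names are illustrative:

        var tags = ['football', 'sms', 'nsfw'];
        _gaq.push(['_setCustomVar', 4, 'tags', tags.sort().join('|'), 3]);
        _gaq.push(['_trackPageview']);

    The combined name+value of a custom variable is limited (128 characters in total), so very long tag lists would need trimming or a second slot.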

    Read the article

  • PageRank 0 penalty

    - by mark
    I have had a WordPress blog and a www website on the same domain for about one year; together they are about 170 pages. The PageRank is still 0. I understand that PageRank 0 can be a penalty for duplicate content. The pages are indexed in Google, but there is still no PageRank. In Google Webmaster Tools there is no indication of any problem. I asked for reconsideration of both the blog and the website a month ago; Google accepted the reconsideration requests, but it did not change anything. Other sites of similar size and similar audience earn PR 4-6. Is there something I can do in order to get a fair PageRank? A coworker told me that it might be the case that a link farm is using the content and I can do nothing about it. Is there a reliable way to check for something like that? I do not like to give up so quickly - is there a chance to fix this, for example by moving to another domain?

    Read the article

  • Recommended flexible website solution?

    - by Omega
    My site has a MyBB forums installation, and that is pretty much all I need: forums. However, I need a homepage and a couple of other static pages for showing relevant information, links, etc. I don't need something fancy; all I need is something very flexible regarding theme and style editing, and just a couple of simple modules, like public polls. That's all. I am very graphical, and I am looking for something that lets me edit pretty much every aspect of the site. These are mainly static pages, so I don't need something very complex. Some people tell me to use Dreamweaver, but quite honestly, that is not what I am looking for, even though it does offer a lot of flexibility. I want something like, you know, Drupal or... some other simple, deeply editable (in terms of graphics and style) web platform. What would you recommend to me? Thank you.

    Read the article

  • In Linux, which tools are free to use to make Web site mockups?

    - by user11173
    I am using Ubuntu/Fedora. Which available mock-up builders can I use before making a website?

    Follow-up: "Adobe AIR for Linux is no longer supported. To access older, unsupported versions, please read the AIR archive." Different operating system? Downloaded http://www.balsamiq.com/download - direct links for Mockups for Desktop:

        Cross-platform: MockupsForDesktop.air
        Windows: MockupsForDesktop.exe
        Mac OS X: MockupsForDesktop.dmg
        Linux 32-bit: MockupsForDesktop32bit.deb
        Linux 64-bit: MockupsForDesktop64bit.deb
        Windows with Adobe AIR bundled: MockupsForDesktopInstallerWin.zip (for offline installations)

    Read the article

  • problem with template: negative margins on float

    - by Fuxi
    I'm having a very strange problem with a WordPress template. I'd like to place two divs beside each other like this:

        <div style='float:left;'>
          left div
        </div>
        <div style='float:right'>
          right div
        </div>

    Normally this works as it should - both divs should sit directly next to each other - but something in the style.css (which uses a CSS reset) causes the right div to overlap the left div by about 5 pixels. I searched the whole .css for it but couldn't find the cause; it must be something in the default CSS. Does anyone know what is causing this, and is there some fix? Thanks.
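
    When a theme's reset is the suspect, one way to rule it out is to scope the two columns explicitly so inherited margins, padding and box sizing can't shift them. A sketch, assuming the two divs can be wrapped in a container; the class names are illustrative:

        .two-col { overflow: hidden; }                    /* contains the floats */
        .two-col > div { margin: 0; padding: 0; box-sizing: border-box; }
        .two-col .left  { float: left;  width: 50%; }
        .two-col .right { float: right; width: 50%; }

    If the overlap persists, inspecting the computed styles of both divs in the browser's dev tools usually points at the exact rule in style.css responsible.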

    Read the article

  • Managing accounts on a private website for a real-life community

    - by Smudge
    I'm looking at setting up a walled-in website for a real-life community of people, and I was wondering if anyone has any experience with managing member accounts for this kind of thing. Some conditions that must be met:

    - This community has a set list of real-life members, each of whom would be eligible for one account on the website. We don't expect or require that they all sign up. It is purely opt-in, but we anticipate that many of them would be interested in the services we are setting up.
    - Some of the community members' emails are known, but some of them have fallen off the grid over the years, so ideally there would be a way for them to get back in touch with us through the public-facing side of the site. (And we'd want to manually verify the identity of anyone who does so.)
    - Their names are known, and for similar projects in the past we have assigned usernames derived from their real-life names. This time, however, we are open to other approaches, such as letting them specify their own username or getting rid of usernames entirely.

    The specific web technology we will use (e.g. Drupal, Joomla, etc.) is not really our concern right now -- I am more interested in how this can be approached in the abstract. Our database already includes the full member roster, so we can email many of them generated links to a page where they can create an account. (And internally we can require that these accounts be paired with a known member.) Should we have them specify their own usernames, or are we fine letting them use their registered email address to log in? Are there any paradigms for walled-in community portals that help address security issues if, for example, one of their email accounts is compromised? We don't anticipate attempted break-ins being much of a threat, because nothing about this community is high-profile, but we do want to address security concerns. In addition, we want to make the sign-up process as painless for the members as possible, especially given that we can't just make sign-ups open to anyone. I'm interested to hear your thoughts and suggestions! Thanks!

    Read the article

  • PHP Image gallery that integrates well into custom CMS

    - by Thorarin
    I've been trying to find an image gallery that plays nice with our custom CMS. I've evaluated a number of them, but none of them seems to have the feature list that I would like:

    - Runs in a LAMP environment
    - Free software or low license costs (the website belongs to a non-profit organisation)
    - Multi-user support
    - Multiple albums. We're posting concert pictures, and would like an album per event.
    - Pluggable authentication system. I want to reuse the accounts we have for our CMS. Permissions can be done inside the gallery itself, but I want a single sign-on solution in a maintainable manner, by writing my own plugin/add-on for the software.
    - Upload support (multiple images at the same time)

    And preferably also:

    - Can be integrated into a PHP page layout without IFRAMEs
    - Automatic resizing of uploaded images to a maximum size
    - Ability for visitors to place comments

    This combination is proving hard to find, especially the authentication requirement. I don't want to mess around all over the place in the source code to make it use the existing authentication. A plugin would be ideal, but alternatively a well-thought-out software design that allows for maintainable surgical changes would be acceptable. Any suggestions on which software I should take a closer look at?

    Read the article

  • Does SEO optimisation count on the responsive side of a site?

    - by Rick Donohoe
    I'm looking at making some SEO fixes, and at this point I'm sorting out the heading structure and keywords - H1s, H2s, etc. We have a site where there are a number of similar blocks, and one is always visible while another is hidden, depending on the screen size. This is our method of making a single site responsive. Firstly, how does this technique affect SEO, and in general does the responsive side of a site matter at all to search engines? What I mean is: if the site shows different content depending on screen size, which content will the search spider crawl?
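
    To make the mechanism concrete: the show/hide technique described above usually keeps both blocks in the HTML and only toggles them with CSS, so a crawler fetching the page sees both; it is the stylesheet, not the markup, that differs per screen size. A sketch, assuming a 768px breakpoint and illustrative class names:

        .block-wide   { display: block; }
        .block-narrow { display: none; }

        @media (max-width: 768px) {
          .block-wide   { display: none; }
          .block-narrow { display: block; }
        }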

    Read the article

  • Is it a good idea to add robots "noindex" meta tags to deep low content pages, e.g. product model data

    - by Cognize
    I'm considering adding robots "noindex, follow" tags to the very numerous product data pages that are linked from the product style pages in our online store. For example, each product style has a page with full text content on the product:

        http://www.shop.example/Product/Category/Style/SOME-STYLE-CODE

    Then many data pages with technical data for each model code are linked from the product style page:

        http://www.shop.example/Product/Category/Style/SOME-STYLE-CODE-1
        http://www.shop.example/Product/Category/Style/SOME-STYLE-CODE-2
        http://www.shop.example/Product/Category/Style/SOME-STYLE-CODE-3

    It is these technical data pages that I intend to add the noindex tag to, as I imagine this might stop them from cannibalizing keyword authority from more important content-rich pages on the site. Any advice appreciated.
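
    For reference, the tag in question is a one-liner in the <head> of each model-data page; "follow" keeps the links on those pages passing signals even though the pages themselves are dropped from the index:

        <meta name="robots" content="noindex, follow">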

    Read the article

  • Recommendations for a network of student-related content

    - by Javier Marín
    I am running a network of websites with notes, homework, essays, etc., where users share their own content. I'm having real trouble with the latest Google updates (Penguin, Panda, etc.) because the content is mainly poor quality and on the same topics. For that reason, I want to create more websites and have a better chance of appearing in the SERPs. My question is: does Google analyze related websites in order to exclude them from the results? I've thought about distributing the websites around the world, on different hosts, but I'm afraid that Google would link them through their Analytics, Webmaster Tools or AdSense accounts. Is that possible? What other recommendations do you have?

    Read the article

  • 301 re-direct all external links to new domain

    - by Dean Legg
    I have changed the main domain to a sub-domain and would like to redirect all external links to the new sub-domain. I have read a few articles but am having no luck editing the .htaccess, as the change might be interfering with the rules already in there.

    Old: www.example.co.uk
    New: https://secure.example.co.uk

    The current rules are quite handy because they seem to have sorted out the structure for all internal links. They have even updated the file path for images (or this could just be WordPress, as the URL was updated under General Settings). This is the current .htaccess:

        <files wp-config.php>
        order allow,deny
        deny from all
        </files>

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>
        # END WordPress
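
    A common way to handle this is a host-based 301 placed above the WordPress block, so requests for the old hostname are redirected before WordPress's own rules run. A sketch, assuming mod_rewrite and the hostnames from the question (testing with [R=302] first avoids caching the redirect while checking it):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^www\.example\.co\.uk$ [NC]
        RewriteRule ^(.*)$ https://secure.example.co.uk/$1 [R=301,L]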

    Read the article
