Search Results

Search found 9717 results on 389 pages for 'it pro'.

Page 168/389

  • How do I get categories for a product in Magento [closed]

    - by Joost de Valk
    I'm trying to add product_type to my Magento Google Base output based on the product's categories, but I seem to be unable to. I have the following code:

        // Get categories from product to include as product_type
        $categoryIds = $object->getCategoryIds();
        foreach ($categoryIds as $categoryId) {
            $category = Mage::getModel('catalog/category')->load($categoryId);
            $this->_setAttribute('product_type', $category->getName(), 'text');
        }

    The issue is that it returns all of the categories, not just the ones the product is in. Anyone have a solution?
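
    A minimal sketch of an alternative, assuming $object is the usual Magento 1 product model: getCategoryCollection() returns only the categories the product is actually assigned to, so iterating over that collection avoids loading categories by ID one at a time. This is an untested sketch, not the poster's code.

        // Sketch: fetch only the categories assigned to this product
        $categories = $object->getCategoryCollection()->addAttributeToSelect('name');
        foreach ($categories as $category) {
            $this->_setAttribute('product_type', $category->getName(), 'text');
        }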

    Read the article

  • use subdomain on different host

    - by Roy
    I want to accomplish something that I thought was simple. My situation is as follows: I have a domain name with hosting and a WordPress multisite (with subfolder setup) installed and running: gangleri.nl. I have another domain at another host, without hosting: monas.nl. I created a subdomain on gangleri.nl, monas.gangleri.nl, and the monas.nl domain redirects to that subdomain. What I want is for monas.nl to act like a website in its own right, not a website in a subdomain, with post URLs such as monas.nl/posttitle. I first tried to do this with the DNS settings of monas.nl. I currently have a URL forward; cURL is not what I want, and I did not manage to get A records or CNAMEs to work. I also tried the .htaccess file of the WP installation in monas.gangleri.nl, with 301s, rewrites and whatnot, but also without success. Meanwhile, I have been reading so much that I no longer have a clue what to do. An A record doesn't sound likely, since I have no IP for the subdomain, so an A record would point to gangleri.nl rather than to the subdomain. I also have no idea whether I should change something in the DNS settings of gangleri.nl, of monas.nl, of both, or somewhere else. I have the idea that I've tried everything, but the more I try and read about it, the less I can get my head around it. People talk about A records to subdomains while I can only use IPs, or about CNAME settings that my host doesn't support. Could somebody tell me whether what I want is possible and, if so, take me by the hand and guide me through it?
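
    For what it's worth, a sketch of the DNS side of the usual approach, assuming the host serving gangleri.nl can be told to answer for monas.nl as well (for example as a parked or addon domain) and that its IP address is the placeholder 203.0.113.10:

        ; hypothetical records in the monas.nl zone
        monas.nl.        3600  IN  A      203.0.113.10
        www.monas.nl.    3600  IN  CNAME  monas.nl.

    With records like these the browser keeps monas.nl in the address bar; it is then the web server at that IP, not DNS, that has to map monas.nl onto the content currently living under monas.gangleri.nl.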

    Read the article

  • Desktop Software to monitor online status of web site and web-based application

    - by pansp
    I'm basically looking for desktop-based software that can monitor my company's website and web application's online availability. I know there are a few online services, such as Uptime Robot, that do the same job, but I have been asked to find desktop-based software that runs in the system tray and notifies me of any downtime. A free program would be great. Any help would be appreciated. Thanks!

    Read the article

  • SSL on a site that calls an API via HTTPS

    - by Larry Cinnabar
    For example, I have a site, site.com. It has its own JSON API at api.site.com; the API requires authorisation and runs under HTTPS. Now I need to build a front end for some of the API's functionality, so I need to make a profile section on site.com: an authorisation form, and a user profile section with actions. All actions will be done via cURL requests to https://api.site.com. Do I have to use SSL on site.com too?
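
    For reference, a minimal sketch of the kind of server-side call described; the /login endpoint and the field names are hypothetical:

        <?php
        // Sketch: server-side request from site.com to the HTTPS API
        $ch = curl_init('https://api.site.com/login');
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
            'user' => 'alice',
            'pass' => 'secret',
        )));
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true); // verify the API's certificate
        $response = curl_exec($ch);
        curl_close($ch);

    Note that this call only secures the hop from site.com to api.site.com; whatever the user types into the authorisation form still travels from the browser to site.com, which is the part SSL on site.com itself would protect.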

    Read the article

  • How to copy or replicate a complex website to local files and then modify it

    - by Andre Chenier
    I am not good at designing the visual side of a website. I found a website that I would rate 10 out of 10, because its functionality suits my aims and it also looks very aesthetically pleasing. I know HTML, PHP, MySQL and some CSS; I don't know JS, Ajax or jQuery. So I want to replicate this website (save it completely) locally and then modify it (content, colors, icons, etc.). My aim is also to understand the parts I don't know, for example what happens when I delete one of its JS files. I saved the site in Chrome and IE, but when I opened it from my local folder I saw an ugly, non-working site. Since the page is complex, it has lots of CSS and JS files to download, and I don't want to deal with that manually. Is there an alternative, easy way to get the web page completely onto my local machine so that it also works like a charm from there? Regards
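
    One commonly used approach is to mirror the site with wget, which pulls in the CSS/JS/image dependencies and rewrites the links so the copy works from a local folder. A sketch, with a placeholder URL; anything generated server-side (PHP) or assembled by JavaScript at runtime will only be captured as the rendered output, not as source code:

        wget --mirror --convert-links --adjust-extension --page-requisites \
             --no-parent http://www.example-site.com/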

    Read the article

  • SEO for images: can I use a different (cookieless) domain?

    - by Oliver
    Hello, we want to increase the value of some of our important images by means of SEO, and we want to start serving them from a different, i.e. cookieless, domain. We want to go from http://www.example.com/images/1234.jpg to http://www.example.com/germany/bavaria/landscape.jpg, which can easily be done via URL rewriting. On the other hand, we would like to serve the image from a completely different domain, let's say http://www.examplestatic.com/germany/bavaria/landscape.jpg, to save the overhead of sending the www.example.com cookie with every image request. Somehow I feel that this is not a good idea, because putting the image on a different domain moves it away from the content. Can anyone shed some light on this problem? Naturally, I would just use a different subdomain, e.g. img.example.com, but we already use subdomains for languages and our cookies are valid for all subdomains of example.com, so this won't help. I'd really appreciate any hints. Cheers,
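
    A sketch of the rewriting step mentioned above, assuming Apache mod_rewrite and a known mapping from the descriptive path to the stored file (the mapping here is hypothetical):

        RewriteEngine On
        # serve the keyword-rich URL from the existing image file
        RewriteRule ^germany/bavaria/landscape\.jpg$ /images/1234.jpg [L]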

    Read the article

  • Mod Rewrite not working on my addon domain

    - by Ogugua Belonwu
    I have a WordPress website on my main domain. For the WordPress website I have this in my .htaccess file:

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php
        </IfModule>
        # END WordPress

    I just created an addon domain and wanted to use new rules for it, so I created a .htaccess file inside the addon folder, e.g. /newaddon. In that .htaccess file I have:

        Options -Indexes
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^readjob/(.*)/(.*)/(.*)/$ readjob.php?id=$1&cat=$2&title=$3
        </IfModule>

    The URL structure I have is this: http://www.website.com/readjob/3/jobs/web-designers-potech-integrated-services/ but it keeps telling me the link is broken. I don't know what to do and need assistance (I only learnt mod_rewrite today, so clarity will be highly appreciated). Thanks
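
    For comparison, a sketch of a slightly tighter version of that rule, assuming the addon domain's document root is the /newaddon folder and readjob.php lives there; [L] stops further rule processing, [QSA] preserves any extra query string, and the narrower patterns keep the three path segments from overlapping:

        Options -Indexes
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^readjob/([0-9]+)/([^/]+)/([^/]+)/$ readjob.php?id=$1&cat=$2&title=$3 [L,QSA]
        </IfModule>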

    Read the article

  • Does having a Google "stop word" in a domain name have less SEO benefit than not having it?

    - by Dan
    Let me explain. Let's say the keyword I want to optimize for is "green giraffes", but the domain greengiraffes.com (singular, plural, hyphenated or not, etc.) is not available. I know that the search results for "green giraffes" and "about green giraffes" are essentially the same, because "about" is a "stop word". Does that therefore also mean that the domain name "aboutgreengiraffes.com" is as good as "greengiraffes.com" in terms of SEO value? Are all stop words equal in that regard, or is a shorter one (such as "e" or "z") better?

    Read the article

  • URL hex characters in .htaccess

    - by Steve
    There is an old page with a space in the filename, and this is no longer found on the website. So I need to redirect this page to another page using a 301 redirect in .htaccess. If I place the filename directly into .htaccess (Bouquets%20%26%20Loose.html), the redirect does not work. If I escape the % sign like this (Bouquets\%20\%26\%20Loose.html), the redirect still does not work. How do I get this redirect to work in .htaccess? Thanks.
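
    For reference, a sketch of two common ways to write such a redirect (the target URL is a placeholder). Both mod_alias and mod_rewrite match against the already-decoded path, so the rule needs the literal space and ampersand rather than %20 and %26, and quoting keeps the space from splitting the argument:

        # mod_alias
        Redirect 301 "/Bouquets & Loose.html" http://www.example.com/bouquets-and-loose.html

        # mod_rewrite equivalent (per-directory .htaccess context)
        RewriteEngine On
        RewriteRule "^Bouquets & Loose\.html$" http://www.example.com/bouquets-and-loose.html [R=301,L]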

    Read the article

  • can canonical links be used to make 'duplicate' pages unique?

    - by merk
    We have a website that allows users to list items for sale. Think eBay, except we don't actually handle the sale; we just list the item and provide a way to contact the seller. Anyhow, in several cases sellers may have multiple units of an item for sale. We don't have a quantity field, so they upload each item as a separate listing (and using a quantity field is not an option). So we have a lot of pages which basically have the exact same info, where only the item # might differ. The SEO guy we've started using has said we should put a canonical link on each page, and have the canonical link point to itself. So for example, www.mysite.com/something/ would have a canonical link of href="www.mysite.com/something/". This doesn't really seem kosher to me; I thought canonical links were supposed to point to other pages. The SEO guy claims doing it this way will tell Google all these pages are indeed unique, even if they do basically have the same content. This seems a little off to me, since what's to stop a spammer from putting up a million pages and doing this as well? Can anyone tell me if the SEO guy's suggestion is valid or not? If it's not valid, do I need to figure out some way to check for duplicated items and automatically pick one of the duplicates to serve as the original, generating canonical links based on that? Thanks in advance for any help
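
    For reference, a sketch of the alternative raised in the last paragraph, where each duplicate listing points at one chosen original rather than at itself (the URLs are hypothetical):

        <!-- on www.mysite.com/item/1235/ and /item/1236/, duplicates of item 1234 -->
        <link rel="canonical" href="http://www.mysite.com/item/1234/" />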

    Read the article

  • Displaying the same page, no matter what URI

    - by jgauffin
    We have moved a web application and would like to display a message on the old IIS server. Let's say the application was at http://oldserver/appname/. How do I make sure that our moved.html is displayed to the user no matter which URI the user browses to within that virtual folder? For example:

        http://oldserver/appname/some/path.aspx  ->  should display http://oldserver/appname/moved.html
        http://oldserver/appname                 ->  should display http://oldserver/appname/moved.html
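
    A sketch of one way this is often done on IIS 7+, assuming the URL Rewrite module is installed; the rule goes in the application's web.config and excludes moved.html itself to avoid a rewrite loop:

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="ShowMovedNotice" stopProcessing="true">
                  <match url=".*" />
                  <conditions>
                    <add input="{REQUEST_URI}" pattern="moved\.html$" negate="true" />
                  </conditions>
                  <action type="Rewrite" url="/appname/moved.html" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>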

    Read the article

  • Will adding q&a help my site's rankings, and if so, what are the implications of a sub-domain for q&a rather than a path on the site? [closed]

    - by ElHaix
    Possible Duplicate: Subdomain versus subdirectory One of our web properties is doing quite well without any additional links being created on the site, and our link inventory is tightly managed - no user-generated links. To introduce a community aspect to the site, we want to implement a q&a forum. Once in place, new links will populate our link inventory with keywords that are not necessarily targeted to the site. With the q&a on a sub-domain, would that not affect the main site's rankings? What's the best approach for this?

    Read the article

  • Why is Google still not indexing my #! website?

    - by Zubair
    I have been working on a website which uses #! URLs (2minutecv.com), but even after 6 weeks of the site being up and running and conforming to the Google hash-bang guidelines stated here, you can still see that Google hasn't indexed the site yet. For example, if you use Google to search for "2MinuteCV.com benefits", it does not find this page, which is referenced from the homepage. Can anyone tell me why Google isn't indexing this website? Update: Thanks for all the help with this answer. Just to make sure I understand what is wrong: according to the answers, Google never actually indexes the pages after the JavaScript has run. I need to create a "shadow site" which Google indexes (what Google calls HTML snapshots). If I am right in thinking this, then I can pick a winner for the bounty.
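
    For reference, a sketch of how the hash-bang crawling scheme maps a #! URL onto the request the crawler actually makes; the /benefits path is hypothetical, and the server has to answer the second URL with an HTML snapshot of the rendered page:

        What the browser loads:    http://2minutecv.com/#!/benefits
        What Googlebot requests:   http://2minutecv.com/?_escaped_fragment_=/benefits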

    Read the article

  • Which CMS for photo-blog website?

    - by Gacek
    I need to add a photo blog to a site that I'm currently working on. It is a very simple site, so the blog doesn't have to be very sophisticated. What I need is: a CMS that allows me to create simple blog-like news posts with one (or more) images at the beginning and some description/comment below. Preferably, I would like to create something that works like these two sites: http://www.photoblog.com/dreamie or http://www.photoblog.pl/mending/. It must be customizable; I want to integrate its look as much as possible with the current page: http://saviorforest.tk. Preferably, it should provide some mechanism for uploading and storing images on the server. I thought about WordPress, but it seems a little bit too complicated for such a simple task. Do you know any simple and easy-to-use CMS that would work here?

    Read the article

  • Affiliate software to attract incoming customers

    - by Steve
    I am close to starting a new website for a small business which imports products from USA to Australia. The wholesaler says he will allow my client to be the sole distributor for Australia & New Zealand. I'm not sure what CMS or shopping cart software to use yet, but it will need to include an affiliate system to allow advertisers to push customers our way. Do you have any suggestions for robust, flexible affiliate software?

    Read the article

  • Weird Results in A/B Test in Google Website Optimizer

    - by Yisroel
    I set up a test in Google Website Optimizer that has 3 variations: original (A), B, and C. In order to further validate the results of the test, I added variation C as an exact copy of the original. And that's where the results get weird. Six days into the test, the best-performing variation is C; it outperforms the original by 18.4%! How is that possible? Should I now discount the results of this test entirely?

    Read the article

  • Google Analytics not working for multiple domains

    - by syalam
    I have a webapp that allows users to embed an iframe on their website. This iframe contains a Google Analytics snippet that is logging an event that captures the website the iframe is embedded on. Google Analytics isn't reporting anything, even though I am clearly embedding this iframe on numerous websites (on multiple domains as well). Does Google Analytics not allow tracking for multiple domains?

    Read the article

  • Where can I safely search domain whois without worrying about the search engine parking on the domain immediately after the search?

    - by Evan Plaice
    There are a lot of companies that provide domain whois but I've heard of a lot of people who had bad experiences where the domain was bought soon after the whois search and the price was increased dramatically. Where can I gain access to a domain whois where I don't have to worry about that happening? Update: Apparently, the official name for this practice is called Domain Front Running and some sites go as far as to create explicit policies stating that they don't do it. This is where a domain registrar or an intermediary (like a domain lookup site) mines the searches for possibly attractive domains and then either sells the data to a third-party, or goes ahead and registers the name themselves ahead of you. In one case a registrar took advantage of what's known as the "grace period" and registered every single domain users looked up through them and held on to them for 5 days before releasing them back into the pool at no cost to themselves. Source: domainwarning.com And apparently, after ICANN was notified of the practice, they wrote it off as a coincidence of random 'domain tasting'. Source: See for yourself

    Read the article

  • Is there a way I can filter traffic by page-type based upon URL structure in Google-Analytics or Google Webmaster Tools?

    - by Felix
    I have a local business directory site. I'm trying to segment my incoming traffic by page type, so that I can find out what percentage of traffic goes exclusively to zip code pages and what percentage goes to city/state-level pages. I basically want to filter by URL structure to find out what percentage of total traffic the zip code pages account for. Can Google Tag Manager help with this? Here are the two URL paths: http://www.example.com/ny/new-york/10011/ and http://www.example.com/ny/new-york
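
    A sketch of the regular expressions this URL structure suggests for a Google Analytics advanced segment or profile filter on the Page dimension, assuming the state/city/zip segments always follow the layout shown:

        # zip code pages, e.g. /ny/new-york/10011/
        ^/[a-z]{2}/[^/]+/[0-9]{5}/$

        # city/state pages, e.g. /ny/new-york
        ^/[a-z]{2}/[^/]+/?$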

    Read the article

  • Should the English website use hreflang="x-default" when it doesn't auto-redirect to the user's language or country?

    - by Noam
    For each URL on my site, I auto-redirect according to the Accept-Language header. The site architecture is: English version at http://mydomain.com/page, Spanish version at http://es.mydomain.com/page, etc. The English version is displayed unless the header specifies a language other than en that I support, in which case a redirect occurs. Google says this: "For language/country selectors or auto-redirecting homepages, you should add an annotation for the hreflang value "x-default" as well." My pages aren't language selectors, nor are they the homepage, but I am auto-redirecting. My question is: should my English version be hreflang="x-default" and/or hreflang="en"?
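
    For reference, a sketch of the annotations in question as they would appear on the English page; whether the x-default line should also point at the English URL is exactly what is being asked:

        <link rel="alternate" hreflang="en" href="http://mydomain.com/page" />
        <link rel="alternate" hreflang="es" href="http://es.mydomain.com/page" />
        <link rel="alternate" hreflang="x-default" href="http://mydomain.com/page" />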

    Read the article

  • SEO tool is telling me title, description and keywords don't exist, but they do. Where is the problem?

    - by DaveDev
    I'm using the following tool to analyse how 'optimal' a site that I'm working on is for search engines: http://tools.seobook.com/general/spider-test/ I enter the URL for the site - http://ftmsuat.moneymate.com - into the search bar, and it returns a breakdown of the contents of the page. I'm a little confused by what I see though. According to the results, the page doesn't have a title, description or keywords. But if you check the source of the page, those elements are definitely there. So I'm wondering now, which is wrong? seobook.com or my page?

    Read the article

  • Which of these URL scenarios is best for big link menus? [seo /user friendly urls]

    - by Sam
    Hi folks, a question about URLs. A good friend and I are exploring the possibilities of one of three scenarios for a website where each page has a menu system with about 130 links:

        SCENARIO 1: the menu has SHORT non-descriptive hyperlinks and the page has a SHORT canonical URL:
            <a href="design">dutch design</a> - the page's canonical URL points to e.g. "design"

        SCENARIO 2: the menu has SHORT non-descriptive hyperlinks but the page has a LONG canonical URL:
            <a href="design">dutch design</a> - the page's canonical URL points to "dutch-design-crazy-yes-but-always-honest"

        SCENARIO 3: the menu has LONG descriptive hyperlinks and the page has a LONG canonical URL:
            <a href="dutch-design-crazy-yes-but-always-honest">dutch design</a> - the page's canonical URL points to "dutch-design-crazy-yes-but-always-honest"

    Currently we have scenario 2; should we progress to scenario 3? All three work fine and point via mod_rewrite to the same page, which is fetched behind the scenes. Now, my question is which of these is best in terms of: user-friendliness (page loading times, whether the full URL is visible in the URL bar), SEO-friendliness (proper indexing because the URLs contain descriptive, relevant tags), and other concerns we may have forgotten, such as possible penalties for so many words in link hrefs. Thanks very much for your suggestions; much appreciated!

    Read the article

  • Gzip compress offline?

    - by shoosh
    I've configured my site to serve compressed content by putting this line in .htaccess:

        AddOutputFilterByType DEFLATE text/html text/plain text/xml text/javascript text/css application/javascript application/json

    This works perfectly for almost all files except a few large JSON files that are above 200 KB. For some reason they are not being compressed; I can see that they aren't by using the Net tab in Firebug and the Network section in Chrome. So as a workaround I thought I could compress these files offline and have Apache serve them pre-compressed. What tool should I use to compress them? Is the Linux gzip the right one? Are there any special flags I should use? And what should I put in .htaccess so that the server knows to serve these files with Content-Encoding: gzip?
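
    A sketch of the offline approach, assuming mod_rewrite and mod_headers are available and using placeholder file paths. Plain gzip is enough for the compression step:

        # compress once on the command line (-9 = maximum compression, -c leaves the original in place)
        gzip -9 -c data/large-file.json > data/large-file.json.gz

        # .htaccess: hand the pre-compressed copy to clients that accept gzip
        RewriteEngine On
        RewriteCond %{HTTP:Accept-Encoding} gzip
        RewriteCond %{REQUEST_FILENAME}.gz -f
        RewriteRule ^(.*)\.json$ $1.json.gz [L]

        <FilesMatch "\.json\.gz$">
            ForceType application/json
            Header set Content-Encoding gzip
            Header append Vary Accept-Encoding
        </FilesMatch>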

    Read the article

  • Can I include a robots meta tag outside of the head in HTML snippets intended to be SSIed?

    - by Dan
    I have a number of files in my site which are not intended for independent viewing, but rather to be AJAXed into content within the site. They obviously don't meet HTML standards (no body, head, etc.) as independent entities. I would like to prevent search engines from indexing these pages, but do not have access to /robots.txt (which would be much more ideal). My question is, could I include the following at the top of these partial HTML files and get the desired results? <meta name="robots" content="noindex, noarchive"> I guess there are two parts to this question. Will this cause any rendering issues in any browsers? Will search engines (at least Google & Bing) interpret this as intended?
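
    A sketch of a server-side alternative that keeps markup out of the fragments entirely: send the equivalent X-Robots-Tag response header from .htaccess (requires mod_headers and assumes .htaccess is available even though robots.txt isn't; the file-name pattern is hypothetical):

        <FilesMatch "^fragment-.*\.html$">
            Header set X-Robots-Tag "noindex, noarchive"
        </FilesMatch>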

    Read the article

  • How long should I keep 301 redirecting pages from a deprecated domain?

    - by ElHaix
    I had an old domain that I have deprecated, with all requests to it 301 redirected to my new site. The new site is now receiving a decent amount of traffic, but I don't know how much of it is 301 redirected from the old site, and doing a site:[old site] search still shows several thousand pages indexed. Since all pages from the old site are 301 redirected, will they ever be removed from the index as long as the old domain name is active? As a rule of thumb, I picked up somewhere that 90 days covers any significant site changes. When is it safe to burn the old domain?

    Read the article
