Search Results

Search found 11896 results on 476 pages for 'smart pro'.

Page 108/476 | < Previous Page | 104 105 106 107 108 109 110 111 112 113 114 115  | Next Page >

  • 301 url rewrite loop

    - by anyvendetta
    I need a 301 rewrite that forces all URLs to lowercase. In .htaccess I put (with RewriteMap lc int:tolower defined in httpd.conf):

        RewriteCond %{REQUEST_URI} [A-Z]
        RewriteRule . ${lc:%{REQUEST_URI}} [R=301,L]

    Everything works just fine except for URLs with subcategories, which in this case look like /category-1256-Product-page-example.html, where the number 1256 refers to a "subcategory". When I try to access /category-1256-Product-page-example.html I get a redirect-loop error message. I think other redirect rules are causing the loop, but I don't know how to fix it, because these are the only rewrite rules that don't work with the rewrite above:

        RewriteRule ^main-site-url/category-([0-9]*)-([-_a-zA-Z0-9]*)\.html$ /subcategories.php?idcategory_main=1&idcategory=$1&category=$2 [L]
        RewriteRule ^main-site-url/([0-9]*)-([-_a-zA-Z0-9]*)-([0-9]*)\.html$ /file.php?idcategory_main=1&idsubcategory=$1&product=$2&idproduct=$3 [L]
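
    One pattern worth trying, assuming all of these rules live in the same configuration: let the lowercase 301 fire only on the original client request, so it cannot re-match after the internal rewrites to subcategories.php/file.php have run. A hedged sketch (RewriteMap itself still has to live in httpd.conf or a vhost, never in .htaccess):

        # Redirect to lowercase only for the request the browser actually sent,
        # not for URLs produced by the internal rewrites below.
        RewriteCond %{ENV:REDIRECT_STATUS} ^$
        RewriteCond %{REQUEST_URI} [A-Z]
        RewriteRule . ${lc:%{REQUEST_URI}} [R=301,L]

        RewriteRule ^main-site-url/category-([0-9]*)-([-_a-zA-Z0-9]*)\.html$ /subcategories.php?idcategory_main=1&idcategory=$1&category=$2 [L]
        RewriteRule ^main-site-url/([0-9]*)-([-_a-zA-Z0-9]*)-([0-9]*)\.html$ /file.php?idcategory_main=1&idsubcategory=$1&product=$2&idproduct=$3 [L]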

    Read the article

  • Gracefully terminate a request-based service on server

    - by Jatin
    In our web application, each HTTP request triggers a lot of computation on the back end. The output can take anywhere from 10 seconds to 1 hour. While it is being computed, "Waiting..." is shown on the website for that user. But it can happen that a user abandons the request partway through. So what can be done on the back end so that the computation can be stopped midway to save resources? What tactics can be applied here? Ideally, instead of killing the thread outright, a graceful termination policy would work wonders.
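
    The usual pattern for this is cooperative cancellation: the worker checks a shared flag between units of work and tears itself down cleanly when the flag is set. A minimal Python sketch, with the helper names invented for illustration:

        import threading

        def long_computation(cancel: threading.Event):
            """Hypothetical back-end job that polls a cancellation flag between steps."""
            partial = []
            for step in range(3600):            # up to an hour of small work units
                if cancel.is_set():             # the user gave up / connection dropped
                    release_resources(partial)  # roll back, close handles, log, etc.
                    return None
                partial.append(do_step(step))   # one short, interruptible unit of work
            return partial

        def do_step(i):                  # placeholder for the real computation
            return i * i

        def release_resources(partial):  # placeholder teardown
            partial.clear()

        # The request handler keeps a reference to `cancel` and calls cancel.set()
        # when it detects the client has gone away (broken socket, heartbeat timeout).

    The key point is that cancellation is checked by the computation itself; forcibly killing the thread from outside is exactly what the graceful policy avoids.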

    Read the article

  • Highly SEO optimised forum posts

    - by Tom Gullen
    Given the following forum post:

        Basics of how internals of Construct work
        I've used GameMaker in the past. And I know some C++ and have used a few 3d engines with it. I have also looked at Unity, though I didn't get too much into it. So I know my way around programming etc... My question is, how does construct work internally? I know it allows python scripting, which itself is "technically" interpreted, though python is pretty fast as far as being interpreted goes. But what about the rest? Is the executable that gets cre...

    The forum software will take the first 150 chars of the first post as the page meta description, and the title will be the thread title. All OK so far. So in Google it will appear as:

        Basics of how internals of Construct work
        I've used GameMaker in the past. And I know some C++ and have used a few 3d engines with it. I have also looked at Unity, though I didn't get too much...
        http://www.domain.com/forum/basics-of-how-internals-of-construct-work.html

    Now the problem (not so much with this thread, but with other ones) is that the first 150 chars don't always make the best meta description. Is it worth my time to cherry-pick threads and manually set their description/title tags so they read like:

        Internal workings of Construct 2
        Events aren't converted to any other language. The runtime is a standalone compiled EXE application, which is optimised and actually very fast. Your events...
        http://www.domain.com/forum/basics-of-how-internals-of-construct-work.html

    The H1 on the page is still the original title, but we have overridden the title and description to look more friendly in search results. Is this advantageous, setting aside the obvious time cost?

    Read the article

  • Has anyone been able to convert a site's media content from flash to html5?

    - by Muhammad
    I'm looking to convert a site from Flash-based video to HTML5. The current video uses time marks to display slides (kind of like how YouTube overlays ads on its videos). The difference between YouTube and my site is that the slides don't show up inside the video; they are displayed next to it. Is there any way I can accomplish this with HTML5, or do I have to use JavaScript for this? If this isn't clear enough, please let me know.
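
    There is no pure-HTML way to react to playback position, but the HTML5 video element plus a few lines of JavaScript can do it: listen for the timeupdate event and swap the slide shown beside the player whenever playback passes a cue point. A rough sketch (element IDs, file names, and cue times are all invented):

        <video id="talk" src="talk.mp4" controls></video>
        <div id="slide"></div>

        <script>
          // Cue points in seconds and the slide to show from that point on (sample data).
          var cues = [
            { time: 0,   src: "slides/slide1.png" },
            { time: 95,  src: "slides/slide2.png" },
            { time: 210, src: "slides/slide3.png" }
          ];
          var video = document.getElementById("talk");
          var slide = document.getElementById("slide");

          video.addEventListener("timeupdate", function () {
            var current = cues[0];
            for (var i = 0; i < cues.length; i++) {
              if (video.currentTime >= cues[i].time) { current = cues[i]; }
            }
            slide.innerHTML = '<img src="' + current.src + '" alt="">';
          });
        </script>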

    Read the article

  • SEO Service - Refresh SEO

    - by Dan
    I've been approached to possibly take over SEO/marketing work for a site. The owner is currently using a paid service at http://refreshseo.com/ and paying around $80 p/m. From what I can make out, all RefreshSEO does is automatically generate keyword-rich content pages and attach them to the site. These pages aren't actually linked to from within the site. So I'm wondering two things: has anyone had any experience with this particular company or similar services, and has it been worth it? And how do you think the recent Google Panda updates impact this kind of strategy? Thanks in advance.

    Read the article

  • How is this site so fast?

    - by user8628
    How is the website http://dftba.com/ so fast? When I click a link, the page loads instantly. What makes it work like this, and how do I make my site work the same way? Some of the objects on the site are hosted by a site called ecogeek-cdn.net; who is this company, and why do they host this site's images? I have been looking into this site for some time because I want my site to be like it. What I know so far: it uses Apache; it uses Python (the developer told me this when I asked); it uses jQuery and jQuery UI; it is custom built, not WordPress; it is owned/hosted by LiquidWeb; it gets a million users a month; it launched in January; it uses cPanel; and SSH and FTP are enabled but only allowed from their own addresses (when I tried to connect I was denied). Please excuse my English; it is not as good as yours.

    Read the article

  • Choosing a CSS grid/framework

    - by jonallard
    There are many grids and frameworks to choose from. A Google search for CSS frameworks will return a dozen articles that themselves list a number of frameworks to choose from. When it comes to choosing one, it's easy to get lost without having an intimate knowledge of all of them. What are the main factors that go into choosing a CSS framework, and how will those choices map to certain frameworks? More generally, how does one choose a CSS framework? Note 1: I'm using "grid" and "framework" almost interchangeably here, but there is probably one I should use over the other. Corrections on this are welcome. Note 2: I am well aware that some choices will depend on taste and accordingly, this question can turn into a "best of" contest/subjective topic. I'm trying to keep it as answerable as possible, as I'm pretty sure many have this problem/question of choosing a framework and an answer to that would benefit the community. As such, improvements to this question are welcome rather than just closing it.

    Read the article

  • Strategy for managing lots of pictures for a website

    - by Nate
    I'm starting a new website that will (hopefully) have a lot of user-generated pictures. I'm trying to figure out the best way to store and serve these pictures. The CMS I'm using (Umbraco) has a media library that puts a folder on the server for each image. Inside of there you can have different sizes of that same image. That folder has an ID on it, and the database has additional information for that image along with the ID of the folder. This works great for small sites, but what if the pictures get up to 10,000, 100,000 or 1,000,000? It seems like the lookup on the directory would take a long time to find the correct folder. I'm on Windows 2008, if that makes a difference. I'm not so worried about load; I can load-balance my server pretty easily and replicate the images across the servers. The nature of the site means it won't have a lot of users either, but it could have a lot of pics. Thanks. -Nate

    EDIT: After some thought I think I'm going to create a directory for each user under a root image folder, then put each user's pictures under that. I would be pretty stoked if I had even 5,000 users, so that shouldn't be too bad of a linear lookup. If it does get slow I will break it down into folders like /media/a/adam/image123.png. If it ever gets really big I will expand the above method to build a bigger tree. That would take a LOT of content though.
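
    For what it's worth, a common way to keep directories small regardless of user count is to shard by a hash of the identifier rather than by username. A hypothetical sketch in Python (the path layout and IDs are invented for illustration):

        import hashlib
        import os

        def image_path(root, image_id, filename):
            """Spread files across 256 x 256 buckets by hashing the image id,
            so no single directory grows past a few thousand entries."""
            digest = hashlib.md5(str(image_id).encode("utf-8")).hexdigest()
            return os.path.join(root, digest[:2], digest[2:4], str(image_id), filename)

        # e.g. image_path(r"D:\media", 123456, "image123.png")
        # -> D:\media\e1\0a\123456\image123.png
        # The database keeps the id-to-path mapping, so no directory scan is ever needed.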

    Read the article

  • Multiple Users Sharing Email Accounts - How to make messages "unread" for each user?

    - by Ralph N
    Sorry if this isn't appropriate for Webmasters Stack Exchange, but I'm not sure where else it would go, since it has to do with the logistics of setting up a website/email for a small company. I have an ecommerce store with multiple addresses such as: [email protected], [email protected]. I'm not the only person managing the store. My partner and I both have those email accounts set up on our respective computers (right now it's using IMAP; he's on Outlook and I'm on Thunderbird). We have one major problem with this setup: if I read an email before he does, he doesn't know it ever arrived, because IMAP tells the server that the email has been "read". Of course, the same problem exists if he reads an email before I do. I understand that I can fix this problem by switching to POP3 and leaving the messages on the server. I'm okay with that... but it creates a NEW PROBLEM: if he replies to an email, the reply is no longer synced back to the server, and therefore I can't read any emails that he has sent out. Same problem vice versa. This is also important for us because we want to know if one of us has already replied to an email, so that the same thing isn't being done twice. With IMAP, all our sent emails are put in the Sent folder and synced to the server. What's a good way to set this up for a small company where email sharing is involved? My website is hosted on arvixe business ASP...

    Read the article

  • Stop bots from crawling old links with extensions

    - by Jared
    I've recently switched to MVC3, which uses extension-less URLs, but Google and Bing have a wealth of links they are still crawling that no longer exist. So I'm trying to find out if there is a way to format robots.txt (or some other method) to tell Google/Bing that any link ending in an extension isn't a valid link... Is this possible? For pages that a user may have saved as a favourite, I display a 404 page that lists the links to follow once they land on the new page (I decided not to just redirect them, as I don't want to maintain those redirects forever). For Google/Bing's sake I do have the canonical tag in the header.

        User-agent: *
        Allow: /
        Disallow: /*.*

    EDIT: I just added the third line (above) and it APPEARS to do what I want: allow a path, but disallow a file. Can anyone confirm this?
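
    For what it's worth, Googlebot and Bingbot both honour the * and $ wildcards in robots.txt, so patterns anchored on the old extensions (a sketch; swap in whatever extensions your old URLs actually used) avoid accidentally blocking files you still need, such as CSS, JS, or image files that keep their extensions:

        User-agent: *
        Allow: /
        Disallow: /*.aspx$
        Disallow: /*.ashx$

    Bear in mind that Disallow only stops crawling; it does not remove already-indexed URLs. Letting the old links return 404/410, as described above, is what eventually drops them from the index.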

    Read the article

  • Canonicalization of single, small pages like reviews or product categories

    - by Valorized
    In general I pretty much like the idea of canonicalization, and in most cases Google explains the possible procedures clearly. For example: if I have duplicates because of parameters (e.g. &sort=desc), it's clear to use the canonical of the page, provided within the head tag. However, I'm wondering how to handle small (not to say thin-content) pages. What's my definition of a small site? An example: on one of my main sites we use a directory-based URL structure:

        example.com/ (root)
        example.com/category-abc/
        example.com/category-abc/produkt-xy/

    Moreover, we provide one page that includes all products: example.com/all-categories/ (it lists all products the same way as the categories do). For reviews, we use a similar structure:

        example.com/reviews/product-xy/ (shows all reviews for one certain product)
        example.com/reviews/product-xy/abc-your-product-is-great/ (shows one certain review)
        example.com/reviews/ (shows all reviews for all products, latest first)

    To make it even more complicated: on every product page, the latest 2 reviews appear at the end of the page. So you see, a lot of potential duplicates.

    Q1: Should I create canonicals for (a) example.com/category-abc/ pointing to example.com/all-categories/, (b) example.com/reviews/product-xy/abc-your-product-is-great/ pointing to example.com/reviews/product-xy/ or to example.com/reviews/, or neither of them?

    Q2: Can I link the collection of categories (all-categories/) and the collections of reviews (reviews/ and reviews/product-xy/) to the single category and the single review respectively? Example: example.com/reviews/ includes, let's say, 100 reviews. Can I somehow use markup that tells search engines: "Hey, wait, you are now looking at a collection of 100 reviews - do not index this collection, you should rather prefer indexing every single review as a single page!"? In HTML it might be something like this (which, of course, does not work; it's only to show what I mean):

        <div class="review" rel="canonical" href="http://example.com/reviews/product-xz/abc-your-product-is-great/">HERE GOES THE REVIEW</div>

    Reason: I don't think it is a great user experience if the user searches for "your product is great" and lands on example.com/reviews/ instead of example.com/reviews/product-xy/abc-your-product-is-great/. On the first page, they will have to search around and might give up out of frustration. The second result, however, might lead to a conversion. The same applies to categories: if the user is searching for category Z, they might land on the all-categories page and have to scroll down to the (last) category to find what they searched for (Z). So what's best practice? What should I do?
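
    One detail worth settling before deciding: rel="canonical" is only recognised as a <link> element inside the <head> (or as a Link: HTTP response header), never as an attribute on a <div> in the body, so the per-review markup sketched above cannot work. The per-page form, shown here using the question's ?sort=desc parameter and one of its category URLs purely as an example, looks like this:

        <!-- in the <head> of the duplicate page, e.g. example.com/category-abc/?sort=desc -->
        <link rel="canonical" href="http://example.com/category-abc/" />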

    Read the article

  • DNS slows down on development environment

    - by Sequenzia
    I have a local development environment set up on my Mac. I am running an Ubuntu web server inside a VirtualBox VM. I set up a hosts file entry on my Mac that points my dev site to the IP of the Ubuntu virtual server. Everything works well, except that a lot of the time (not all of it) it takes more than 5 seconds to load a page. I used Firebug to track down where the problem is, and when it's slow the DNS part of my request is taking over 5 seconds. Like I said, it's not all the time; sometimes it resolves and loads the page within milliseconds. The same page will be super fast on one click and then take over 5 seconds on the next. It's really slowing me down and I am not sure what is causing it.

    Read the article

  • Unindexing my tumblr blogs content and moving it to another tumblr blog

    - by sam
    I've been writing a Tumblr blog for the past year or so and have written about 300 articles, but now I need to move the blog to another site. (Before, it was running under blog.mysite.com, and I now want it to run under blog.my*new*site.com.) I want to keep the archived articles and have them on the new site, so what I was hoping to do was: export the blog from Tumblr, go into Webmaster Tools and remove all the blog's indexed URLs from Google, then make a new Tumblr blog and import the posts. Would Google see this as new content, since I've deleted its indexed copy? Could I just move the mapping of the Tumblr blog to the new subdomain? But in doing that I would lose all the PR, and it would still look like duplicate content. What's the best way to approach this?

    Read the article

  • Web developer has become uncooperative, what should we do to rescue our site? [closed]

    - by TOM
    What can an individual or a company do if the web designer who designed their website becomes completely uncooperative? In our case: he refuses to meet with us for discussions; he refuses to give us training on the effective use and management of the website he designed for us (and has been paid in full for), despite this training being part of the original contract; last year he disappeared for a number of months, refused to answer emails or phone calls, and was totally unavailable to help us; and he refuses to give us the details of the hosting of our website. We have totally lost faith in this arrogant and unreasonable guy and would like to break off all relations with him, but it appears he's got us over a barrel.

    Read the article

  • Best way to prevent Google from indexing a directory [duplicate]

    - by Gkhan14
    This question already has an answer here: Stopping Google index some web pages (5 answers). I've researched many methods of preventing Google and other search engines from crawling a specific directory. The two most popular ones I've seen are: adding it to the robots.txt file (Disallow: /directory/) and adding a meta tag (<meta name="robots" content="noindex, nofollow">). Which method would work best? I want this directory to remain "invisible" to search engines so it does not affect any of my site's ranking. In other words, I want this directory to be neutral/invisible and "just there", without affecting any ranking. Which method would be the best way to achieve this?
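
    Worth keeping in mind when choosing: Disallow in robots.txt stops crawling, but a blocked URL can still be indexed (without a snippet) if other pages link to it, while the robots meta tag only works if crawling is allowed so the tag can be read. A third option that covers the whole directory without editing every page is an X-Robots-Tag response header, for example via an .htaccess file inside that directory (requires Apache with mod_headers; a sketch, untested against any particular setup):

        # .htaccess placed inside /directory/ -- adds the header to everything served from it
        Header set X-Robots-Tag "noindex, nofollow"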

    Read the article

  • What is the right way to do this HTML: header with icon linked

    - by Hell Awaits
    I know how to make these examples look and behave the same, but I would like to know which is the right way to build the HTML structure:

        <a><img><h1></h1></a> - looks wrong because a block element sits inside an inline element
        <a><img></a><h1><a></a></h1> - the same link is defined twice, and I'm not sure about markup inside headings

    Any other solutions?
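
    For what it's worth, HTML5 treats <a> as transparent content, so wrapping the image and the heading text in a single link is valid either way round; the arrangement most commonly seen keeps the heading outermost. A sketch with placeholder names:

        <h1>
          <a href="/">
            <img src="logo.png" alt="Acme Inc. home">
            Acme Inc.
          </a>
        </h1>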

    Read the article

  • I need a webpage to host my javascript!

    - by Amir Reza
    Does anyone know a website that would host my JavaScript on their pages? I have a research project that needs to collect RTT measurements from all over the world and compare them. I have written the JavaScript code for it, but I do not have a high-hit-rate website to put it on to collect data. I know it is a bit of an odd question, but do you know any website or any trick that can help me? Note that the script would not do any harm to anybody! :-) Thanks. Decad is right: I basically need some people to put my script on their high-hit-rate websites so I can collect data from a large number of clients. Of course, the script runs in the background with no harm to the page; it basically measures some RTTs and submits them to a server. I already have some pages, but they barely get any hits from outside! Thanks.
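
    For readers wondering what such a probe involves, the kind of script described can be as small as this sketch (the URLs are placeholders, not the asker's real endpoints): it times one request and reports the result without touching the host page.

        <script>
          // Time a tiny request and report the round trip; no effect on the host page.
          var start = performance.now();
          fetch("https://collector.example.com/ping", { cache: "no-store" })
            .then(function () {
              var rtt = performance.now() - start;
              navigator.sendBeacon("https://collector.example.com/report",
                                   JSON.stringify({ rtt: rtt }));
            });
        </script>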

    Read the article

  • Email provider - suggestions needed

    - by Christian Fazzini
    We are looking for a good way to handle email support. In short, we need to allow end-users to send emails directly to support and careers, i.e. support@domain_name_here.com and careers@domain_name_here.com. Second, we need to provide emails to our staff, so each staff member has their own email address, i.e. joe@domain_name_here.com, meghan@domain_name_here.com, etc. Google Apps is one option we are considering; however, they charge $50 per user, per year. Not so bad, considering the quality and the features they offer. However, there are also cheaper alternatives. For example, my domain registrar offers an email plan for $20/year for 10 mailboxes, and GoDaddy has a number of plans that are still a lot more affordable than Google Apps. So far Namecheap and GoDaddy are the only ones I have looked at for email plans. Is it worth signing up with Google Apps, or are there better alternatives? Your thoughts?

    Read the article

  • How do I handle having too many links on a webpage because of my menu

    - by RandomBen
    I am developing a website that has a drop-down menu at the top of it. The menu has around 100 links in it that are repeated on every page. Every page also has some number of links below the menu that may or may not be in the menu itself. My issue is that Google says they generally don't like pages with more than 100 links on them. Is there any way to change the links in the menu so that they no longer "count" towards my maximum of 100 links? It seems like there should be an easy way to do this, but there really doesn't seem to be: rel=nofollow still counts towards the number of links on the page, at least according to Google, so what other options do I have? I looked into where the 100 comes from and found that it used to be here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35769#2 but that is no longer the case. I found a more definitive and frankly muddier answer here: http://www.seomoz.org/blog/questions-answers-with-googles-spam-guru from Matt Cutts, from 2007. Long story short, in 2007 they still felt 100 links was a good number, but they stated you could go far beyond that. In fact, they said that pages with high PageRank could have 200-300. It did sound like having many links could reduce the PageRank of the page holding all of the links, or possibly of all of the items linked to. Also, I know IIS7's SEO Toolkit 1.0 suggests that pages should have no more than 250 links.

    Read the article

  • Can I enable GZIP on Godaddy?

    - by Dave
    One of my sites lives on GoDaddy's bottom-of-the-line cheap hosting. I have the correct code in my .htaccess file, but it's not compressing because mod_deflate is not loaded. How do I enable that? The best I've found is an article which suggests using PHP to gzip everything (which is going to be more work than just changing hosting companies): "Add the following code to the very top of your web pages, above the DOCTYPE:"

        <?php
        if (substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) ob_start("ob_gzhandler");
        else ob_start();
        ?>
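
    On shared hosts where mod_deflate isn't available, another route that avoids editing every page is PHP's own zlib output compression. Whether this works depends on how the host runs PHP: the .htaccess flag below only takes effect when PHP runs as an Apache module; under CGI/FastCGI the same setting goes in a custom php.ini (or php5.ini) instead. A hedged sketch of both:

        # .htaccess (only if PHP runs as an Apache module)
        php_flag zlib.output_compression On

        ; php.ini / php5.ini (if PHP runs as CGI/FastCGI)
        zlib.output_compression = On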

    Read the article

  • Handling SEO for Infinite pages that cause external slow API calls

    - by Noam
    I have an 'infinite' number of pages on my site which rely on an external API. Generating each page takes time (about 1 minute). Links on the site point to such pages, and when a user clicks one it is generated while they wait. Since I cannot pre-create them all, I am trying to figure out the best SEO approach to handle these pages. Options: (1) Create really simple pages for the web spiders, so that only real users fetch the data and generate the full page; I'm a little afraid Google will see this as low-quality content, which might also look duplicated. (2) Put them under a directory on my site (e.g. /non-generated/) and put a Disallow in robots.txt. The problem here is that I don't want users to have to deal with a different URL when they want to share the page or make sense of it. I thought about maybe redirecting real users from this URL back to the regular hierarchy, thereby 'fooling' Google into not reaching them, but again I'm not sure Google will like me for that. (3) Let Google crawl these pages. The main problem is that I can't control the rate of the API calls, and my site will also seem slower than it should from a spider's perspective (if it only crawled the already-generated pages, it would think the site is much faster). Which approach would you suggest?

    Read the article

  • What do I do if a user uploads child pornography?

    - by Tom Marthenal
    If my website allows uploading images (which are not moderated), what action do I take if a user uploads child pornography? I already make it easy to report images and have never had this problem before, but I am wondering what the appropriate response is. My initial thoughts are to: immediately delete (not just make inaccessible) the image; file a report with the National Center for Missing and Exploited Children with all the information I have on the user (IP, URL, user agent, etc.), identifying myself as the website operator and providing contact information; and check any other images uploaded by that IP/user and prevent them from uploading in the future (which is impossible to guarantee, but I can at least block their account). This seems like a responsible way to report it, but does it satisfy all of my legal and moral responsibilities? Would it be better not to delete the image and just make it inaccessible, so that it can be sent to the National Center for Missing & Exploited Children, the police, the FBI, etc.?

    Read the article

  • Can I remove visible referer from link?

    - by Andreas
    I use referrer info to track which of my campaigns works best. So instead of <a href="someweb.com">someweb</a> I have a link like <a href="http://someweb.com?utm_source=john&utm_medium=email&utm_content=NAME&utm_campaign=campaing">someweb</a>. Now when a user clicks "someweb", the whole URL string is shown in the address bar. Is it possible to mask/hide this somehow? Maybe via .htaccess? Thanks in advance.
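
    Strictly speaking, those utm_* values are query-string parameters rather than the referrer, and Google Analytics reads them from the landing URL, so stripping them server-side via .htaccess would lose the campaign data. One common compromise, sketched here on the assumption that the analytics snippet runs first, is to tidy the address bar afterwards with the History API:

        <script>
          // After the analytics code has read the utm_* parameters, replace the
          // visible URL with one that has no query string (cosmetic only).
          if (window.history && history.replaceState && location.search.indexOf("utm_") !== -1) {
            history.replaceState(null, document.title, location.pathname + location.hash);
          }
        </script>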

    Read the article

  • How can I cap cloud-costs?

    - by Joe Simpson
    I am looking into launching a cloud-based product for consumers where, with a prepaid account, they can start a server with a simple click, load up the software, and access it remotely. The technical side of that I can manage, but I am worried about the costs escalating ridiculously high for both me and my customers. Is there a way I can (a) limit how much each server can cost me before it is deactivated, and (b) see how much a server is currently costing me (so I can deduct it from the customer's account)? It needs to be extremely reliable, as I can't risk ending up with a giant bill under any circumstances.

    Read the article

  • adwords traffic shows as 'not set' in google analytics

    - by sam
    In Google Analytics, if I go to Traffic Sources > Search > Paid, all I get is "(not set)" instead of the usual list of keywords that people have used to find the ad. This makes it really difficult to understand what's going on in the campaign. I've gone into AdWords and turned on auto-tagging, but I still have the same problem. Any idea how I can fix this so that under the Paid tab I get the phrases people have used to find my ad via PPC?

    Read the article
