Search Results

Search found 17124 results on 685 pages for 'final cut pro'.


  • What are typical examples of old-school website design?

    - by Pierre 303
    I want to build a website for a retro thing that was popular in the mid 90s (the beginning of the commercial internet), so I want to use the old design conventions that were popular at that time. The first thing that comes to mind is those "under construction" animated GIFs; people put animated GIFs everywhere. And then there are those awful repeating backgrounds. So yes, I want my website to look exactly like it is the mid nineties ;) (Please suggest practical and usable features; I guess a Java applet menu would not work today, nor would a note at the bottom saying the site is optimized for Netscape 3.) EDIT: for those who want to see the result: Retrology
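    For illustration, a minimal sketch of the kind of mid-90s markup being described; the image file names are placeholders, not real assets:

      <html>
        <body background="tiled-stars.gif" bgcolor="#000080" text="#FFFF00" link="#00FF00">
          <center>
            <!-- scrolling text and an "under construction" badge, staples of the era -->
            <marquee direction="left">*** Welcome to my home page!!! ***</marquee>
            <img src="under-construction.gif" alt="Under construction">
            <p><font face="Comic Sans MS" size="4">Best viewed in Netscape 3 at 800x600</font></p>
          </center>
        </body>
      </html>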

    Read the article

  • CMS for Blog site

    - by Yau Leung
    I would like to create a blog site with features like Engadget: the editor can post blog entries and albums while users can comment. I know it would be even easier to use Blogspot, but it's blocked in China. I have tried Joomla before, and it seems a bit slow even after removing most of the modules; the memcache plugin doesn't help much either. Is there any other option? Do I need any other plugins to run WordPress as a blog?

    Read the article

  • Anonymous vs. logged-in users on my site & Google Analytics

    - by Flowpoke
    I'd like to be able to run two different 'tracks' for Google Analytics: one for anonymous users of the site and another for users who are logged in. I say "track" because I'm not sure of the term, but I definitely want it all in the same Analytics account; I just want to segregate my logged-in users. In the site template I can very easily add a conditional to output one Analytics snippet or the other, which I'm hoping is all this comes down to. Although I'm not sure, it seems the last digit in your Analytics ID (e.g. UA-15XXXX0-X) could be incremented to gain such additional 'tracks'? Any tips? Am I doing it wrong? My current footer snippet:

      <script type="text/javascript">
        var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
        document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
      </script>
      <script type="text/javascript">
        try {
          var pageTracker = _gat._getTracker("UA-XXXXXXX-1");
          pageTracker._trackPageview();
        } catch(err) {}
      </script>
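    A sketch of one way to do it with the same ga.js snippet: keep a single property and label logged-in visits with a custom variable set before the pageview. The isLoggedIn flag below is a placeholder the server-side template would have to emit.

      <script type="text/javascript">
        try {
          var pageTracker = _gat._getTracker("UA-XXXXXXX-1");
          // Placeholder: the template should write this flag out server-side.
          var isLoggedIn = false;
          // Slot 1, session scope (2): lets reports be segmented into Member vs Anonymous.
          pageTracker._setCustomVar(1, "UserType", isLoggedIn ? "Member" : "Anonymous", 2);
          pageTracker._trackPageview();
        } catch(err) {}
      </script>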

    Read the article

  • Best way to cleanly display 15-20 YouTube videos on a webpage?

    - by Phil
    I am currently building a website for a client who has 20 or so videos that he wants put on a video gallery page, and I was wondering if you could give me some advice on how to go about this. I have found yoxview, and possibly Lightbox, but I don't know if a popup window is really good for video browsing. Also, should/could this be asked on Webmasters instead of Stack Overflow?
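    One lightweight pattern (a sketch only; the video IDs below are placeholders): show a grid of YouTube thumbnail images and swap in the iframe player only when a thumbnail is clicked, so twenty players are not loaded up front.

      <div id="video-gallery"></div>
      <script type="text/javascript">
        // Placeholder video IDs; replace with the client's real ones.
        var videoIds = ["dQw4w9WgXcQ", "M7lc1UVf-VE"];
        var gallery = document.getElementById("video-gallery");
        for (var i = 0; i < videoIds.length; i++) {
          (function (id) {
            // YouTube serves a thumbnail for every video at this URL pattern.
            var thumb = document.createElement("img");
            thumb.src = "http://img.youtube.com/vi/" + id + "/hqdefault.jpg";
            thumb.style.cursor = "pointer";
            thumb.onclick = function () {
              // Replace the thumbnail with the embedded player only when requested.
              var frame = document.createElement("iframe");
              frame.width = 480;
              frame.height = 270;
              frame.frameBorder = 0;
              frame.src = "http://www.youtube.com/embed/" + id + "?autoplay=1";
              gallery.replaceChild(frame, thumb);
            };
            gallery.appendChild(thumb);
          })(videoIds[i]);
        }
      </script>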

    Read the article

  • What to do with random pages after a 301 redirect?

    - by Alex
    Hello, I did a standard 301 redirect for a domain, but the original domain has about 300 pages that have some strength. It doesn't make sense to point them all at the new home page, because the individual pages are about specific topics. There aren't equivalent pages on the new domain either, so where should the original pages redirect to? I would like them to keep ranking for the topics they used to, but without the original domain giving them strength they will just stop ranking and die off. What should I do? Thanks, Alex
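    If the new site has sections that at least cover the same topics, one common approach is to 301 each old page (or group of pages) to the closest matching page on the new domain rather than to the home page. A sketch for the old domain's .htaccess, with made-up paths:

      # Map old topic pages to the nearest equivalent on the new domain (paths are examples)
      Redirect 301 /widgets/blue-widget.html http://www.newdomain.example/widgets/
      Redirect 301 /widgets/red-widget.html  http://www.newdomain.example/widgets/
      # Catch-all for whatever is left in a section
      RedirectMatch 301 ^/articles/ http://www.newdomain.example/blog/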

    Read the article

  • WordPress queue like Tumblr?

    - by Michael Hopkins
    Hi. Is there a way to give WordPress the queue functionality that Tumblr has? Tumblr's queue, for those who don't know, is a way to space posts out without assigning specific post dates. For example, a Tumblr queue might be set to post every four hours between 9am and 5pm. Tumblr would drop the front post in the queue at 9am, 1pm and 5pm every day. Posts are added to the queue by clicking "add to queue" instead of "publish." It's quite simple. How can this feature be added to WordPress?

    Read the article

  • How to make this CSS design of words in headings look clean and well designed? [closed]

    - by kacalapy
    I am trying to put lipstick on the pig, and not wearing my UI developer hat very often is making this impossible. Can someone suggest nicer alternatives to the code below? This is what I have now:

      <style>
        .FirstLetter:first-letter {
          font-family: arial; font-size: 14pt; font-weight: bold;
          color: White; background: Blue; border: 1px black solid;
          padding-top: 8px; padding-left: 8px; padding-bottom: 3px;
        }
        .Spaced { letter-spacing: 5px; font-family: arial; font-size: 14pt; font-weight: bold; }
      </style>
      <div class="FirstLetter Spaced headerFont">
        Executive Summary
      </div>

    Here is the ugly result of the above code. I am looking to improve ONLY the header section, the part where the first letter is blue.

    Read the article

  • Force www. on multi domain site and retain http or https

    - by John Isaacks
    I am using CakePHP, which already ships with an .htaccess file that looks like:

      <IfModule mod_rewrite.c>
        RewriteEngine on
        RewriteRule ^$ app/webroot/ [L]
        RewriteRule (.*) app/webroot/$1 [L]
      </IfModule>

    I want to force www. (unless the host is a subdomain) to avoid duplicate-content penalties, and the redirect needs to retain http or https. Also, this application will have multiple domains pointing to it, so the code needs to work with any domain.
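    A sketch of one way to do this with mod_rewrite, placed above the CakePHP rules. It assumes bare domains of the form example.com (exactly two labels), so hosts with more labels are treated as subdomains and left alone, and something like example.co.uk would need a different host test; the HTTPS trick preserves the original scheme and should be tested on the actual server:

      <IfModule mod_rewrite.c>
        RewriteEngine on
        # Redirect only hosts with exactly two labels (no subdomain, no www.)
        RewriteCond %{HTTP_HOST} ^[^.]+\.[^.]+$ [NC]
        # Capture an "s" into %1 only when HTTPS is on, so the scheme is kept
        RewriteCond %{HTTPS}s ^on(s)|offs$ [NC]
        RewriteRule ^(.*)$ http%1://www.%{HTTP_HOST}/$1 [R=301,L]

        # Existing CakePHP rules
        RewriteRule ^$ app/webroot/ [L]
        RewriteRule (.*) app/webroot/$1 [L]
      </IfModule>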

    Read the article

  • When to reply 400 Bad Request

    - by KajMagnus
    According to www.w3.org, a web server should reply with status code 400 Bad Request if "the request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications." Does that mean only requests that violate some HTTP spec? Or does it include a request that my particular web app thinks is broken? When would you reply 400? For example, if my web app expects a query string to always include a "function=..." parameter, would you reply with 400 Bad Request or 403 Forbidden? (403 means "The server understood the request, but is refusing to fulfill it.")

    Read the article

  • Automated Website Testing/Sanity/Quality

    - by Jeff
    I am thinking about building a tool that starts from the root of a website and traverses the entire site, gathering a list of resources such as CSS/HTML/JavaScript files, and then runs CSS/JavaScript lint, an HTML validator, and a broken-link finder. Before I start building something like this, I was wondering if it already exists? I have already searched Google quite a bit and couldn't find much. Thanks.

    Read the article

  • Content appearing under multiple categories; anything I can do to prevent a duplicate content penalty?

    - by dave
    I'm working with a CMS that allows me to post content into multiple categories. So I have this link:

      www.site.com/category/green-cars
        Here are the GREEN cars
        TITLE: A Big green car
        INTRO: this is a great big green car.

    But then I have this link:

      www.site.com/category/big-cars
        Here are the BIG cars
        TITLE: A Big green car
        INTRO: this is a great big green car.

    So essentially, for every item of content, the header and the intro sentence are the same regardless of the category the item appears in. Will a search engine penalise the site for having the same content in this way? I've looked at canonical links, but I don't think they're relevant here: all my content points to the same page, but the content may appear in multiple categories first. Or am I worrying about nothing? Thanks.
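    For reference, this is what a canonical hint looks like if the same full article ever becomes reachable at more than one category URL; the URLs here are invented extensions of the question's examples, and it only applies to true duplicate pages, not to short listing snippets:

      <!-- In the <head> of the duplicate URL (e.g. the big-cars copy), point at the preferred one -->
      <link rel="canonical" href="http://www.site.com/category/green-cars/a-big-green-car" />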

    Read the article

  • Redirects in .htaccess to avoid crawl errors

    - by user71698
    I am getting a lot of errors in Webmaster Tools; basically there are a lot of links ending like this: mydomainname.com/links.php and I need to redirect them to shave off that part at the end. For instance, there is a link in Google:

      http://www.onlineglobalbiz.com/article-marketing/www.onlineglobalbiz.com/links.php

    This should be:

      http://www.onlineglobalbiz.com/article-marketing/

    Using .htaccess, how can I redirect the incorrect links?
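    A sketch of the kind of mod_rewrite rule that could strip the trailing junk; it assumes the bad suffix is always a host-like segment followed by /links.php, so test it against the real crawl-error URLs first:

      RewriteEngine on
      # If the path ends in <some-hostname>/links.php, 301 to everything before that segment
      RewriteRule ^(.*/)?[^/]+\.[^/]+/links\.php$ /$1 [R=301,L]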

    Read the article

  • Digital "Post It" notes for organizing content of sites/pages

    - by Alex
    We're restructuring our old intranet into a new one and are going through each site to find content and apply our new standard structure and look-and-feel. Can you recommend a tool for "digital Post-it" notes? It would provide a way to type some items on a "card" and then move the cards around and organize them quickly. If you know of other tools for this kind of task in general, please advise. Thank you.

    Read the article

  • Browser language detection & content ranking for new language on the same site.

    - by Arnaud
    I've been reading a lot about this, but it's still really hard to make up my mind. My understanding is that if your website provides a link to the other language, this should not be an issue for Google; as long as your links are clear and clean, Google will be able to make its way through the site. The website was originally in French and I added an English version. I'm just worried that English speakers will leave if the site is not in the correct language, so for the home page I wanted to read the language from the browser and redirect to /fr/ or /en/ (using PHP this will be very easy). Could you have a look at it and tell me what you think: http://tinyurl.com/bpc5bn9 I don't want to get it wrong and lose my ranking with Google. Also, the site ranks well on the French side, while the English side has been online for 2 weeks and only gets a few visits a day. Is that because all the backlinks refer to /fr/, and Google is clever enough to decide that they are two different websites, so backlinks will have to point to /en/ to increase the ranking value? Or will it just take a few more weeks for the site to grow? Thanks for your help
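    The question plans to do the detection server-side in PHP; purely to illustrate the detection logic, here is the client-side equivalent (a sketch only; a JavaScript redirect is less crawler-friendly than a server-side one, so plain links to both /fr/ and /en/ should stay in the page either way):

      <script type="text/javascript">
        // Rough sketch: send browsers reporting French to /fr/, everyone else to /en/.
        // navigator.language covers most browsers; older IE exposes navigator.userLanguage.
        var lang = (navigator.language || navigator.userLanguage || "en").toLowerCase();
        window.location.replace(lang.indexOf("fr") === 0 ? "/fr/" : "/en/");
      </script>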

    Read the article

  • What measures can be taken to increase Google indexing speed for a given newly created page?

    - by knorv
    Consider a website with a large number of pages, where new pages are published regularly. When publishing a new page, the website operator wants to get it indexed in Google as soon as possible, minimizing the time between publication and indexing. Consider the site http://www.example.com/ with hundreds of thousands of pages. The page http://www.example.com/something/important-page.html is created at, say, 12:00. How do I get important-page.html indexed as soon as possible after 12:00, ideally within seconds or minutes? Or more generally: what options are available to try to get Google to index a specific newly created page as soon as possible?
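    One concrete option among several (sitemaps, internal links from frequently crawled pages, feeds): list the new page in an XML sitemap with a fresh lastmod and ping Google with the sitemap URL right after publishing. A sketch using the question's example URLs and an invented timestamp:

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
          <loc>http://www.example.com/something/important-page.html</loc>
          <lastmod>2012-06-01T12:00:00+00:00</lastmod>
        </url>
      </urlset>

    The ping itself is just a GET request to:

      http://www.google.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml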

    Read the article

  • AdWords: Is there a drawback to setting a really high CPC to learn what works faster?

    - by Rob Sobers
    I'm toying with setting my max CPC really high on all my keywords to ensure my ad gets shown in the top spot on page one in order to draw more clicks. I think this will be a good way to quickly figure out whether the ads I'm writing have a decent CTR and, more importantly, whether the landing pages I'm building are converting. Since I can set a max daily budget for my campaign, I won't risk breaking the bank. I can't think of any drawbacks, personally. Am I missing any?

    Read the article

  • CSS alignment differs per page, can't find the reason [migrated]

    - by Floran
    I list products on my homepage and on a company details page using the exact same HTML, but for some reason the product is displayed differently. The product name is "Artikel 1". Here the product is displayed correctly: http://www.zorgbeurs.nl/ Notice how the green price area is right below the product. But here: http://www.zorgbeurs.nl/bedrijven/76/mymedical the green price area is all the way at the bottom of the page. Why?

    Read the article

  • What is the maximum time for a user to return to Google for the visit to be flagged as a bounce in GA?

    - by Anonymous
    I know that Google measures bounce rates by how quickly a user returns to the results page after clicking through to a website. Roughly what is the maximum duration of the visit for it still to be considered a bounce when the user returns, i.e. under 5 seconds, under 30 seconds? I'm mainly interested because a lot of users clicking through my PPC adverts (AdWords) appear to be bouncing, despite my ads having a high Quality Score and the pages being entirely related to the advert copy and tied as closely as I can manage to what I think users are searching for with the key phrases I've selected, so the high bounce rate (100% on some keywords) seems a bit strange. If a bounce isn't determined by time, but simply by whether a user returns to the SERP after visiting my site, after any amount of time, that would make more sense; but the average duration of visit for my keywords with a 100% bounce rate in GA is 00:00:00, which suggests users immediately returned to the SERPs, which again is odd. Is my GA data being skewed by https or anything like that? Scratching my head here.

    Read the article

  • Do you still get a bounce in Google Analytics if all the linked pages/content is loaded dynamically?

    - by sam
    Google Analytics describes a bounce as a user that visits and leaves after their first page. But if your site is a one-page site with content loaded dynamically using JavaScript, you could have a user go through loads of info, text and images; would that still count as a bounce? Or once they click on an a tag, even if it is <a href="#">, can Google Analytics see that? (I'm aware of click tracking in Analytics, but I was wondering if Google picks up these clicks by default.)
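    By default only the initial pageview is recorded, so a purely dynamic visit looks like a bounce. A sketch of the usual workaround with the classic async snippet (_gaq must already be defined by the standard tracking code; the names below are made up): fire a virtual pageview or an event from the click handlers, and note that an interaction event also stops the visit counting as a bounce.

      <script type="text/javascript">
        function trackSection(sectionName) {
          // Virtual pageview so each dynamically loaded "page" shows up in content reports
          _gaq.push(['_trackPageview', '/virtual/' + sectionName]);
          // Or an event; by default interaction events keep the visit from being a bounce
          _gaq.push(['_trackEvent', 'SinglePageSite', 'view', sectionName]);
        }
      </script>
      <a href="#" onclick="trackSection('gallery'); return false;">Gallery</a>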

    Read the article

  • Amazon Affiliate search using a movie title

    - by Matt Walker
    I am currently working on a movie trailer site. I have over 300 movies and I do not want to add an Amazon affiliate link to each one individually. Does Amazon offer any sort of API that will allow me to use a movie title to search for a DVD on Amazon? E.g. for the movie Skyfall, the affiliate link would be amazon.com/search/dvd/skyfall/affiliateid (I just made that link up, as I don't know how their system works; I just want it to do a search on the movie title). Thanks in advance for any help you can give me!

    Read the article

  • Foolproof way to ensure Google News pulls the correct image for its thumbnails?

    - by Anthony
    Google News results have an accompanying thumbnail next to the articles that show up. If Google's crawler can't find a thumbnail to pull from our site, it uses its next best guess from another site, thereby linking the image to that site while still using our headline. Example: headline from Reuters, image from Livemint. Our pages absolutely have images, and they are not massive in file size or dimensions, yet they are not being pulled/crawled correctly. We have read the suggestions from Google and from others around the web, and nothing is panning out. Has anyone had any experience where they can ensure Google News will pull a thumbnail of our choosing?

    Read the article

  • Does purposely linking to an invalid URL and then using 301 affect SEO?

    - by Mike
    On a section of my site I am currently using .htaccess rewrites to put the ID as part of the URL instead of in the query string, like so:

      RewriteRule ^([a-z_]+)?/?tours/([0-9]+)/(.*) /tours/tour_text.php?lang=$1&id=$2&urlstr=$3 [L]

    For example, if someone goes to /en/tours/12/some-text-here it will rewrite it to /tours/tour_text.php?lang=en&id=12&urlstr=some-text-here. However, I don't want users to be able to put in just any text, so if they type the wrong some-text-here part, it will 301 redirect them to the right page. This works perfectly, but I can see a potential problem arising when localizing the website, so I just wanted to make sure it isn't actually a problem. As it is now, if someone goes to /en/tours/12/some-text-here, the anchor to the Spanish version of that page will be /es/tours/12/some-text-here (i.e. only changing the "en" to "es"), and the script will then 301 them to the correct Spanish text (something like /es/tours/12/algun-texto-aqui). The reverse is also the same: the anchor on the Spanish version pointing to the English version would be /en/tours/12/algun-texto-aqui, and the user will be forwarded with a 301 back to /en/tours/12/some-text-here. Basically, the anchor changes the language and the 301 changes the string at the end. So I have two questions: Does purposely and permanently having invalid URLs on your site that get 301'ed to the correct ones have any effect on SEO? I could make it show the correct URL to begin with, but that is a significant amount of work due to how I am handling the translations, so I would prefer just to 301 them. And will the invalid URLs contained in the links be added to search engine indexes even though they get 301'ed to another page?

    Read the article

  • Track those visitors who come through a particular link

    - by busybee235
    I want to track visitors who come to my site through a particular link. For example, for those visitors coming from http://www.domain.com/abc123 I want their pageviews, time on site, bounce rate, referrer, pages per visit, etc., and after that I can store the info in my database on a daily basis. Can anyone suggest a service, API, or piece of software for this? I have used Google Analytics utm tags, which work well for my requirement, but I don't know how many links I can track with them. I have around 80-100 links to track a day, and the number of links will be increasing. I couldn't find any documentation regarding a limit on campaigns in GA. If there's no such limit, I can start this project. Thanks

    Read the article

  • De-index URL parameters

    - by Doug Firr
    This question is lengthy, so here is a one-sentence summary: I need to get Google to de-index URLs that have certain parameters appended. I have a website, example.com, with language translations. There used to be many translations, but I deleted them all so that only English (the default) and French remain. When one selects a language option, a parameter is added to the URL. For example, the home page: https://example.com (default) and https://example.com/main?l=fr_FR (French). I added a robots.txt to stop Google from crawling any of the language translations:

      # robots.txt generated at http://www.mcanerin.com
      User-agent: *
      Disallow:
      Disallow: /cgi-bin/
      Disallow: /*?l=

    So any pages containing "?l=" should not be crawled. I checked this in GWT using the robots testing tool, and it works. But under HTML improvements, the previously crawled language-translation URLs remain indexed. The internet says to return a 404 for the removed URLs so that Google knows to de-index them. I checked what my CMS would serve if I visited one of the URLs that should no longer exist. This URL was listed in GWT under duplicate title tags (one of the reasons I want to scrub up my URLs): https://example.com/reports/view/884?l=vi_VN&l=hy_AM. The URL should not exist, since I removed those language translations, yet the page loads when it should not! I played around and typed example.com?whatever123; it seems that parameters always load as long as everything before the question mark is a real URL. So if Google has indexed all these URLs with parameters, how do I remove them? I cannot check whether a 404 is being generated, because the page always loads; it's the parameter itself that needs to be de-indexed.
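    One option to test carefully (it assumes Apache with mod_rewrite and mod_headers, and it only works if the robots.txt Disallow on ?l= is removed, since Google has to be able to fetch a URL to see its noindex): serve the parameterised URLs with an X-Robots-Tag noindex header, e.g. in .htaccess:

      RewriteEngine on
      # Flag any request whose query string contains an l= parameter...
      RewriteCond %{QUERY_STRING} (^|&)l= [NC]
      RewriteRule .* - [E=NOINDEX_LANG:1]
      # ...and mark those responses noindex (if other rewrites re-run the request,
      # the variable may show up as REDIRECT_NOINDEX_LANG instead)
      Header set X-Robots-Tag "noindex" env=NOINDEX_LANG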

    Read the article

  • Will these types of 403 errors affect my ranking?

    - by Gkhan14
    Let's say I have a directory that returns a 403 Forbidden error for all of the content in it; however, a few of the images in the subdirectories of the main directory do NOT return a 403 error. Will this affect my ranking? For example:

      test.com/system/ (HAS 403 ERROR FOR ALL FILES)
      test.com/system/pie/ (HAS 403 ERROR FOR ALL FILES)
      test.com/system/pie/image.png (DOES NOT HAVE A 403 ERROR, AND THIS IMAGE IS EMBEDDED ON A PAGE ON test.com, e.g. test.com/pie/)

    This sort of pattern repeats for about 10 different images. The directory is like a secret "system", but all of the content on the main site (test.com) is still accessible to everyone.

    Read the article
