Search Results

Search found 17124 results on 685 pages for 'final cut pro'.

Page 121/685

  • De-indexing my Tumblr blog's content and moving it to another Tumblr blog

    - by sam
    I've been writing a Tumblr blog for the past year or so and have written about 300 articles, but now I need to move the blog to another site (before, it was running under blog.mysite.com, and I now want it to run under blog.mynewsite.com). I want to keep the archived articles and have them on the new site. What I was hoping to do was export the blog from Tumblr, go into Webmaster Tools and remove all of the blog's indexed URLs from Google, then make a new Tumblr blog and import the posts. Would Google see this as new content, since I've deleted its indexed copy? Alternatively, could I just re-map the Tumblr blog to the new subdomain? But in doing that I would lose all the PageRank, and it would still look like duplicate content. What's the best way to approach this?

  • Parsing Google site speed in Analytics

    - by Kevin Burke
    I'm having a hard time making heads or tails of the Site Speed graphs in Google Analytics. Our site speed fluctuates wildly from month to month, despite a large sample (the report is "based on 100,000's of visits") and a consistent web setup (static files served from an EC2 instance running nginx behind a load balancer). Here's our site speed, with each data point representing a week's worth of data. Over this time period we modified our source and our HTTP headers to increase cache hits on static resources by 5x. Why would it fluctuate so much? Is there any way to get more reliable information from those graphs?

  • Web Hosting Backup/Disaster Recovery Plan - Which Company?

    - by Harry Muscle
    I've been asked to look after consolidating all of our various company websites onto one host and also to provide a disaster recovery plan in case the chosen host goes down, goes out of business, etc. We're most likely going to go with HostGator as our chosen host; however, I'm not sure who to pick as our backup host. HostGator uses cPanel and can produce regular full backups (i.e. including configuration) of all the sites we host. Ideally I'm looking for a solution where we can hand these backups to another company, and within a short period of time they restore all the sites onto their servers and we're back up and running. The whole disaster recovery process has to be fairly straightforward in case I am unavailable to assist and no one else overly technical is available either (i.e. "take these backup files, send them to this company, and ask them to do this"). Any suggestions on which company would be a good choice for this backup solution would be highly appreciated. Thanks, Harry

  • My site works with www.example.com but not example.com

    - by toocool
    This is the site that I am developing: http://www.juve-news.com/. It works like this, but it doesn't when I try to open it without the www prefix; it gives me a 400 Bad Request. I have used other web hosts before; now I am trying a new one, and I have to add some DNS entries as in the picture here: http://cloudcontrol.com/developers/documentation/add-ons/aliases/ I don't know what I have done wrong. If anyone knows what the problem could be, please give me a tip.

  • How do I handle having too many links on a webpage because of my menu

    - by RandomBen
    I am developing a website that has a drop-down menu at the top of it. The menu has around 100 links in it that are repeated on every page. Every page also has some number of links below the menu that may or may not be in the menu itself. My issue is that Google says they generally don't like pages with more than 100 links on them. Is there any way to change the links in the menu so that they no longer "count" towards my maximum of 100 links? It seems like there should be an easy way to do this, but there really doesn't seem to be; rel="nofollow" still counts towards the number of links on the page, at least according to Google, so what other options do I have? I looked into where the 100 comes from and found that it used to be here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35769#2 but that is no longer the case. I found a more definitive, and frankly muddier, answer here: http://www.seomoz.org/blog/questions-answers-with-googles-spam-guru from Matt Cutts in 2007. Long story short, in 2007 they still felt 100 links was a good number, but they stated you could go far beyond that; in fact, they said that pages with high PageRank could have 200-300. It did sound like having many links could reduce the PageRank of the page containing all of the links, or possibly of all the items linked to. Also, I know IIS7's SEO Toolkit 1.0 suggests that pages should have no more than 250 links.

  • SEO Service - Refresh SEO

    - by Dan
    I've been approached to possibly take over SEO/marketing work for a site. The owner is currently using a paid service at http://refreshseo.com/ and paying around $80 per month. From what I can make out, all RefreshSEO does is automatically generate keyword-rich content pages and attach them to the site. These pages aren't actually linked to from within the site. So I'm wondering two things: has anyone had any experience with this particular company or similar ones, and has it been worth it? And how do you think the recent Google Panda updates impact this kind of strategy? Thanks in advance

  • How is this site so fast?

    - by user8628
    How is the website http://dftba.com/ so fast? When I click a link, the page loads right away. What makes it work like this, and how do I make my site work the same way? Some of the objects on the site are hosted by a domain called ecogeek-cdn.net; what company is this, and why does it host the site's images? I have been looking into this site for some time because I want my site to be like it. What I know so far: the site uses Apache; it uses Python (the developer told me this when I asked); it uses jQuery and jQuery UI; it is custom built, not WordPress; it is owned and hosted by Liquid Web; it gets a million users a month; it launched in January; it uses cPanel; and it has SSH and FTP, but only allowed from their own addresses (I tried to connect and was denied). Please excuse my English.

  • Highly SEO optimised forum posts

    - by Tom Gullen
    Given the following forum post: "Basics of how internals of Construct work -- I've used GameMaker in the past. And I know some C++ and have used a few 3d engines with it. I have also looked at Unity, though I didn't get too much into it. So I know my way around programming etc... My question is, how does Construct work internally? I know it allows Python scripting, which itself is "technically" interpreted, though Python is pretty fast as far as being interpreted goes. But what about the rest? Is the executable that gets cre..." The forum software will take the first 150 characters of the first post as the page meta description, and the title will be the thread title. All OK so far. So in Google it will appear as: "Basics of how internals of Construct work -- I've used GameMaker in the past. And I know some C++ and have used a few 3d engines with it. I have also looked at Unity, though I didn't get too much..." http://www.domain.com/forum/basics-of-how-internals-of-construct-work.html Now the problem (not so much with this thread, but with other ones) is that the first 150 characters don't always make the best meta description. Is it worth my time to cherry-pick threads and manually set their description/title tags so they read like: "Internal workings of Construct 2 -- Events aren't converted to any other language. The runtime is a standalone compiled EXE application, which is optimised and actually very fast. Your events..." http://www.domain.com/forum/basics-of-how-internals-of-construct-work.html The H1 on the page is still the original title, but we have overridden the title and description to look more friendly in search results. Is this advantageous, setting aside the obvious time cost?
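
    For illustration, here is a minimal sketch of the kind of truncation the forum software presumably applies, with the cut moved back to a word boundary so the snippet never ends mid-word (the 150-character limit comes from the post above; the function name is hypothetical):

        # meta_description.py - derive a meta description from the first post (sketch)
        import re

        def make_meta_description(post_html: str, limit: int = 150) -> str:
            """Strip tags, collapse whitespace, and cut at a word boundary."""
            text = re.sub(r"<[^>]+>", " ", post_html)    # drop HTML tags
            text = re.sub(r"\s+", " ", text).strip()     # collapse whitespace
            if len(text) <= limit:
                return text
            return text[:limit].rsplit(" ", 1)[0] + "..."

        print(make_meta_description("<p>I've used GameMaker in the past. And I know some C++ ...</p>"))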

  • URL rewrite and domain frame

    - by Dennis
    I have registered the domain www.posti.sh at nic.sh. The website itself is on the server at www.myskoob.com/postish. Unfortunately, nic.sh does not support frames, i.e. having the domain stay posti.sh while it forwards to www.myskoob.com/postish, so I thought about a URL rewrite on the server. Unfortunately I have no idea how rewriting works (explanations are welcome), but I would also like to ask whether this is generally possible. What I need is: the server needs to recognize that the folder postish is being accessed; depending on the file that is opened, it needs to rewrite the URL to www.posti.sh/<the corresponding filename>; the server also needs to understand that a link to www.posti.sh/about.php points to www.myskoob.com/postish/about.php, and likewise for other files (at the moment, when I type in posti.sh/about.php it redirects to http://www.myskoob.com/postishabout.php, which does not exist); and all of this should work whether or not the URL contains "www" at the beginning. A plus, but not necessary, would be hiding the .php extensions. Is this generally possible? If not, what are the alternatives? If anyone knows how to do it, any code and/or approach would be much appreciated!

  • How can I cap cloud-costs?

    - by Joe Simpson
    I am looking into launching a cloud-based product for consumers where, with a prepaid account, they can start a server with a simple click, load up the software and access it remotely. I can manage the technical side of that, but I am worried about the costs escalating ridiculously high for both me and my customers. Is there a way I can limit how much each server can cost me before it gets deactivated, and see how much a server is currently costing me (so I can deduct it from the customer's account)? This needs to be extremely reliable, as I don't want to risk a giant bill under any circumstances.
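
    As an illustration of the first point, here is a rough sketch of a watchdog that stops an instance once its estimated spend crosses a cap. It assumes AWS with boto3; the flat hourly rate, the cap and the region are illustrative placeholders, and a production system would read real billing data from the provider rather than multiplying uptime by a rate:

        # cost_watchdog.py - stop an instance once its estimated cost hits a cap (sketch)
        from datetime import datetime, timezone
        import boto3

        HOURLY_RATE_USD = 0.10     # hypothetical flat rate for the instance type
        COST_CAP_USD = 5.00        # hypothetical prepaid balance for this server

        ec2 = boto3.client("ec2", region_name="us-east-1")

        def estimated_cost(instance_id: str) -> float:
            """Estimate spend from uptime x hourly rate."""
            reservations = ec2.describe_instances(InstanceIds=[instance_id])["Reservations"]
            instance = reservations[0]["Instances"][0]
            hours = (datetime.now(timezone.utc) - instance["LaunchTime"]).total_seconds() / 3600
            return hours * HOURLY_RATE_USD

        def enforce_cap(instance_id: str) -> None:
            cost = estimated_cost(instance_id)
            print(f"{instance_id}: estimated ${cost:.2f} so far")
            if cost >= COST_CAP_USD:
                ec2.stop_instances(InstanceIds=[instance_id])   # deactivate before the bill grows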

  • Can I remove the visible referer from a link?

    - by Andreas
    I use referrer info to track which of my campaigns works best. So instead of <a href="someweb.com">someweb</a> I have a link like <a href="http://someweb.com?utm_source=john&utm_medium=email&utm_content=NAME&utm_campaign=campaign">someweb</a>. Now when a user clicks "someweb", the whole URL string is shown in the address bar. Is it possible to mask or hide this somehow, maybe via .htaccess? Thanks in advance
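
    One way to keep the parameters out of the address bar is to record them server-side on a landing route and then redirect to the clean URL. A minimal sketch, assuming a Flask landing page (the route name and the session layout are hypothetical):

        # landing.py - strip UTM parameters from the visible URL (sketch)
        from flask import Flask, redirect, request, session

        app = Flask(__name__)
        app.secret_key = "change-me"   # placeholder

        @app.route("/landing")
        def landing():
            # remember the campaign for this visitor, then send them to the clean URL
            session["campaign"] = {
                "source": request.args.get("utm_source"),
                "medium": request.args.get("utm_medium"),
                "campaign": request.args.get("utm_campaign"),
            }
            return redirect("/", code=302)   # the address bar now shows the clean URL

        if __name__ == "__main__":
            app.run()

    Note that if the parameters never reach the final URL in the browser, client-side analytics will not see them either, so the campaign has to be recorded server-side, as the session write above suggests.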

  • How can I screen clients that try to register multiple times?

    - by Aba Dov
    My company offers a bonus to every client that registers. We would like to prevent people from abusing this by registering several times. We thought about filtering clients by IP (there is a problem with workplaces where all stations have the same IP) or by cookies (if cookies are not allowed, we might lose a client). I would like your opinions on these two methods and will be glad to hear about new ones. Thanks
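
    For illustration, a minimal sketch of combining both signals instead of relying on either one alone; the scoring thresholds and the data sources are hypothetical:

        # duplicate_check.py - flag likely repeat registrations (sketch)
        def duplicate_score(ip, device_cookie, seen_ips, seen_cookies):
            """Score a registration attempt; higher means more likely a repeat signup."""
            score = 0
            if device_cookie and device_cookie in seen_cookies:
                score += 2      # same browser has already registered
            if ip in seen_ips:
                score += 1      # same IP, but this could be a shared office address
            return score

        # Flag for manual review rather than rejecting outright, so that
        # workplaces sharing one IP are not blocked automatically.
        score = duplicate_score("203.0.113.7", "abc123", {"203.0.113.7"}, set())
        print("hold bonus for review" if score >= 2 else "grant bonus")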

  • Filtering your offices' IPs from Google Analytics when each has a dynamic IP?

    - by leeand00
    I found the documentation for filtering IPs from Google Analytics, but our several office locations all have dynamic IP addresses that change every 30 days, from what I'm told. I know from working with dynamic DNS that the provider usually gives you a script which your router runs when its IP address changes or when it is restarted, and which passes the new IP address to the DDNS server. I'm wondering if there might be a way to write, or reuse, a similar script that does the same thing with the Google Analytics API.
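
    A rough sketch of what such a script might look like, assuming the Google Analytics Management API (v3) Python client and a service-account credential; the account ID, filter ID, key file and filter name are placeholders, and the public-IP lookup service is just one convenient option:

        # update_ga_filter.py - push the office's current IP into an Analytics exclude filter (sketch)
        import requests
        from google.oauth2 import service_account
        from googleapiclient.discovery import build

        ACCOUNT_ID = "12345678"    # placeholder
        FILTER_ID = "98765432"     # placeholder

        def current_public_ip() -> str:
            return requests.get("https://api.ipify.org", timeout=10).text.strip()

        creds = service_account.Credentials.from_service_account_file(
            "service-account.json",
            scopes=["https://www.googleapis.com/auth/analytics.edit"])
        analytics = build("analytics", "v3", credentials=creds)

        ip = current_public_ip()
        analytics.management().filters().update(
            accountId=ACCOUNT_ID,
            filterId=FILTER_ID,
            body={
                "name": "Exclude office IP",
                "type": "EXCLUDE",
                "excludeDetails": {"field": "GEO_IP_ADDRESS", "expressionValue": ip},
            },
        ).execute()
        print(f"Filter now excludes {ip}")

    Run from cron (or triggered the same way the DDNS update script is), this would keep the exclude filter in step with the office's current address.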

  • Is this form of cloaking likely to be penalised?

    - by Flo
    I'm looking to create a website which is considerably JavaScript-heavy, built with Backbone.js, with most content being passed as JSON and loaded via Backbone. I just need some advice or opinions on the likelihood of my website being penalised for serving plain HTML (text, images, everything) to search engine bots and a JS front-end version to normal users. This is my basic plan for the site: the first request to any page returns HTML that gives only about 1/4 of the page, and the remaining 3/4 is then loaded with Backbone.js, so non-JavaScript users get a 'bit' of the experience. Once a new user has visited and been detected to have JS, a cookie is saved on their machine, and requests from then on are AJAX only. For example: If (AJAX || HasJSCookie) { // Pass JSON }. Search engine content: the entire experience of loading via AJAX is stripped if, for example, a Google bot is detected; the same content is served, but all as HTML. I thought about just allowing search engines to index the first 1/4 of the content, but as I'm concerned about internal links and picking up every bit of content, I thought it would be better to give search engines the entire content. I plan to do this just by checking against a list of user agents to know whether the request comes from a bot or not: If (Bot) { // serve plain HTML }. In addition, I plan to use clean URLs for the entire website despite it being fully AJAX, so serving AJAX content at www.example.com/#/page and normal HTML at www.example.com/page is kind of out of the question; I would rather avoid the practice of using # when technology such as HTML5 pushState is around. So my question is really just asking the opinion of the masses: is it likely that my website will be penalised? And can you suggest an alternative that avoids the 'noscript' method?
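
    A minimal sketch of the user-agent and cookie check described above, assuming a Flask backend; the bot list is illustrative rather than exhaustive, and the template name and cookie name are hypothetical:

        # render_switch.py - serve full HTML to crawlers and no-JS visitors, JSON to JS clients (sketch)
        from flask import Flask, jsonify, render_template, request

        app = Flask(__name__)
        BOT_SIGNATURES = ("googlebot", "bingbot", "slurp", "duckduckbot")  # illustrative list

        def is_bot() -> bool:
            ua = (request.headers.get("User-Agent") or "").lower()
            return any(sig in ua for sig in BOT_SIGNATURES)

        @app.route("/page/<slug>")
        def page(slug):
            data = {"slug": slug, "title": "Example", "body": "..."}
            if is_bot() or not request.cookies.get("has_js"):
                # crawlers and no-JS visitors get the fully rendered HTML
                return render_template("page.html", **data)
            # JS clients get JSON for the Backbone front end
            return jsonify(data)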

  • Shopping Cart URL Structure

    - by Drew
    Regarding URL structure for guests versus authenticated users: am I able to track traffic along both paths, but at the same time track total conversions through the shopping cart? I have set up the following URL structure. Authenticated users follow this path: /cart, /checkout, /checkout-confirmation-ty. Guests go like this: /cart, /checkout-guest, /checkout-confirmation-guest-ty. Can I track the authenticated and guest users separately, and is this possible with Google Analytics?
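
    One common approach is a single destination goal whose regular-expression match covers both confirmation pages, with funnel steps or a segment separating guests from authenticated users. Purely as an illustration, a quick check of such a pattern against the URLs above:

        # goal_regex.py - one goal pattern covering both confirmation URLs (sketch)
        import re

        GOAL_PATTERN = re.compile(r"^/checkout-confirmation(-guest)?-ty$")

        for path in ["/checkout-confirmation-ty", "/checkout-confirmation-guest-ty", "/checkout"]:
            print(path, bool(GOAL_PATTERN.match(path)))
        # prints True for both confirmation pages and False for /checkout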

  • Retroactively applying a Piwik goal to visitors

    - by Andrew Aylett
    I started receiving a large (for me) amount of traffic on one of my pages yesterday. Today, I thought it would be useful to track goals from that page; there's a link to my blog on it. I added the 'visited external link' goal to Piwik, and new visits are being recorded. However, it seems to me that there must be enough data in the database to retroactively apply the goal to past visitors. Is there a way to achieve that?
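
    Piwik does not recompute goals for old visits out of the box, but the raw action data remains queryable. As a rough illustration only, a sketch that pulls historical outlink clicks from the HTTP Reporting API as an approximation of what the goal would have counted; it assumes the Actions.getOutlinks report is available and the URL and token_auth are placeholders:

        # past_outlinks.py - approximate the goal from historical outlink data (sketch)
        import requests

        PIWIK_URL = "https://stats.example.com/index.php"   # placeholder
        params = {
            "module": "API",
            "method": "Actions.getOutlinks",
            "idSite": 1,
            "period": "day",
            "date": "last30",
            "format": "JSON",
            "token_auth": "YOUR_TOKEN",                      # placeholder
        }
        report = requests.get(PIWIK_URL, params=params, timeout=30).json()
        for day, rows in report.items():
            clicks = sum(row.get("nb_hits", 0) for row in rows)
            print(day, clicks)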

  • What do I do if a user uploads child pornography?

    - by Tom Marthenal
    If my website allows uploading images (which are not moderated), what action do I take if a user uploads child pornography? I already make it easy to report images and have never had this problem before, but I am wondering what the appropriate response is. My initial thought is to: immediately delete (not just make inaccessible) the image; file a report with the National Center for Missing and Exploited Children with all the information I have on the user (IP, URL, user-agent, etc.), identifying myself as the website operator and providing contact information; and check any other images uploaded by that IP or user and prevent them from uploading in the future (preventing that entirely is impossible, but I can at least block their account). This seems like a responsible way to report it, but does it satisfy all of my legal and moral responsibilities? Would it be better not to delete the image and just make it inaccessible, so that it can be sent to the National Center for Missing & Exploited Children, the police, the FBI, etc.?

  • Why are Facebook profiles Google-searchable?

    - by Jose
    Facebook has around 1B user profiles, and they can be found by searching on Google. However, I don't think these profiles are linked from anywhere, so how could Google discover them? As far as I know, sitemaps alone are not enough for that (http://webmasters.stackexchange.com/a/5151), as all URLs should be crawlable anyway. I ask because I also have a site with user profiles and would like to make them discoverable.
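
    Whatever the answer in Facebook's case, listing the profile URLs in an XML sitemap is the usual starting point for making such pages discoverable. A minimal sketch of generating one; the domain and the profile slugs are placeholders, and in practice the slugs would come from the site's database:

        # profile_sitemap.py - emit a sitemap for user profile pages (sketch)
        from xml.sax.saxutils import escape

        BASE_URL = "https://www.example.com/users/"   # placeholder
        profile_slugs = ["alice", "bob", "carol"]     # would come from the database

        entries = "\n".join(
            f"  <url><loc>{escape(BASE_URL + slug)}</loc></url>" for slug in profile_slugs
        )
        sitemap = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</urlset>"
        )
        with open("sitemap.xml", "w", encoding="utf-8") as fh:
            fh.write(sitemap)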

  • Canonicalization of single, small pages like reviews or product categories

    - by Valorized
    In general I pretty much like the idea of canonicalization, and in most cases Google explains the possible procedures in a clear way. For example: if I have duplicates because of parameters (e.g. &sort=desc), it's clear to use the canonical for the page, provided via the <link rel="canonical"> element within the head tag. However, I'm wondering how to handle "small", not to say thin-content, pages. What's my definition of a small page? An example: on one of my main sites we use a directory-based URL structure: example.com/ (root), example.com/category-abc/, example.com/category-abc/produkt-xy/. Moreover, we provide one page that includes all products, example.com/all-categories/ (it lists all products the same way the category pages do). For reviews, we use a similar structure: example.com/reviews/product-xy/ shows all reviews for one particular product, example.com/reviews/product-xy/abc-your-product-is-great/ shows one particular review, and example.com/reviews/ shows all reviews for all products (latest first). Let's make it even more complicated: on every product page, the latest two reviews appear at the end of the page. So you see, there are a lot of potential duplicates. Q1: Should I create canonicals (a) from example.com/category-abc/ to example.com/all-categories/, (b) from example.com/reviews/product-xy/abc-your-product-is-great/ to example.com/reviews/product-xy/ or to example.com/reviews/, or none of them? Q2: Can I point the collection of categories (all-categories/) and the collections of reviews (reviews/ and reviews/product-xy/) at the single category and the single review respectively? Example: example.com/reviews/ includes, let's say, 100 reviews. Can I somehow use markup that tells search engines: "Hey, wait, you are now looking at a collection of 100 reviews; do not index this collection, you should rather prefer indexing every single review as a single page!" In HTML it might be something like this (which, of course, does not work; it's only to show what I mean): <div class="review" rel="canonical" href="http://example.com/reviews/product-xz/abc-your-product-is-great/"> HERE GOES THE REVIEW</div> Reason: I don't think it is a great user experience if the user searches for "your product is great" and lands on example.com/reviews/ instead of example.com/reviews/product-xy/abc-your-product-is-great/. On the first page, he will have to search and might give up out of frustration; the second result, however, might lead to a conversion. The same applies to categories: if the user is searching for category Z, he might land on the all-categories page and have to scroll down to the (last) category to find what he searched for (Z). So what's best practice? What should I do?

  • AdWords traffic shows as '(not set)' in Google Analytics

    - by sam
    In Google Analytics, if I go to Traffic Sources > Search > Paid, all I get is "(not set)" instead of the usual list of keywords that people have used to find the ad, which makes it really difficult to understand what's going on in the campaign. I've gone into AdWords and turned on auto-tagging, but I still have the same problem. Any idea how I can fix this so that under the Paid tab I get the phrases people have used to find my ad in PPC?

  • Rich snippets ignored by Google [closed]

    - by Thoir Fáidh
    Possible Duplicate: Why would Google Rich Snippets work for one site author but not another? I'm facing a problem here. I added rich snippets (microdata) to the website, but Google ignores all of them. In the testing tool everything looks fine and no errors are detected. I've read that Google ignores microdata in hidden fields. Unfortunately this is partially the case, since I use jQuery to interact with the content; nevertheless, the markup is not hidden everywhere, and I believe Google should recognize at least the microdata that is permanently visible to the user. Am I missing something here? It has now been about three weeks since I updated the website with the rich snippets.

  • Remove the IP address from a website's URL using Apache

    - by sapatos
    I'm on an EC2 instance and have a domain, domain.com, linked to the EC2 nameservers, and it happily serves my pages if I type domain.com in the URL. However, when a page is served, the URL resolves to 1.1.1.10/directory/page.php. Using Apache I've set up the following VirtualHost, following the examples provided at http://httpd.apache.org/docs/2.0/dns-caveats.html:

        Listen 80
        NameVirtualHost 1.1.1.10:80
        <VirtualHost 1.1.1.10:80>
            DocumentRoot /var/www/html/directory
            ServerName domain.com
            # Other directives here ...
            <FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
                Header set Cache-Control "max-age=290304000, public"
            </FilesMatch>
        </VirtualHost>

    However, I'm not getting any change in how the URL is displayed. This is the only VirtualHost configured on this site, and I've confirmed it's the one being used, as I've managed to break it a number of times while experimenting with the configuration. The Route 53 entries I have are:

        domain.com  A    1.1.1.10
        domain.com  NS   ns-11.awsdns-11.com ns-111.awsdns-11.net ns-1111.awsdns-11.org ns-1111.awsdns-11.co.uk
        domain.com  SOA  ns-11.awsdns-11.com. awsdns-hostmaster.amazon.com. 1 1100 100 1101100 11100

  • Software for video subscription service

    - by Clinton Blackmore
    I'd like to sell instructional videos over the web. Primarily, I'd like users to subscribe to the site and get access to the videos over the internet. Secondarily, I might sell DVDs for those who have poor internet connections or would like a physical copy, or possibly eBooks and the like in the future. Regarding the subscriptions: I'd like a system that automatically sends out e-mails when it is time to renew; I'd like to be able to offer free trials; and users without a free trial or subscription should not be able to access the content. Incidentally, I plan to host the videos on my current web host and move them to a CDN when volume (and capital) makes this a good idea. While I have no intention of going crazy with DRM, it seems expedient not to link to the files directly; how can I link to them indirectly? It would be nice to support multiple payment processors; specifically, I'd like to avoid a PayPal-only approach. Are there any web applications (or plugins) you'd recommend for something like this? While I've set up and administered several web technologies, I've never done anything with e-commerce. I see there are possibilities like osCommerce, one friend recommends using WordPress with plugins, and it really appears that for any given CMS you can graft on components like this, although I imagine that not all are created equal. As I'm not tied to a particular web application (and open source software that can run on a LAMP [P = Perl, Python, PHP] stack is preferable), I'd like to make a good choice at the beginning.
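
    On the point of not linking to the files directly, one common pattern is to hand out short-lived signed URLs that an access-checking handler validates before streaming the file. A minimal sketch, assuming an HMAC secret kept on the server; all names here are hypothetical:

        # signed_links.py - expiring, indirect links to video files (sketch)
        import hashlib, hmac, time

        SECRET = b"server-side-secret"   # placeholder; keep out of source control

        def sign_video_url(video_id: str, ttl_seconds: int = 3600) -> str:
            expires = int(time.time()) + ttl_seconds
            message = f"{video_id}:{expires}".encode()
            token = hmac.new(SECRET, message, hashlib.sha256).hexdigest()
            return f"/stream/{video_id}?expires={expires}&token={token}"

        def is_valid(video_id: str, expires: int, token: str) -> bool:
            if expires < time.time():
                return False             # link has expired
            message = f"{video_id}:{expires}".encode()
            expected = hmac.new(SECRET, message, hashlib.sha256).hexdigest()
            return hmac.compare_digest(expected, token)

        print(sign_video_url("lesson-01"))

    The application would check the subscription status before issuing a link, and the /stream handler would call is_valid before serving the bytes, so the real file path is never exposed.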

  • How to fix "[Errno 13] Permission denied" in Mailman mailing lists

    - by Michael
    After migrating domains from one Plesk server onto another, I get several of these mails every day (the target mailbox does not exist, so they come back to me as undeliverable-mail bounces):

        Return-Path: <[email protected]>
        Received: (qmail 26460 invoked by uid 38); 26 May 2012 12:00:02 +0200
        Date: 26 May 2012 12:00:02 +0200
        Message-ID: <20120526100002.xyzxx.qmail@lvpsxxx-xx-xx-xx.dedicated.hosteurope.de>
        From: [email protected] (Cron Daemon)
        To: [email protected]
        Subject: Cron <list@lvpsxxx-xx-xx-xx> [ -x /usr/lib/mailman/cron/senddigests ] && /usr/lib/mailman/cron/senddigests
        Content-Type: text/plain; charset=ANSI_X3.4-1968
        X-Cron-Env: <SHELL=/bin/sh>
        X-Cron-Env: <HOME=/var/list>
        X-Cron-Env: <PATH=/usr/bin:/bin>
        X-Cron-Env: <LOGNAME=list>

        List: xyzxyz: problem processing /var/lib/mailman/lists/xyzxyz/digest.mbox:
        [Errno 13] Permission denied: '/var/lib/mailman/archives/private/xyzxyz'

    I tried to fix the permissions myself, but the problem still exists.

  • Prewritten App for Used Car Dealer?

    - by Shawn Eary
    Is there somewhere I can find a prewritten web app (with database) for a used car dealer? The application would need to support the following: easy setup on a low-cost shared or cloud host; an easy way for potential customers to browse current inventory (cars on the lot) with suggested prices; an easy way for the dealership to log in and update inventory and suggested prices; and an easy way for potential customers to send the dealership an inquiry about a specific vehicle on the lot, with CAPTCHA-style spam protection. I prefer ASP.NET MVC and Microsoft SQL Server, but I might consider other technologies such as WebForms and LightSwitch (HTML5). I am reasonably comfortable with MVC and WebForms, but I really don't want to waste a bunch of time writing an application that might already exist. I did find a few interesting templates via Bing that seem to control CSS and layout, but I'm not sure whether they contain any business logic or whether they would integrate well into an MVC app.
