Search Results

Search found 17124 results on 685 pages for 'final cut pro'.

Page 218/685

  • visit counts in advanced segments not consistent

    - by user671201
    My organization has recently noticed an issue when applying advanced segments to visit counts during different time ranges. With no advanced segments turned on, here are the visit counts for Oct 1st - Oct 4th during the time range Sept 8th - Oct 8th:
    Oct 1 - 7
    Oct 2 - 7
    Oct 3 - 8
    Oct 4 - 5
    Again, with no advanced segments turned on, here are the visit counts for Oct 1st - Oct 4th, but with the time range changed to Oct 1st - Oct 4th. As expected, the numbers are the exact same as above:
    Oct 1 - 7
    Oct 2 - 7
    Oct 3 - 8
    Oct 4 - 5
    Now, I turn on the "Non paid search traffic" advanced segment. Here are the visit counts for Oct 1st - Oct 4th during the time range Sept 8th - Oct 8th:
    Oct 1 - 0
    Oct 2 - 0
    Oct 3 - 0
    Oct 4 - 2
    Here is where it gets weird. I keep the advanced segment on and change the time range to Oct 1st - Oct 4th. This is what I get for the exact same dates as above:
    Oct 1 - 4
    Oct 2 - 2
    Oct 3 - 6
    Oct 4 - 5
    We've found the same inconsistency in our other GA profiles that get much more traffic (the numbers above come from one of our specialized topic blogs), but the inconsistency is less pronounced where there are more visits. My question is: why are the visit counts different for different time ranges when advanced segments are turned on, but exactly the same when no advanced segments are applied? Is this a GA bug, or am I missing something about how advanced segments work?

  • Weird keywords in Google Webmaster Tools

    - by Argoron
    I just happened to check the keywords list in Google Webmaster Tools for my site, which is an educational content site about finance. To my big surprise, after the first keyword, which is 'finance', I found amongst the 20 highest ranked (!) entries words like: mysql, server, adobe, flash, player, homez. What (I'm tempted to add "the heck") does that mean? Is that something I should worry about? If so, how did these get there, and how can I eliminate them / avoid them getting into that list? Thanks very much in advance for your help.

  • Best stats tool for cross-domain tracking

    - by kidbrax
    We build a webapp that allows users to run the app under their own subdomain. So we run the app under search.domainX.com, search.domainY.com and so on. They each have their own Google Analytics to track individual stats. But we also want to know the general traffic for all clients of our app combined. So we want to know stuff like "among all our clients we had x number of views." What is the best tool to track that sort of thing?

  • SEO & Multilingual: would be this a good practise?

    - by Younès
    I am currently making a bilingual website and I'd like to get nice SEO results, of course. Here's my idea: the internal links would be composed of the "www" subdomain so that people can share links regardless of their language. Their language is determined by the HTTP_ACCEPT_LANGUAGE PHP variable, so they would see http://www.site.com/mydocument/123 in their address bar and never see any links like http://fr.site.com/mydocument/123 or http://en.site.com/mydocument/123. The user can always switch the page's language thanks to links in the footer. The language-switching link would be http://fr.site.com/mydocument/123, and clicking on it would change the language session and redirect the user to http://www.site.com/mydocument/123. In the case of a crawling bot: I read that if the HTTP_ACCEPT_LANGUAGE variable is missing, then it's a crawling bot, so in that case we set the default language to English. Each page, as I mentioned earlier, has a link for the other language: on the page http://www.site.com/document/1323, the link http://fr.site.com/document/1323 can be seen by the bot and be crawled. What do you think about this practice? Would I get good SEO results for each language?
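
    For illustration, a minimal PHP sketch of the detection logic described above, assuming a single front controller; the 'lang' GET parameter for the footer switch and the two-language list are illustrative assumptions, and the English fallback for a missing header is the poster's stated rule, not a recommendation:

        <?php
        session_start();

        // Pick the language once per session: an explicit footer switch wins,
        // then the Accept-Language header; a missing header (assumed here to
        // mean a crawler, per the poster's premise) falls back to English.
        if (isset($_GET['lang']) && in_array($_GET['lang'], array('en', 'fr'))) {
            $_SESSION['lang'] = $_GET['lang'];
        } elseif (!isset($_SESSION['lang'])) {
            $header = isset($_SERVER['HTTP_ACCEPT_LANGUAGE']) ? $_SERVER['HTTP_ACCEPT_LANGUAGE'] : '';
            $_SESSION['lang'] = (substr($header, 0, 2) === 'fr') ? 'fr' : 'en';
        }
        ?>

    Whether crawlers really omit the header is the poster's premise; the sketch does not verify it.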

  • How can the error pages for a site be reset to the default configuration in IIS 7.5?

    - by Sn3akyP3t3
    A mishap occurred with web.config to accommodate a subsite: I made use of <location path="." inheritInChildApplications="false">. Essentially it was a workaround put in place for nested web.config files which were causing a conflict. The result was that error pages were not being handled properly; error 500 was being passed to the client for every type of error encountered. Removing the offending inheritInChildApplications tag from the root web.config restored normal operation for most of the error handling, but for some reason, while error 503 now gets a correct response header, the IIS server is performing the custom action for error 403.4, which is a redirect to https. I'm looking to restore the defaults for error pages so that the original behavior is restored. I can then re-add my error page customizations.
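
    A minimal sketch of one way to do the reset, assuming the customizations live in the site's web.config (site name below is a placeholder): either delete the <httpErrors> section from the site's web.config so the values inherit again from applicationHost.config, or clear that section with appcmd:

        %windir%\system32\inetsrv\appcmd.exe clear config "Default Web Site" /section:system.webServer/httpErrors

    If the error pages were customized at the server level instead, the same section would need to be cleared in applicationHost.config rather than in the site's web.config.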

  • SEO with duplicate content

    - by user16831
    I have a nature photography site with multiple types of photo galleries. Each photo and associated caption on my site appears in several galleries. For instance, a photo of a goldfinch that was taken on a trip to New Mexico in 2008 will appear in the "goldfinch.php" gallery, in the "finches.php" gallery, and in the "New_Mexico_2008.php" gallery. This duplication is useful for my site visitors - User A may want to see goldfinch photos, whereas User B wants to see photos from New Mexico - but I am concerned about the SEO implications. The typical suggestions to deal with duplicate content, such as 301 redirects and canonical tags, probably won't work in this case, because the page content is substantially different (ranging from ~1% to ~90% duplication, depending on the specific example chosen). The obvious solution to me would be to edit robots.txt to only allow search engines to crawl one type of gallery - for instance, if they crawled only the galleries organized by species (e.g. goldfinch.php), all the photos on my site would be found exactly once. However, the Google content guidelines recommend against blocking crawler access to duplicate information. Should I go ahead and use robots.txt anyway? Or is there a better solution?
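
    For concreteness, a minimal robots.txt along the lines described, keeping only the per-species galleries crawlable (the file names are the poster's own examples; this sketches the idea rather than endorsing it, since Google's guidelines do advise against blocking duplicate content):

        User-agent: *
        Disallow: /finches.php
        Disallow: /New_Mexico_2008.php

    An alternative that stays within the guidelines is to pick the species gallery as the preferred version and link to it consistently, letting the search engine consolidate the signals itself.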

  • Google Analytics - include filter not working

    - by gerl
    I just added an include filter this morning on my domain (test.org). I have: Custom Filter > Include > Request URI, with the pattern ^/test-a/46212$|^/test-a/46212|^/test-a/46315. Now, after I go to Content > Site Content > All Pages, I see stats for other pages that I didn't include in my filter. For example, I see /somethingelse. I only want to see stats for /test-a/46212 and whatever else is in my filter. Please let me know what I'm doing wrong.
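
    As an aside on the pattern itself: the first alternative, ^/test-a/46212$, is already matched by the second, ^/test-a/46212, so it is redundant. Assuming the filter field accepts standard grouping (profile filter regex does), the same intent can be written more tightly as:

        ^/test-a/(46212|46315)

    Note this, like the original, also matches longer paths such as /test-a/462129; a trailing $ inside each branch would be needed for exact matches only. Also worth knowing: profile filters only affect data recorded after they are created, so any report range that predates this morning's change will still show the unfiltered pages.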

  • Hosting a website from a dynamic IP

    - by nick
    I recently upgraded my internet to the point that it is much faster and more reliable than my current webhost. I would like to move my current domain to be hosted at home, but my IP address is dynamic. As far as I know, I only get a new IP when I restart my modem and/or router (which is almost never) or when Cable One (my ISP) pushes out a firmware update (rarely). There are a few ways I can see of doing this:
    1) convince my ISP to give me a static IP
    2) assign my router my current IP to force a static IP (which might work?)
    3) set my DNS record to my current IP address and update it on the rare occasions that it changes
    Obviously I'm hoping that the first one works, but I don't want to pay a lot of extra money (if that's what it takes) to get a static IP address. Has anyone had any luck with something like this?
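
    If option 3 wins, the update can be automated rather than done by hand. A minimal PHP sketch to run from cron; both URLs are hypothetical placeholders, since the real update endpoint depends on who hosts the DNS:

        <?php
        // Compare the current external IP against the last one we saw and
        // ping the DNS provider's update API only when it changes.
        $cacheFile = '/var/tmp/last_ip.txt';
        $currentIp = trim(file_get_contents('http://checkip.example.com/')); // hypothetical "what is my IP" service
        $lastIp = file_exists($cacheFile) ? trim(file_get_contents($cacheFile)) : '';

        if ($currentIp !== '' && $currentIp !== $lastIp) {
            // Hypothetical registrar API -- substitute the real dynamic-DNS endpoint.
            file_get_contents('https://dns.example-registrar.com/update?host=mydomain.com&ip=' . urlencode($currentIp));
            file_put_contents($cacheFile, $currentIp);
        }
        ?>

    Many registrars and dedicated dynamic-DNS services expose exactly this kind of HTTP update hook, which sidesteps the need for a static IP entirely.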

  • Recommend hosting with a fast MySQL database, please

    - by Keith Groben
    I am frustrated to no end with my current hosting provider, Media Temple. Yes, they are flashy, and have some decent degree of flexibility with their GS plan, which I have. But anytime I install a site that needs a database, it is slow. Like, really slow: taking anywhere from 10-15 seconds just to load a page. I would host in-house, but there are a lot of complications that come with a LAMP server that I don't want to deal with. Honestly, I'd rather spend the time developing. What can you recommend?

  • Looking for recommendations for a server-side newsletter program

    - by Sparky672
    Hello - I'm currently using a server-side SQL-based mailing list program called Php-List on multiple sites and it works fairly well. But installation and setup are quite cumbersome and quirky, and the interface is not well organized... neither is the code, with pieces all over the place in random fashion. Customizing the "look & feel" and full site integration are both tedious and painful. Upgrading the version is made more complex since multiple edits need to be manually transferred each time. Also, probably due to a poor English translation, descriptions and instructions within certain areas of the user interface are contradictory and unclear. You just have to play with it and remember what you did last time it worked. It's supposed to be so my customers can send out their own newsletters... after supplying a written tutorial, about half of them seem to stumble through it okay and the other half just hire me to do it for them. So, not quite easy enough for most average people to use. I'm looking for something that's as easy for them as using a blog or discussion forum. It also must be easier to set up and integrate into a site than Php-List. I have no problem getting dirty and writing CSS or HTML by hand, nor do I have any problem editing the program code. Perhaps what I'm looking for is a solution that is more organized, has a better GUI, and is template or "skin" based. Then, if I spend many hours customizing a skin, I can simply update the program and re-use my custom skin without having to reproduce the tedious setup over and over. (I currently maintain a list of about 25 things I must manually edit or add to multiple files in multiple directories each time I install or upgrade Php-List.) A great example of what I'm looking for is WordPress or phpBB. They're both easy to install and customize, yet powerful and packed full of features. They're also VERY well organized, making customization less painful. So, enough yammering for now... does anyone know of something besides Php-List with many of the same features: maintaining a mailing list with a server-side database, custom sign-up pages, automatic opt-in/opt-out, allowing custom HTML newsletter templates, etc.? Thank-you!

  • Best way to lay out the website when sections of it are almost identical

    - by Linas
    So, I have a minisite for the mobile application that I did. The mobile application is a public transport (transit) schedule viewer for a particular city (let's call it Foo), and I'm trying to sell it via that minisite. I publish the minisite at www.myawesomeapplication.com/foo/. It has the usual "standard" subpages, like "About", "Compatible phones", "Contact", etc. Now, I have decided to create analogous mobile applications for other cities, Bar and Baz. These mobile applications (products) would be almost identical to the one for the Foo city, thus the minisites for them would (should) look very similar too (except for some artwork and Foo = Bar replacement). The question is: what do you think would be the most logical way to lay out the website in this situation, both from the business and search engine perspective? In other words, should I just duplicate the /foo/ website to /bar/ and /baz/, or would it be better to try to create a single website under the root path (/)? I don't want search engine penalties for almost-duplicate information under /foo/, /bar/ and /baz/, and I also don't want a messy, non-localized website. (I guess the user is more likely to buy something if he/she sees "This-and-that is the application for NYC, the city you live in", not "This-and-that is the application for city A, city B, ..., NYC, ..., and city Z.")

  • wget not respecting my robots.txt. Is there an interceptor?

    - by Jane Wilkie
    I have a website where I post csv files as a free service. Recently I have noticed that wget and libwww have been scraping pretty hard, and I was wondering how to circumvent that, even if only a little. I have implemented a robots.txt policy. I posted it below:
    User-agent: wget
    Disallow: /
    User-agent: libwww
    Disallow: /
    User-agent: *
    Disallow: /
    Issuing a wget from my totally independent Ubuntu box shows that robots.txt just doesn't seem to stop wget against my server: wget http://myserver.com/file.csv still works. Anyway, I don't mind people just grabbing the info, I just want to implement some sort of flood control, like a wrapper or an interceptor. Does anyone have a thought about this, or could you point me in the direction of a resource? I realize that it might not even be possible. Just after some ideas. Janie
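
    One possibility, sketched under the assumption of an Apache host with mod_setenvif and 2.2-style access control: deny by User-Agent at the server level, since robots.txt is purely advisory and only polite clients honor it.

        # .htaccess sketch: turn away wget and libwww by User-Agent.
        SetEnvIfNoCase User-Agent "wget" bad_bot
        SetEnvIfNoCase User-Agent "libwww" bad_bot
        <Limit GET POST>
            Order Allow,Deny
            Allow from all
            Deny from env=bad_bot
        </Limit>

    A determined scraper can fake its User-Agent, so for genuine flood control something like rate limiting (e.g. mod_evasive or firewall rules) would be the next step.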

  • Separate .com domain name purchasing for a VPS

    - by adamk
    I am looking at getting a VPS with RackSRV, and they don't sell domain names, but they are happy to set things up after I get one elsewhere. Can anyone recommend somewhere I can purchase just the domain and not have any hassles moving it afterwards? (Or can I just purchase the domain and make it point at the RackSRV IP address, while still using the domain seller's control panel? I don't really understand that part of it enough! :)) I want the domain name registered in my name, ideally with myself as the technical and administrative contacts, for simple transfers.
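
    On the second question: yes, that is the normal arrangement. The registrar keeps the registration and its control panel, and the domain is pointed at the VPS with a couple of DNS records. A sketch of what the records would look like (the IP is a placeholder for whatever RackSRV assigns):

        ; at the registrar's DNS panel
        example.com.        IN  A      203.0.113.10
        www.example.com.    IN  CNAME  example.com.

    Nothing needs to "move" later; changing hosts only means editing the A record.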

  • Issue with image lightbox and enlargement / jQuery Mobile

    - by Matt
    I'm working on a redesign of my weather website using jQuery Mobile. I have it set up so that you drill down through a series of content containers to get to the weather info (each group of info opens in a dialog display). Everything's worked well, but I've run into an issue with my images. I have them sized so that they fit a mobile device's screen nicely, but because of that, when you look at them in a desktop browser, you can't really make out what the image is. I've tried several image lightbox / enlargement solutions, but for some reason none of them have worked: either nothing happens or the images open in a new window. I thought that this might be caused by jQuery Mobile somehow overriding the scripts and CSS of the lightboxes / enlargements I've tried. I'm not completely sure, though, that this is the case, nor, if it is, how I can get around it to be able to enlarge the images to their original size, preferably onclick. Here is a working (for the most part - still some kinks to work out) example. If you look under the "Tropical" section at the "Satellite-Derived Products", you'll see what I mean. http://www.suncoaststormwatch.com/Beta/Index.html
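
    For what it's worth, a plugin-free sketch of click-to-enlarge that tends to survive jQuery Mobile's AJAX page handling, because the handler is delegated from document rather than bound to images that may be re-rendered; the data-full attribute is an assumed convention for storing the full-size URL:

        <script>
        // Markup assumption: <img class="enlargeable" src="small.jpg" data-full="large.jpg">
        $(document).on('click', 'img.enlargeable', function () {
            var overlay = $('<div>').css({
                position: 'fixed', top: 0, left: 0, right: 0, bottom: 0,
                background: 'rgba(0,0,0,0.8)', textAlign: 'center', zIndex: 9999
            });
            $('<img>')
                .attr('src', $(this).data('full'))
                .css({ maxWidth: '95%', maxHeight: '95%', marginTop: '2%' })
                .appendTo(overlay);
            overlay.on('click', function () { overlay.remove(); }).appendTo('body');
        });
        </script>

    Delegation via $(document).on(...) needs jQuery 1.7 or newer; with an older jQuery, .delegate() is the equivalent.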

  • What is the best way to manage all images in a big project: inline images, background images, CSS sprite images?

    - by metal-gear-solid
    How do you manage all the images in a big project: inline images, background images, CSS sprite images? Do you follow any naming convention? Do you create sub-folders to manage images? In a big project, how do you make it easy for new people on the development team to find out whether an image they want to use (because it's in a new PSD they received from the designer) is already available in the project's images folder, and how can they find it easily?
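
    As one illustration of the kind of convention that helps here (entirely a sketch, not a standard), grouping by role and encoding role-size-name in the file name keeps the folder greppable, so a developer can search for "icon" and "search" before adding a duplicate:

        images/
            sprites/        one sheet per feature      nav-sprite.png
            backgrounds/    page/section backdrops     bg-home-header.jpg
            icons/          standalone icons           icon-16-search.png
            content/        inline editorial images    content-team-photo.jpg

    Paired with a single index page (even a plain HTML page that displays every image alongside its path), new team members can check visually whether an asset from a PSD already exists.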

  • Using a CDN for CMS software (multiple sites)

    - by SmokeyPHP
    I'm currently researching ideas for the media management side of a CMS I'm writing. I was looking at having images served from a CDN, which is fine on a single site, but I want all sites that run the CMS to make use of a CDN (which will most likely be a custom-developed one, rather than a third-party service like S3). My main question is: is a multi-site CDN a good idea? I can't think of a downside, but have probably missed something - obviously the sites won't share the same folder, as I envisage the requests to be css.cdnsite.com/example.com/style.css or something along those lines. Having multiple sites in the same place will obviously make it easier for us to manage, as well as being cheaper, but then I wonder if it'll be worth it... Long story short: how should the CMS handle user-uploaded media (separate installations)?
    1. Just keep a local copy of all assets and serve them from the same site, like in days of yore?
    2. Keep a local copy, force the site to use www. and have CDN subdomains per site?
    3. Or use a single separate CDN for all sites?
    Apologies for the length of this question; not sure if this should be multiple questions or not, as all parts are kind of related and could affect each other.

  • WordPress theme

    - by HotPizzaBox
    I'm trying to create a basic WordPress theme. As far as I know, the basic files I need are style.css, header.php, index.php, footer.php and functions.php. Then it should show a blank site with some meta tags in the header. These are my files:

    functions.php

        <?php
        // load the language files
        load_theme_textdomain('brianroyfoundation', get_template_directory() . '/languages');

        // add menu support
        add_theme_support('menus');
        register_nav_menus(array('primary_navigation' => __('Primary Navigation', 'BrianRoyFoundation')));

        // create widget areas: sidebar
        $sidebars = array('Sidebar');
        foreach ($sidebars as $sidebar) {
            register_sidebar(array('name'=> $sidebar, 'before_widget' => '<div class="widget %2$s">', 'after_widget' => '</div>', 'before_title' => '<h6><strong>', 'after_title' => '</strong></h6>' ));
        }

        // Add Foundation 'active' class for the current menu item
        function active_nav_class($classes, $item) {
            if($item->current == 1) {
                $classes[] = 'active';
            }
            return $classes;
        }
        add_filter( 'nav_menu_css_class', 'active_nav_class', 10, 2);
        ?>

    header.php

        <!DOCTYPE html>
        <html <?php language_attributes(); ?>>
        <head>
            <meta charset="<?php bloginfo('charset'); ?>" />
            <meta name="description" content="<?php bloginfo('description'); ?>">
            <meta name="google-site-verification" content="">
            <meta name="author" content="Your Name Here">
            <!-- No indexing if Search page is displayed -->
            <?php if(is_search()){ echo '<meta name="robots" content="noindex, nofollow" />' } ?>
            <title><?php wp_title('|', true, 'right'); bloginfo('name'); ?></title>
            <link rel="stylesheet" type="text/css" href="<?php bloginfo('stylesheet_url'); ?>" />
            <?php wp_head(); ?>
        </head>
        <body>
            <div id="page">
                <div id="page-header">
                    <div id="page-title">
                        <a href="<?php bloginfo('url'); ?>" title="<?php bloginfo('name'); ?>"><?php bloginfo('name'); ?></a>
                    </div>
                    <div id="page-navigation">
                        <?php wp_nav_menu( array( 'theme_location' => 'primary_navigation', 'container' =>false, 'menu_class' => '' ); ?>
                    </div>
                </div>
                <div id="page-content">

    index.php

        <?php get_header(); ?>
        <div class="page-blog">
            <?php get_template_part('loop', 'index'); ?>
        </div>
        <div class="page-sidebar">
            <?php get_sidebar(); ?>
        </div>
        <?php get_footer(); ?>

    footer.php

                </div>
                <div id="page-footer">
                    &copy; 2008 - <?php echo date('Y'); ?> All rights reserved.
                </div>
            </div>
            <?php wp_footer(); ?>
        </body>
        </html>

    I activated the theme in WordPress, but it just shows nothing - not even when I view the page source. Can anyone help?
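
    A blank page from a brand-new theme is usually a PHP parse error (with display_errors off, nothing at all is rendered, which matches the empty page source). The header.php above contains two such errors: the echo inside the is_search() check is missing its semicolon, and the wp_nav_menu() call never closes the parenthesis of its array() argument. Corrected, those two lines would read:

        <?php if (is_search()) { echo '<meta name="robots" content="noindex, nofollow" />'; } ?>

        <?php wp_nav_menu(array('theme_location' => 'primary_navigation', 'container' => false, 'menu_class' => '')); ?>

    Setting define('WP_DEBUG', true); in wp-config.php makes errors like these visible instead of producing a white screen.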

  • jQuery calculator

    - by Nemanja
    I want to make a calculator for summation. I have this jQuery code:

        <script type="text/javascript">
        $(document).ready(function() {
            $("#calculate").click(function() {
                if ($("#input").val() != '' && $("#input").val() != undefined) {
                    $("#result").html("Result is: " + parseInt($("#input").val()) * 30);
                } else {
                    $("#result").html("Please enter some value");
                }
            });
        });
        </script>
        <div id="calculator">
            <br />
            <input type="text" style="left:20px;" id="input" />
            <div id="result"></div>
            <input type="button" id="calculate" value="calculate" style="left:20px;" />
        </div>

    I want something like this: I type the number 5, then enter + from the keyboard and type another value. Then I click the summation button and get the result. Can anyone help me with this? Thank you very much!
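
    A minimal sketch of that behavior, reusing the same #input, #result and #calculate elements from above: split the typed expression on "+" and add up the parts (only addition is handled; whitespace around the numbers is tolerated by parseFloat):

        <script type="text/javascript">
        $(document).ready(function() {
            $("#calculate").click(function() {
                var parts = $("#input").val().split("+");
                var sum = 0, valid = parts.length > 0;
                $.each(parts, function(i, p) {
                    var n = parseFloat(p);
                    if (isNaN(n)) { valid = false; } else { sum += n; }
                });
                $("#result").html(valid ? "Result is: " + sum : "Please enter a value like 5 + 3");
            });
        });
        </script>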

  • Can web applications running on IIS7 / Windows Server 2008 R2 be forced to immediately detect changes to the hosts file?

    - by Brenda Bell
    We have several web applications running on several load-balanced servers, and we want our web applications to communicate with each other without first traversing outside the load balancer. For example:
    http://appA.example.com is running on 192.0.2.1 and 192.0.2.2
    http://appB.example.com is also running on 192.0.2.1 and 192.0.2.2
    The load balancer's public IP address is 198.51.100.3
    By default, when appA on 192.0.2.1 makes a call to a WCF service hosted in appB, the HTTP request is routed to 198.51.100.3; this establishes a new session and the load balancer will direct the call to either of the two servers. We want the call to be routed to the instance of appB running on the same server, so we add "192.0.2.1 appB.example.com" to the hosts file on 192.0.2.1. This eventually works, but we either have to wait for the app pool to naturally recycle or do a manual reset before appA sees the new address. Is there any way to have the change automatically detected without having to recycle the app pool?

  • Optimize SEO: 2 websites or 1 main website and subdomain? [duplicate]

    - by waanders
    This question already has an answer here: Subdomain versus subdirectory. I'm working on a WordPress website for a little company, let's say www.xxx.com. Now they want a different website for their workshops. What is the most optimal construction, thinking of SEO?
    1) www.xxx.com + www.xxx-workshops.com
    2) www.xxx.com + www.xxx.com/workshops
    3) www.xxx.com + workshops.xxx.com

  • Adding your website to free web directories as a link building strategy

    - by Man
    It's been two months since I launched a website. I recently ran into some websites which list directories of other websites. Some examples are http://www.addgoogleurl.com/ and webdirectorieslist.com, etc. I was talking with my colleague and he says adding the URL of my website to these kinds of websites will negate the effect of other organic, real links. Does Google consider positive/negative points for these kinds of links from web directory websites? Do you have any source to refer to for your answer? I found this question asked before on webmasters.SE, but it is asking about many links from one website.

  • Using AWS or Azure, what to do about emails?

    - by Paul
    I'm coming from a background of paying a hosting company X amount per month for a server. This server comes with IIS, WebsitePanel and SmarterMail all bundled together. When I create a new domain using WebsitePanel, it automatically creates my email account; all I then need to do is configure my DNS to point to the server. I've decided that it is more cost-efficient to move to AWS / Azure. Has anyone come from a similar background and moved onto a cloud system? I'd be interested to know what you did regarding emails. So far, these are the suggestions I've seen:
    1. Use Google Apps for each domain
    2. Use something like Elastic Email to send out emails
    3. Launch a new instance and host an email server on that
    The first option seems like quite a lot of manual configuration, and the second one works well for outgoing emails, but what about receiving? Option 3 would make it less cost-effective. What is your experience?

  • How to deploy an ASP.NET application with an MS SQL Server database

    - by Maddy
    I want to deploy my website with its MS SQL Server database. It's my first time; I have never done it before. What I have come to know from my googling is that I must have a domain (.com/.net/.co) and a host (for my web pages, .aspx & .cs; my confusion here is whether I can also deploy my database there). What I am not getting is where I have to deploy my database. Do I have to buy a separate SQL Server database, or a host providing everything (meaning I can deploy both my ASP.NET application and the database)?
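
    Most shared ASP.NET hosts bundle one or more SQL Server databases with the plan, so the site and the database live with the same host, and the application reaches the database through a connection string in web.config. A sketch with placeholder values (the host's control panel supplies the real server name and credentials):

        <connectionStrings>
          <add name="DefaultConnection"
               connectionString="Server=sql.myhost.example;Database=mysitedb;User Id=mysiteuser;Password=secret;"
               providerName="System.Data.SqlClient" />
        </connectionStrings>

    The database itself is usually created through the host's panel and populated by running a SQL script or by connecting SQL Server Management Studio to the remote server.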

  • What is the good side of PageRank?

    - by SharkTheDark
    I am doing research about backlinks / PR / SEO / search result position, and all I read is that PageRank is not important - that it was worth something before, but now it's not important at all. The only useful thing I found about it is that it can change search result position, but ONLY if there are two sites with the same keywords and the same text content value; then the search engine will check which site has the higher PR and place that site above the lower one. Google counts PR importance as 20% for displaying search rankings, and Yahoo! as something like 3%... Correct me if I am wrong... Is there any other good thing about it?

  • Hosting and domain registrations for multiple clients under a single hosting account of mine?

    - by letseatfood
    I am finally getting regular work designing, developing, and deploying websites for small businesses and individuals. So far the websites utilize single-user content management systems, so the websites create, as far as I know, minimal load on the shared servers. I have always required that each of my clients purchase annual shared hosting at Dreamhost. For domain registration, I ask that they register with Dreamhost, but some already have a domain registered elsewhere, and this is fine with me. I do this so the billing issues are the client's responsibility, not mine. My question is: since I can register unlimited domains and connect them to my one shared hosting account at Dreamhost, should I not be requiring clients to individually pay for shared hosting and a domain? Should I actually be paying for one hosting account and then hosting all of my clients' websites on that account? As I said before, I currently have each client buy their own hosting, because I feel that, for example, if there is high traffic to their site, there would be less of a chance of the site going down than if their site was hosted with many others on one account. I am famous for being long-winded; please let me know if I can clarify at all. Thanks!
