Search Results

Search found 28121 results on 1125 pages for 'microsoft surface pro'.

Page 396/1125

  • SEO in relation to web hosting [closed]

    - by jimmy obonyo
    Possible Duplicate: Does changing web hosting server affects SEO page ranking? I have two websites. One of the sites, despite vigorous attempts to optimize it for certain Google keywords or even for the site name, still performs poorly, while the other site actually performs better and better. The two sites are hosted by different hosting companies: one by bytehost.net, the other by youhosting.com. So here is my question: does anyone know whether there is any relation between the hosting company and indexing, and if there is, how do I choose a good company to get better SEO indexing and ranking?

    Read the article

  • Google analytics and 301 redirects

    - by Ilian Iliev
    We have a multi-language website and the first page redirects to the specific language page using a 301 redirect based on some logic. For example, http://mysite.com/ redirects to http://mysite.com/en/. The problem is that these redirects destroy the primary request, so we do not get correct results for traffic sources in GA. How do you handle this case? Is there something that we can do? Any ideas will be appreciated.
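
    For illustration, a minimal .htaccess sketch of the redirect described above, assuming Apache with mod_rewrite (the paths are the ones from the example; note that mod_rewrite carries the original query string over by default, so tagged campaign URLs survive the hop):

        RewriteEngine On
        # Permanently redirect only the site root to the English version.
        # Any existing query string (e.g. ?utm_source=...) is preserved automatically.
        RewriteRule ^$ /en/ [R=301,L]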

    Read the article

  • How to structure my AdWords campaign for testing and different groups of keywords?

    - by Romain Dorange
    I am starting an AdWords campaign and I will measure conversion rates using the AdWords conversion tracking pixel. A conversion might be an account creation or a concrete sale. As it will be a test campaign to get some insight into CTR, CR, etc. for the future, I am likely to try several configurations: two different ads with different landing URLs and messages (one focused on the product, the other containing a discount embedded in the URL), and 4 different groups or themes of keywords. I guess I have to build 4 ad groups based on the keywords, create the 2 ads with the different messages, assign the two ads to each ad group, and follow the campaign closely in the Ads tab, where I can see the effectiveness of each ad per ad group (for a total of 8 lines of reporting). Also, what are the key performance indicators that I can get from an AdWords campaign to measure overall effectiveness? Return on investment from concrete sales (tracking pixel with an e-commerce tag on the confirmation page), return on investment from lead acquisition (tracking pixel on account creation), and the traffic increase from the campaign.
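
    For reference, a rough sketch of the standard definitions behind those indicators (assuming revenue comes from the e-commerce tracking and cost is the AdWords spend):

        CTR  = clicks / impressions
        CR   = conversions / clicks
        CPA  = cost / conversions
        ROAS = revenue / cost
        ROI  = (revenue - cost) / cost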

    Read the article

  • High Traffic Web Host Solution? [duplicate]

    - by Calsy
    Possible Duplicate: How to find web hosting that meets my requirements? I'm currently shopping around for a web host for a website we are hoping to release in the near future. This is my first real step into this area, so I'm just wondering what I should be looking for. It is an ASP.NET MVC website with an MS SQL Server backend, and I need to know that the server will not buckle if traffic booms. Currently I'm looking at a managed dedicated server from SingleHop. Does anyone know any better options or have any advice?

    Read the article

  • Tumblr custom domain not redirecting properly

    - by Manic
    I decided to host my blog at Tumblr, using their custom domain setup (http://blog.smokingfishgames.com/ instead of http://smokingfishgames.tumblr.com). However, it's been 72 hours and I'm still getting spotty redirection. It works some of the time--I go and see the page and blog, and it's all fine. However, it occasionally just stops working and redirects back to my web host, which is a directory with nothing but a single file called BUGGER.html (which I stuck in to make sure that it was my web host and not some Tumblr empty directory). Clearing the Chrome DNS cache makes the problem go away--for a while. After a few minutes, or an hour, or however long, I'll start seeing BUGGER.html again. I clear the cache, and poof, the blog shows up. The thing that's curious to me is that when I clear the cache and get BUGGER.html again (which happens occasionally), I can look at my Chrome DNS cache and see:

        assets.tumblr.com             UNSPECIFIED
        blog.smokingfishgames.com     UNSPECIFIED
        www.tumblr.com                UNSPECIFIED

    IP addresses and expiration times omitted for brevity's sake--if they're important I'm sure I can replicate the issue. This implies, to me anyway, that my browser is reaching Tumblr but getting bounced back to my web host. Any reason why this would be happening, or is this a normal symptom of DNS propagation? If it is a problem, should I be bothering Tumblr or my host with it, or is this something I can fix myself?

    Read the article

  • Parallels Plesk 11 missing Web Presence Builder

    - by NRGdallas
    After a recent upgrade to Parallels Plesk 11, we decided to start using the Web Presence Builder tool. However, every video, piece of documentation and tutorial I view shows that the link should be under Websites & Domains, or even on the home page. It is in neither location. I have verified that it is both installed and up to date under Server > Updates and Upgrades. Any idea how I can access Web Presence Builder?

    Read the article

  • How to set which version of the VC++ runtime Visual Studio 2005 targets

    - by TallGuy
    I have an application that contains a VC++ project (along with C# projects). Previously (i.e. during the last year or so), when a build was done, Visual Studio 2005 appeared to be targeting the VC++ runtime version 8.0.50727.762. At least, that is what the Assembly.dll.intermediate.manifest file is telling me:

        <?xml version='1.0' encoding='UTF-8' standalone='yes'?>
        <assembly xmlns='urn:schemas-microsoft-com:asm.v1' manifestVersion='1.0'>
          <dependency>
            <dependentAssembly>
              <assemblyIdentity type='win32' name='Microsoft.VC80.CRT' version='8.0.50727.762' processorArchitecture='x86' publicKeyToken='1fc8b3b9a1e18e3b' />
            </dependentAssembly>
          </dependency>
        </assembly>

    This version number matches the Visual Studio 2005 version number. The application worked fine when deployed to the webserver. The sun was shining, the birds were singing and all was right with the world. Now something has changed. I don't know what - a security patch, an obscure Visual Studio setting or something. Now Visual Studio 2005 seems to be targeting the wrong version of the VC++ runtime:

        <?xml version='1.0' encoding='UTF-8' standalone='yes'?>
        <assembly xmlns='urn:schemas-microsoft-com:asm.v1' manifestVersion='1.0'>
          <dependency>
            <dependentAssembly>
              <assemblyIdentity type='win32' name='Microsoft.VC80.CRT' version='8.0.50727.4053' processorArchitecture='x86' publicKeyToken='1fc8b3b9a1e18e3b' />
            </dependentAssembly>
          </dependency>
        </assembly>

    When I deploy the application to the webserver, I get the dreaded "This application has failed to start because the application configuration is incorrect. Reinstalling the application may fix this problem. (Exception from HRESULT: 0x800736B1)" error. This problem occurs even when I recompile previous versions of the application. I can absolutely guarantee that nothing at all has changed in the solution - we zip up the entire contents of the solution as part of the build process and archive it. I have unzipped a number of these to a temp directory, verified that the previous manifest file refers to 8.0.50727.762, recompiled using exactly the same command at the command line, and then verified that the new manifest file now refers to 8.0.50727.4053. I am using Microsoft Visual Studio 2005 Version 8.0.50727.762 (SP.050727-7600) and Microsoft Visual C++ 2005 77646-008-0000007-41610. Why would Visual Studio switch to a different version of the VC++ runtime? How do I specify which version it should use? What is going wrong here?
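
    One hedged lead worth checking, not a confirmed diagnosis: with VS2005 SP1-era CRT headers, the version written into the auto-generated manifest comes from _CRT_ASSEMBLY_VERSION in the installed headers, and the _BIND_TO_CURRENT_* macros influence whether a project binds to the latest installed CRT or to the baseline one, so a Visual Studio security update that refreshes those headers can change the emitted version without any change to the solution. A minimal sketch of where the knob sits (macro behaviour should be verified against the headers actually installed; the deployment-side fix is usually installing the matching VC++ 2005 redistributable on the webserver):

        /* Sketch only -- assumes VS2005 SP1-style CRT headers; verify against crtassem.h. */
        #define _BIND_TO_CURRENT_VCLIBS_VERSION 0   /* 1 = bind to the latest installed CRT version */
        #include <stdio.h>                          /* any CRT header pulls the manifest pragma in   */

        int main(void)
        {
            /* After building, inspect the generated .manifest to see which
               Microsoft.VC80.CRT version the dependency was emitted for. */
            return 0;
        }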

    Read the article

  • SEO on an existing platform

    - by Simon
    I have been given the task of increasing user visits and conversions for a recruitment website. Conversions would be interested job seekers submitting their CVs. The manager would first like to improve the organic search results and optimize the website before starting targeted campaigns. The problem is that they are using a proprietary recruitment software platform which I can barely make changes to. For example, the URLs all look like dynamic URLs without any semantic meaning, and the markup is almost completely built automatically by that platform. I'm also confident that the lack of submitted CVs is due to a bad user experience on the website (no incentives or clear CTA to register). Besides optimizing the static texts and page titles, is there anything I can do? Thanks.

    Read the article

  • Why are subdomains of Blogspot/WordPress-like sites treated as different domains or sites?

    - by Thedijje
    As I understand it, maps.google.com and mail.google.com both come under the same domain; they are all subdomains. The entire web treats these subdomains as part of the main domain, and they share the same Alexa rank, PageRank and so on. On the other hand, take a look at blogspot.com, wordpress.com or webs.com; each is a single site, yet the blogs or websites under those domains are treated as different sites. Each gets its own URL, and all have different PageRank and Alexa rank as well. There are millions of subdomains under those few domains, with almost the same IP addresses, hosting and CMS, so why are they still treated as different domains?

    Read the article

  • Using only password to authenticate user (no "username" field)

    - by Guy
    I am creating a client access system to allow clients to manage invoices, make payments, access information about their products, and use similar information/functionality. There are expected to be fewer than 1000 clients. Would there be any security threat in using only a password (a UUID v4 string) to authenticate a user? My thoughts: There is virtually no probability of collision or success with a brute-force attack (http://en.wikipedia.org/wiki/UUID#Random%5FUUID%5Fprobability%5Fof%5Fduplicates). It is user friendly (one click and go). It is not intended to be remembered.
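
    As a back-of-the-envelope check on the brute-force point (assuming the UUIDs come from a cryptographically secure random generator):

        A version 4 UUID has 122 random bits, so there are 2^122 (about 5.3 x 10^36) possible values.
        P(at least one hit) ~ (guesses x number of accounts) / 2^122
        e.g. 10^12 guesses against 1,000 accounts: ~ 10^15 / (5.3 x 10^36), i.e. roughly 2 x 10^-22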

    Read the article

  • Is this Anti-Scraping technique viable with Crawl-Delay?

    - by skibulk
    I want to prevent web scrapers from abusing the 1,000,000 pages on my website. I'd like to do this by returning a "503 Service Unavailable" error code to users that access an abnormal number of pages per minute. I don't want search engine spiders to ever receive the error. My inclination is to set a robots.txt crawl-delay that will ensure spiders access a number of pages per minute that stays under my 503 threshold. Is this an appropriate solution? Do all major search engines support the directive? Could it negatively affect SEO? Are there any other solutions or recommendations?
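
    For reference, the crawl-delay approach looks roughly like this (the value is hypothetical and would need tuning to the 503 threshold; support varies - Bing and Yandex honour Crawl-delay, while Googlebot ignores it and has its crawl rate set in Webmaster Tools instead):

        User-agent: *
        # 10 seconds between requests keeps a compliant bot at about 6 pages per minute.
        Crawl-delay: 10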

    Read the article

  • How to copy or replicate a complex website to local files and then modify it

    - by Andre Chenier
    I am not good at designing the visual side of a website. I found a website which I would rate 10 out of 10, because its functionality suits my aims and it also looks very aesthetically pleasing. I know HTML, PHP, MySQL and some degree of CSS; I don't know JS, Ajax or jQuery. So I want to replicate this website (save it completely) locally and then modify it (content, colors, icons, etc.). I saved the website in Chrome and IE, but after opening the site from my local folder, I saw an ugly, non-working site. My aim is to understand the functions of the parts that I don't know - for example, what will happen as the result of deleting a JS file from one of its pages. Since the page is complex, it has lots of CSS and JS files to download, and I don't want to deal with it manually. Is there an alternative and easy way to get the web page completely onto my local machine so that it also works like a charm locally? Regards.
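
    One way to avoid doing it manually is a mirroring tool; a sketch of a typical wget invocation (the URL is a placeholder, and note that pages which assemble their markup with JS/Ajax at runtime will still look broken in a static mirror):

        wget --mirror --convert-links --page-requisites --adjust-extension \
             --no-parent http://www.example.com/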

    Read the article

  • Two different websites in one remote hosting

    - by Kor
    My client asked me that a website hosted on one server (with a domain pointing there) should also be accessible, through a specific directory, from another domain that does not point there. For example: http://www.foo.com, hosted at GoDaddy, holds the full website; http://www.bar.com, hosted at Bluehost, needs to serve http://www.foo.com/bar as if it were http://www.bar.com's root. So, if anybody enters through http://www.bar.com, it should internally load http://www.foo.com/bar, without visually changing the URL. I am not sure if this is possible using .htaccess or anything like that. Could anybody shed some light on this? Thanks in advance.
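
    If the Bluehost account allows it, the usual approach is a reverse-proxy style rewrite in the .htaccess of the bar.com document root; a rough sketch, assuming Apache with mod_rewrite and mod_proxy enabled (many shared hosts disable mod_proxy, in which case this will not work):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?bar\.com$ [NC]
        # Serve the content of foo.com/bar while keeping bar.com in the address bar.
        RewriteRule ^(.*)$ http://www.foo.com/bar/$1 [P,L]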

    Read the article

  • Do search engines crawl PDFs and if so are there any rules to follow when making them

    - by RandomBen
    The website I am working on has a few hundred PDFs on it. I don't think I have ever seen any of them come back in a search, but they are linked to directly from our site. They are also full of keywords because they are product documents. Is there anything special we need to do to get Google or other search engines to crawl them? Are there any hard and fast rules for making PDFs that help Google like them more? For instance, should I run them through Ghostscript to clean up the broken PDF tags that Adobe creates during generation?

    Read the article

  • How to prevent a 404 error when installing a subdomain using a WordPress multi-site installation

    - by Chris
    I have installed a multi-site installation of WordPress on my domain. I then added the necessary code to the wp-config.php file and .htaccess as instructed by WordPress. I also installed a plugin called Quick Page/Post Redirect Plugin, which allowed me to place a 301 redirect onto the main domain, as I only want to use the subdomain and not the main domain. Then I also added the following line of code to the wp-config.php file to redirect the main domain:

        define( 'NOBLOGREDIRECT', 'URL Redirect Address' );

    The site works fine with a redirect on the main domain, and my subdomain runs fine when you type in subdomain.domain.com or http://subdomain.domain.com. However, when I enter www.subdomain.domain.com or http://www.subdomain.domain.com, the following error message is returned:

        Not Found
        The requested URL / was not found on this server.
        Apache/2.4.9 (Unix) Server at www.subdomain.domain.com Port 80

    Any help with this would be much appreciated.
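
    One common way to handle the www-prefixed form is to bounce it to the bare subdomain that WordPress knows about; a sketch for the site's .htaccess, assuming the DNS zone and the Apache vhost (e.g. via a ServerAlias) already accept the www.subdomain host:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^www\.(subdomain\.domain\.com)$ [NC]
        RewriteRule ^(.*)$ http://%1/$1 [R=301,L]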

    Read the article

  • How long does it take to delete a Google Apps for Domain account and allow recreation?

    - by Wil
    Basically, I set up a Google Apps for domain account for one of my clients, but upon trying to upload a CSV of users, only half were created and then I kept getting timeouts, 404s and other errors which I never saw when setting up another client. I was not sure how, but I thought I might have caused an error, or that there was an error on Google's side when the account was created, so I thought it might be best to delete the domain and start from scratch. Little did I know you have to wait before recreating the domain! As far as I can tell from what I have read, there is a five-day wait to delete and allow recreation of similar user names; however, I can't see anything that says how long you have to wait for the domain account itself. It has now been 5/6 days and I still can't recreate it. The cancellation email I got says "...If at any time you would like to sign-up for Google Apps for this or another domain, you can do so by visiting..." but it does not mention how long! I tried emailing Google directly but got no response. Does anyone know?

    Read the article

  • SEO disasters moving domain for a high traffic website?

    - by chrism2671
    We're looking at moving our website from http://www.wikijob.co.uk to http://www.wikijob.com/uk as we spread our wings internationally. Our .co.uk website has a PR6 and receives around half a million visitors a month, 40% of them international. The wikijob.com domain, while registered for a while, has not been used nor promoted. I am concerned that moving domain could really haemorrhage our traffic and result in a loss of goodwill from Google, even if we use a 301; but equally, if we could transfer that PageRank to the .com domain, it would give us a massive head start around the world. Should we do it, or should we start over with .com and leave .co.uk as is?
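
    If the move goes ahead, the path-preserving redirect from the old domain usually looks something like this (a sketch assuming Apache on the .co.uk side; Webmaster Tools' Change of Address feature covers the same scenario on Google's end):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?wikijob\.co\.uk$ [NC]
        # Send every old URL to its equivalent under the new /uk/ section, permanently.
        RewriteRule ^(.*)$ http://www.wikijob.com/uk/$1 [R=301,L]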

    Read the article

  • Google Analytics is counting way too much

    - by Luticka
    I have a website using Google Analytics, but it is counting way too much. To test this, I logged every entry to my database with time and IP address. My results for one day were:

        Google Analytics:  Visits: 4078   Absolute Unique Visitors: 3758
        My database:       Visits: 4182   Unique visitors (by IP only): 905

    I use the tracking option "One domain with multiple subdomains" because the website is accessible both at www.example.com and example.com. Am I missing something, or what could be wrong?

    Read the article

  • Is it costly to leave the Console and Script features enabled in Firebug?

    - by parisminton
    For some time now, I've run Firebug constantly enabled to do quick DOM inspections, leaving the Console and Script panels disabled. I'm just starting to use these two features so I don't have to keep using alerts for testing and debugging. I enable them while I use them and turn them back off when I'm done. I'd like to know if these particular features can slow things down such that they shouldn't be left on round-the-clock. Like do they slow down page loads, use inordinate chunks of memory or something? I don't see anything about it in the Firebug wiki.

    Read the article

  • Detecting dead proxies

    - by Afnan
    Is it possible to detect which proxies are active and which are dead? Using C# and a combo box containing a list of proxies with port numbers, is there any way to take each proxy one by one and determine whether it is dead or active?

        Microsoft.Win32.RegistryKey registry = Microsoft.Win32.Registry.CurrentUser.OpenSubKey(
            "Software\\Microsoft\\Windows\\CurrentVersion\\Internet Settings", true);
        registry.SetValue("ProxyEnable", 1);
        registry.SetValue("ProxyServer", comboBox1.Text);
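
    A minimal sketch of a per-proxy liveness check, assuming the .NET Framework's HttpWebRequest and WebProxy classes (the test URL and timeout are placeholders; a proxy that errors or times out here is treated as dead). Looping over the combo box entries and calling IsAlive for each one would then separate the live proxies from the dead ones:

        using System;
        using System.Net;

        static class ProxyChecker
        {
            // Returns true if an HTTP request routed through host:port succeeds within timeoutMs.
            public static bool IsAlive(string host, int port, int timeoutMs)
            {
                try
                {
                    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://www.example.com/");
                    request.Proxy = new WebProxy(host, port);
                    request.Timeout = timeoutMs;
                    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
                    {
                        return response.StatusCode == HttpStatusCode.OK;
                    }
                }
                catch (WebException)
                {
                    return false; // unreachable, refused, or timed out
                }
            }
        }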

    Read the article

  • Why is Google still not indexing my #! website?

    - by Zubair
    I have been working on a website which uses #! URLs (2minutecv.com), but even after 6 weeks of the site being up and running and conforming to the Google hash-bang guidelines stated here, you can see that Google still hasn't indexed the site. For example, if you use Google to search for "2MinuteCV.com benefits", it does not find this page, which is referenced from the homepage. Can anyone tell me why Google isn't indexing this website? Update: Thanks for all the help with this answer. So, just to make sure I understand what is wrong: according to the answers, Google never actually indexes the pages after the JavaScript has run. I need to create a "shadow site" which Google indexes (what Google calls HTML snapshots). If I am right in thinking this, then I can pick a winner for the bounty.
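
    For context, the mapping that Google's AJAX-crawling guidelines expect looks roughly like this (the /benefits path is only an illustrative example of one of the site's #! pages) - the server has to answer the _escaped_fragment_ form with a pre-rendered HTML snapshot of what the JavaScript would otherwise build:

        Pretty URL (what users see):   http://2minutecv.com/#!/benefits
        What the crawler requests:     http://2minutecv.com/?_escaped_fragment_=/benefits
        Expected response:             a static HTML snapshot of the fully rendered page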

    Read the article

  • How to insert sharing links in a blog so visitors can post to Twitter, Facebook and other social networks?

    - by Andry
    I am developing a web blog using ASP.NET, though I guess technical details like that are not important here. My aim is to insert, in every post I create, those nice buttons that let visitors quote or post a link to the blog entry on their own social network accounts. How can I do this? I guess it also depends on the social network I want to use. Let's say, for now, that I want to have links for Facebook, Twitter and Google+ circles. Thank you.
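
    A bare-bones sketch using the networks' public share endpoints (the post URL and title are placeholders and should be URL-encoded per entry; each network also offers an official button widget that pulls in its own script):

        <!-- Replace the encoded http://example.com/my-post URL and the title with the blog entry's own values. -->
        <a href="https://www.facebook.com/sharer/sharer.php?u=http%3A%2F%2Fexample.com%2Fmy-post">Share on Facebook</a>
        <a href="https://twitter.com/intent/tweet?url=http%3A%2F%2Fexample.com%2Fmy-post&amp;text=My%20post%20title">Tweet</a>
        <a href="https://plus.google.com/share?url=http%3A%2F%2Fexample.com%2Fmy-post">Share on Google+</a>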

    Read the article

  • PHP code works on Chrome, but not on Firefox or IE (send email via HTML form) [on hold]

    - by Cachirro
    My brother has this form:

        <form id="lista" action="lista2.php" method="post">
          <input name="cf_name" type="text" size="50" hidden="yes" class="obscure">
          <input name="cf_email" type="text" size="50" hidden="yes" class="obscure">
          <textarea name="cf_message" cols="45" rows="10" hidden="yes" class="obscure"></textarea>
          <input type="image" name="submit" value="Enviar Lista por Email" src="imagens/lista_email.png" width="40" height="40"
                 onclick="this.form.elements['cf_message'].value = lista_mail;this.form.elements['cf_name'].value = prompt('Escreva o seu nome:', '');this.form.elements['cf_email'].value = prompt('Escreva o seu email:', '');">
          <input name="submit2" type="submit" value="Enviar" hidden="yes" class="obscure">
        </form>

    That calls this PHP file:

        <?php
        if ( isset($_POST['submit']) ) {
            // SMTP authentication data
            $smtpinfo['host'] = 'localhost';
            $smtpinfo['port'] = '25';
            $smtpinfo['auth'] = true;
            $smtpinfo['username'] = 'xxx';
            $smtpinfo['password'] = 'xxx';
            // Data received from the form
            $nome = $_POST['cf_name'];
            $email = $_POST['cf_email'];
            $mensagem = $_POST['cf_message'];
            // PEAR include. Make sure PEAR is enabled on your hosting account
            require_once "Mail.php";
            // Message body
            $body  = "Nome: " . $nome;
            $body .= "\n\n";
            $body .= nl2br($mensagem);
            $headers = array(
                'From'    => $email,
                'To'      => $smtpinfo["username"],
                'Subject' => 'Encomenda Website'
            );
            $mail_object = Mail::factory('smtp', $smtpinfo);
            $mail = $mail_object->send($smtpinfo["username"], $headers, $body);
            if ( PEAR::isError($mail) ) {
                echo ("<p>" . $mail->getMessage() . "</p>");
            } else {
                echo ('<b><font color="FFFF00">Mensagem enviada com sucesso.<br><br></b>Seu email: ' . $email . '<br><br></font>');
            }
        }
        ?>

    This basically sends an email with some selected products, plus the visitor's name and email address. The problem is that it works perfectly in Chrome, but not in Firefox or IE. When the submit image is pressed, the URL changes to the PHP file, but it displays a blank page. After enabling error display with:

        ini_set('display_errors', 1);
        ini_set('display_startup_errors', 1);
        error_reporting(-1);

    Firefox and IE still display a blank page and the email isn't sent, while Chrome sends the email and displays this (I don't know if it helps):

        Strict Standards: Non-static method Mail::factory() should not be called statically in /var/www/vhosts/[site url]/httpdocs/lista2.php on line 33
        Strict Standards: Non-static method PEAR::isError() should not be called statically, assuming $this from incompatible context in /usr/share/php/Mail/smtp.php

    So, what is causing the email to be sent in Chrome but not in Firefox or IE? Thank you.

    Read the article

  • Apache virtual hosts - Resources on website not loaded when accessed from other hostname than localhost

    - by Christian Stadegaart
    Running virtual hosts on Mac OS X 10.6.8 with Apache 2.2.22. /etc/hosts is as follows:

        127.0.0.1       localhost 3dweergave studio-12.fritz.box
        255.255.255.255 broadcasthost
        ::1             localhost
        fe80::1%lo0     localhost

    Virtual hosts configuration:

        NameVirtualHost *:80

        <VirtualHost *:80>
            DocumentRoot "/opt/local/www/3dweergave"
            ServerName 3dweergave
            ErrorLog "logs/3dweergave-error_log"
            CustomLog "logs/3dweergave-access_log" common
            <Directory "/opt/local/www/3dweergave">
                Options Indexes FollowSymLinks
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>

        <VirtualHost *:80>
            ServerName main
        </VirtualHost>

    This will output the following settings:

        *:80 is a NameVirtualHost
        default server 3dweergave (/opt/local/apache2/conf/extra/httpd-vhosts.conf:21)
        port 80 namevhost 3dweergave (/opt/local/apache2/conf/extra/httpd-vhosts.conf:21)
        port 80 namevhost main (/opt/local/apache2/conf/extra/httpd-vhosts.conf:34)

    I made 3dweergave the default server by putting it first in the list. This will cause all undefined virtual hosts' names to load 3dweergave, and thus http://localhost will point to 3dweergave. Of course, normally the first in the list is the virtual host main and localhost will point to main, but for testing purposes I switched them. When I navigate to http://localhost, my CakePHP default homepage shows as expected: Screenshot 1. But when I navigate to http://3dweergave, my CakePHP default homepage doesn't show as expected. It looks like every relative link to resources is not accepted by the server: Screenshot 2. For example, the CSS isn't loaded. When I open the source and click on the link, it opens the CSS file in the browser without errors. But when I run Firebug while loading the webpage, it seems that the CSS file isn't retrieved. (<link rel="stylesheet" type="text/css" href="/css/cake.generic.css" />) How can I fix this unwanted behaviour?

    Read the article

  • Redirect error in Google Webmaster Tools report

    - by Aurelio De Rosa
    I built a CMS and used it to create the website http://www.tkdmontecatini.com. After a few days, Google Webmaster Tools started giving me several "Redirect error" reports on some pages, such as the following:

        http://www.tkdmontecatini.com/it/photogallery
        http://www.tkdmontecatini.com/it/pagina/9/Informazioni/Corsi/Chi-Siamo
        http://www.tkdmontecatini.com/it/pagina/2/Informazioni/Eventi/Eventi

    The funny things are: if I access those links from a browser, everything is fine and I get no redirect loops or similar issues; and if I use the "Fetch as Googlebot" function, I get a "Success" result. Question: any idea why this happens and how I can fix it?

    Read the article
