Search Results

Search found 65464 results on 2619 pages for 'web based backup'.

Page 356/2619 | < Previous Page | 352 353 354 355 356 357 358 359 360 361 362 363  | Next Page >

  • How to add a holding page in front of a domain

    - by Jason Bradberry
    I have set up a holding page to announce that a new version of a website is coming soon. I wanted people to still be able to access the original site, so my approach was to place the holding page in the root folder on the server, move the original site to a subfolder, and link to it from the holding page. However, on testing this setup it appears to have hurt the site's SEO placement. Is there a better approach to this? I'm a bit stumped, as I want both to share the same URL.

    Read the article

  • Should Site Title be Before or After Page Title?

    - by NickAldwin
    Apologies if this is a dupe. I tried searching, but didn't find anything specifically addressing this concern. When creating a large(ish) site, page titles usually reference both the site name and the current page name. However, it seems there are two main conventions: "Bob's Awesome Site - Contact Page" and "Contact Page - Bob's Awesome Site". I've looked around, and pages usually use one of the two variants above. Is there any reason to use one over the other? SEO/readability/usability/etc.? I've thought about it, and have only come up with:
    - Page first: differentiates the tab when the browser is crowded with lots of tabs
    - Site first: you immediately see the "parent" site, so to speak; a more cohesive experience
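
    For illustration, a minimal sketch of the two conventions as they appear in the markup (the names are just the question's example site; neither order is prescribed by any standard):

        <!-- Page first: the distinguishing part survives truncation in crowded browser tabs -->
        <title>Contact Page - Bob's Awesome Site</title>

        <!-- Site first: the "parent" site is immediately visible -->
        <title>Bob's Awesome Site - Contact Page</title>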

    Read the article

  • Very High CPU usage (100%) from just browsing the Web

    - by cole
    I tested on Firefox and Chromium. I'm at 100% while loading pages, which causes them to load slowly, and even when I don't have an application running I'm at 40% CPU (at least). Basically everything is slow. I'm also already on Ubuntu Classic, so I'm not using Unity. Should I go to 10.04? Is that more stable? On Windows this wasn't an issue. I have a dual boot with XP, and a 2.4 GHz Intel Celeron with 768 MB RAM and an Nvidia 6200 graphics card. I heard 10.04 was the most stable. Any suggestions?

    Read the article

  • Web Server with phpMyAdmin

    - by Kumar P
    We have a web development company, and we are using RHEL 5 as a local (proxy) server with a few Windows XP client machines. Now I want to turn the server machine into a web server (LAN only), with MySQL and phpMyAdmin. I installed httpd, php and mysql via yum. How do I install phpMyAdmin, and where should I install it? I also want my client machines to be able to create PHP files in the web folder and to use MySQL and phpMyAdmin. How can I do that? Clear steps to do it would be appreciated.

    Read the article

  • Image Collector Rips Web Page Images to Your Dropbox Account

    - by Jason Fitzpatrick
    Chrome: Image Collector is a simple Chrome extension that rips the images on the page you’re visiting to your Dropbox (or Google Drive) account. Just click the icon, uncheck any images you don’t want it to download, and click save. You can, technically, modify the script to download the images directly to your hard drive, but modifying it was a bit of a hassle and the default save-to-Dropbox action is so smooth we saw little reason to do so. Hit up the link below to grab a free copy. Image Collector [via Freeware Genius]

    Read the article

  • Page load speed's effect on crawl rate

    - by Sam Pegler
    We've noticed a big drop in the total pages crawled per day on our site. We have no control over the crawl rate in Google Webmaster Tools, so it's possible this has been changed by Google. However, it's a fairly large site and I wouldn't have thought that the crawl rate would have been decreased. What we have noticed, though, is a sizeable increase in page load times; in my mind this would be the cause. Can anyone confirm whether the crawl rate is directly correlated with page load time? It seems logical: longer page load time, fewer pages crawled. Any decent documentation on this would be appreciated; I don't normally have any input on SEO, so this is new to me.

    Read the article

  • Will using two different tracking codes affect my SERP

    - by Danny Hefer
    Hello everyone, and thanks for your time! I am facing a problem after a site migration. The new site is basically an improved version of the old site, with the same content and some extras. After pointing the domain name to the new site, the old site was still online for a while but didn't get any traffic. The new site has its own tracking code. So the old tracking code has age (something like 7 years) but no visitors for a month, while the new tracking code is a month old with acceptable traffic. How do you think Google will react if I add the old tracking code to the new site? Thanks in advance!
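
    As an illustration (not from the original question), a hedged sketch of how both codes could run side by side, assuming the classic asynchronous ga.js snippet; the account IDs are placeholders, and the second profile is attached as a named tracker so the two don't collide:

        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-XXXXXXX-1']);       // new profile (default tracker)
          _gaq.push(['_trackPageview']);
          _gaq.push(['old._setAccount', 'UA-YYYYYYY-1']);   // old profile, added as a named tracker
          _gaq.push(['old._trackPageview']);

          (function() {
            var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
          })();
        </script>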

    Read the article

  • Disqus integration in a website: what is wrong?

    - by Thieme Hennis
    Hi, I am trying to embed a Disqus forum in a website I created. I used the exact code and instructions they give in the installation instructions, but I just don't get it working. Not much on Google either. Is something wrong in the code? Should I change anything?

        <!DOCTYPE HTML>
        <html>
        <head>
            <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
            <link rel="shortcut icon" href="../favicon.ico">
            <title>Little Louie | Hennis &amp; Blaisse Lovers Productions</title>
            <META NAME="keywords" CONTENT="some,tags">
            <link href="../style2.css" rel="stylesheet" type="text/css">
        </head>
        <body>
            <p><b>Some text</p>
            <p>
                <div id="disqus_thread"></div>
                <script type="text/javascript">
                    (function() {
                        var dsq = document.createElement('script'); dsq.type = 'text/javascript'; dsq.async = true;
                        dsq.src = 'http://littelouie.disqus.com/embed.js';
                        (document.getElementsByTagName('head')[0] || document.getElementsByTagName('body')[0]).appendChild(dsq);
                    })();
                </script>
                <noscript>Please enable JavaScript to view the <a href="http://disqus.com/?ref_noscript=littelouie">comments powered by Disqus.</a></noscript>
                <a href="http://disqus.com" class="dsq-brlink">blog comments powered by <span class="logo-disqus">Disqus</span></a>
            </p>
            <p>
                <script type="text/javascript">
                    var disqus_shortname = 'littelouie';
                    (function () {
                        var s = document.createElement('script'); s.async = true;
                        s.src = 'http://disqus.com/forums/littelouie/count.js';
                        (document.getElementsByTagName('HEAD')[0] || document.getElementsByTagName('BODY')[0]).appendChild(s);
                    }());
                </script>
            </p>
        </body>
        </html>

    Read the article

  • Copyright of pictures uploaded to a website?

    - by All
    I want to run a website like a stock photo site. How can I be sure that the uploader is the real copyright holder of the picture? Is it possible to leave the responsibility for this copyright claim with the uploader, or is the webmaster ultimately responsible for the website's content? It generally confuses me; for example, stock photo websites need a form signed by the model for photos showing a human face. How can they be sure that the signature actually belongs to the model? How do they keep themselves safe from a possible lawsuit in this case (e.g. if selling photos of a model with a fake signature)?

    Read the article

  • Challenges of Managing Off Shore Web Development Teams

    Have you ever thought of the challenges that may arise in managing a full-fledged team of professionals located thousands of miles away from your office? Skillfully managing a team within your own office is already quite an uphill task and can give rise to numerous problems.

    Read the article

  • How to parse JSON data from the web faster [closed]

    - by Kaidul Islam Sazal
    I have a JSON inventory file, inventory.json, on the server like this:

        [
            {
                "body" : "SUV",
                "color" : { "ext" : "White diamond pearl", "int" : "Taupe" },
                "id" : "276181",
                "make" : "Acura",
                "miles" : 35949,
                "model" : "RDX",
                "pic" : [ { "full" : "http://images1.dealercp.com/90961/000JNBD/001_0292.jpg" } ],
                "power" : { "drive" : "Front wheel drive", "eng" : "2.3L DOHC PGM-FI 16-VALVE", "trans" : "Automatic" },
                "price" : { "net" : 29488 },
                "stock" : "6942",
                "trim" : "AWD 4dr Tech Pkg SUV",
                "vin" : "5J8TB2H53BA000334",
                "year" : 2011
            },
            {
                "body" : "Sedan",
                "color" : { "ext" : "Premium white pearl", "int" : "Taupe" },
                "id" : "275622",
                "make" : "Acura",
                "miles" : 40923,
                "model" : "TSX",
                "pic" : [ { "full" : "http://images1.dealercp.com/90961/000JMC6/001_1765.jpg" } ],
                "power" : { "drive" : "Front wheel drive", "eng" : "2.4L L4 MPI DOHC 16V", "trans" : "Automatic" },
                "price" : { "net" : 22288 },
                "stock" : "6945",
                "trim" : "4dr Sdn I4 Auto Sedan",
                "vin" : "JH4CU2F66AC011933",
                "year" : 2010
            }
        ]

    These are two records; there are almost 5,000 like this. I parse the JSON like this:

        var url = "inventory/inventory.json";
        $.getJSON(url, function(data){
            $.each(data, function(index, item){ // straight-forward loop
                if(item.year == 2012) {
                    $('#desc').append(item.make + ' ' + item.model + ' ' + '<br/>' +
                        item.price.net + '<br/>' + item.pic[0].full);
                }
            });
        });

    This is working fine. But the problem is that this searching and fetching process is a little slow, as there are already 5,000 records and the number is increasing day by day. It is a straightforward loop over the data, a normal brute-force method. Now I want to know whether there is a more time-efficient way to parse it. Is there a faster method than the straight-forward loop?
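
    A minimal sketch of one alternative (not from the original post): load the JSON once, group the records by the field the lookups key on (year here), and answer each query from that index instead of re-scanning all 5,000 records. If the file keeps growing, the bigger win is usually filtering or paginating on the server rather than shipping the whole inventory to the browser.

        var inventoryByYear = {};   // year -> array of matching records

        $.getJSON("inventory/inventory.json", function (data) {
            // Build the index once, right after the file is loaded.
            $.each(data, function (i, item) {
                (inventoryByYear[item.year] = inventoryByYear[item.year] || []).push(item);
            });

            // Each lookup is now a single key access plus a loop over the hits only.
            $.each(inventoryByYear[2012] || [], function (i, item) {
                $('#desc').append(item.make + ' ' + item.model + '<br/>' +
                                  item.price.net + '<br/>' + item.pic[0].full);
            });
        });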

    Read the article

  • Google Webmaster Tools verification failed

    - by KMC
    I have a site created with Ruby on Rails. I verified it with Google Webmaster Tools some months ago, which was successful. Then one day Webmaster Tools started giving me re-verification failures. I tried again to verify my site using meta tags and HTML files, but I kept getting "Verification failed. The connection to your server timed out." Since then, Google has stopped crawling my site's content, though somehow Google still crawls the PDF content on my site.
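
    For reference, a sketch of what the meta-tag method boils down to (the content value below is a placeholder, not the asker's real token); if the server times out before this page is served, verification fails regardless of the tag being present:

        <head>
          <!-- Must appear in the <head> of the page Googlebot requests during verification -->
          <meta name="google-site-verification" content="YOUR-TOKEN-FROM-WEBMASTER-TOOLS" />
        </head>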

    Read the article

  • How will we be able to build websites without using cookies under the new law? [closed]

    - by Theresa Forster
    Possible Duplicate: How do I comply with the EU Cookie Directive?

    Under this new EU law we are not allowed to use any cookies without asking first. I, for one, need to use a cookie to record which user is logged on; without a cookie they can log on more than once and breach the license terms of the software. So I find myself asking: what can we use instead of cookies to perform this task?

    Read the article

  • Impressions and traffic dropped by 70%

    - by Louise
    Can anyone advise why my impressions and traffic dropped? I used to have very generic keywords such as: anti aging, anti wrinkle, face cream, eye cream. I thought they were bad and made the keywords more specific: anti wrinkle eye cream, anti aging face cream, etc. Following that change, my impressions and traffic dropped dramatically! I used to get 45+ visitors a day; now I get fewer than 15 a day. What is the way forward? I thought the change I made to the keywords was a good one.

    Read the article

  • Redirecting from a 1und1 hosting solution with URLs intact

    - by Jelmar
    I have done this before on GoDaddy without a hitch, but I cannot seem to figure out this particular case. I have a domain space with the temporary URL http://yogainun.mysubname.com/ and am hosting the domain name that is to be applied to it at 1und1.de. Right now I have set it up so that, from the 1und1 domain name hosting, the address http://www.yoga-in-unternehmen.de/ is frame-redirected to the subdomain I just referred to. But this is not what I want: http://www.yoga-in-unternehmen.de/ is to be the domain. With the frame redirect, URLs like http://www.yoga-in-unternehmen.de/example-article do not show up, but that is what I want. With GoDaddy in a similar case, I just turned on DNS and changed the name servers. That worked without problems, but with 1und1 it does not. Is there something I am missing?

    Read the article

  • Ditch cPanel / WHM in favour of manual setup

    - by BWRic
    We currently use cPanel / WHM on a reseller account but are looking at getting a dedicated server. My first thought was to duplicate this setup on the dedicated box to allow us to quickly create new accounts. It'll be a managed server, so they'll have set up the LAMP stack. I'm curious whether I actually need cPanel and WHM. We don't use many of the features of cPanel / WHM, just creating accounts and databases; clients do not have FTP access. I'm no sysadmin and come from a Windows / GUI background, but I have some knowledge of setting up development servers.

    WHM: Creating accounts
    I presume this sets up the Apache virtual host, FTP access and DNS settings. I have some knowledge of editing the Apache files to create virtual hosts. Am I correct in thinking that as long as the DNS is pointing to the server IP and the virtual host is configured, the server can serve the (PHP) pages? I'm not sure I need per-site FTP access, as only we will have access, so I could have server-wide access to htdocs only to view all the sites. The company who supply the dedicated hosts also provide their own DNS management tool, so I don't need the cPanel one.

    MySQL: Creating users and databases
    We use cPanel to create the MySQL users and databases. As it's a dedicated box and I can have root access, I think this could be replaced by SQLyog for database management and phpMyAdmin for user management.

    Do I need cPanel, or can I get by with editing a few text files to create the accounts and then use the MySQL tools for the databases? Or am I missing something major about how the sites are configured?

    Read the article

  • Photoshop, slice, Dreamweaver, web?

    - by Omega
    So I am playing around with Photoshop and Dreamweaver. I have created a site layout and have used the slice function on it. Next, I saved it as HTML & images. In Dreamweaver, I open that HTML file and fill the page with content, links, etc. I have a website and everything, and I would like to use my newly created HTML page on it. But, obviously, if I copy & paste the HTML to my website it won't work, because it will lack the images. But two things: I can't find the images, and apparently there are a lot of them. I am sure I am making a big mistake regarding the images. Can someone help me?

    Read the article

  • Webmaster Tools word count

    - by Henrik Erlandsson
    Is there a way to verify that the Googlebot finds the headings and the content, for example by word count? I'm asking this because I tried a program called Screaming Frog, which fails to even fetch the first h1 on a validated page - for about 1/3 of all the pages(!) - which left me unsure. Even though the site looks hunky-dory in Webmaster Tools, I'd like to know what a Googlebot-like content crawler finds on my page and in what order. Any tips on such tools are appreciated. This is not about keyword count.
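
    As a rough sanity check (a sketch, not a crawler simulation), the browser console can count headings and words in the rendered page; note that this inspects the DOM after JavaScript has run, so a crawler that only reads the raw HTML may still see something different:

        // Run in the browser console on the page in question.
        var headingCount = document.querySelectorAll('h1, h2, h3').length;
        var wordCount = document.body.innerText.trim().split(/\s+/).length;
        console.log(headingCount + ' headings, roughly ' + wordCount + ' words');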

    Read the article

  • Recording slow web stream

    - by Budric
    I'm trying to record an MPEG-2 video stream from a website that doesn't have the greatest bandwidth. The video often buffers. I want to download the stream and watch it offline. The extracted stream format is:

        Stream #0.0[0x44]: Audio: mp2, 48000 Hz, stereo, s16, 192 kb/s
        Stream #0.1[0x45]: Video: mpeg2video (Main), yuv420p, 704x576 [PAR 16:11 DAR 16:9], 15000 kb/s, 27.19 fps, 25 tbr, 90k tbn, 50 tbc

    I use the following tool to transcode the stream:

        ffmpeg -i "http://url" -y -vcodec libx264 -b 3000k -acodec copy /tmp/stream.mp4

    Unfortunately, after a few seconds ffmpeg stops recording with an error:

        [mpegts @ 0x1f0b9c0] PES packet size mismatch
        [mp2 @ 0x1f14640] incomplete frame
        Error while decoding stream #0.0
        [mpeg2video @ 0x1f16860] ac-tex damaged at 0 26
        [mpeg2video @ 0x1f16860] Warning MVs not available

    I've tried encoding with VLC as well, with similar issues. Although VLC doesn't stop encoding, the output video has regions where it hangs.

        vlc -I dummy "http://url" --network-caching="1000" --sout="#transcode{vcodec=h264,vb=3000,acodec=mp3,ab=192}:std{access=file,mux=mp4,dst=/tmp/stream.mp4}"

        [mpeg2video @ 0x7f2d4c001e20] ac-tex damaged at 9 33
        [mpeg2video @ 0x7f2d4c001e20] Warning MVs not available
        [mpeg2video @ 0x7f2d4c001e20] concealing 132 DC, 132 AC, 132 MV errors
        [mpeg2video @ 0x7f2d4c001e20] ac-tex damaged at 16 17
        [mpeg2video @ 0x7f2d4c001e20] Warning MVs not available
        [mpeg2video @ 0x7f2d4c001e20] concealing 836 DC, 836 AC, 836 MV errors
        libdvbpsi error (PSI decoder): TS discontinuity (received 4, expected 3) for PID 0

    I also tried FLV transcoding, and it has its own set of issues: the output FLV file hangs in certain parts. Does anyone know what's wrong or how to fix this?

    Read the article

  • What should I do to scale out a high-traffic website?

    - by makerofthings7
    What best practices should be undertaken for a website that needs to "scale out" to handle capacity? This is especially relevant now that people are considering the cloud but may be missing out on the fundamentals. I'm interested in hearing about anything you consider a best practice, from development-level tasks to infrastructure to management. Use your best judgement when posting multiple answers, since it may make sense to post them separately for voting purposes. (Hint: you'll likely get more reputation points for many small answers than for one large answer.)

    Read the article

  • What's the best platform for blogging about coding? [closed]

    - by timday
    I'm toying with starting an occasional blog for posting odd bits of coding-related stuff (mainly C++, probably). Are there any platforms which can be recommended as providing exceptionally good support (e.g. syntax highlighting) for posting snippets of code? (Or any to avoid because posting mono-spaced font blocks of text is a pain.) Outcome: I accepted Josh K's answer because what I actually ended up doing was realizing I was more interested in articles than a blog style, getting back into LaTeX (after almost 20 years away from it), using the "listings" package for code, and pushing the HTML/PDF results to my ISP's static-hosting pages (HTML generated using tex4ht). Kudos to the answers mentioning WordPress, Tumblr and Jekyll; I spent some time looking into all of them.

    Read the article
