Search Results

Search found 14789 results on 592 pages for 'pro backup'.

  • Interactive map using JavaScript [on hold]

    - by Denis
    I'm trying to learn HTML and JavaScript, but I can't find any information about how to create an interactive map/picture using JavaScript. For example, I take a map of part of my town and add some information about a few buildings on it, so that when I move the mouse over those buildings the information is displayed. It should look similar to this: http://davidlynch.org/projects/maphilight/docs/demo_usa.html. I need to do it with JavaScript.
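
    A minimal sketch of the kind of behaviour described, using a plain HTML image map plus a few lines of JavaScript (the image path, coordinates and text are placeholders, not taken from the question):

        <img src="town-map.png" usemap="#town" alt="Map of the town">
        <map name="town">
          <area shape="rect" coords="10,10,80,60"   href="#" data-info="Town hall, built 1887">
          <area shape="rect" coords="120,40,190,90" href="#" data-info="Public library, open 9-17">
        </map>
        <p id="info">Hover over a building</p>

        <script>
          // Show the hovered area's data-info text in the #info element.
          var info  = document.getElementById('info');
          var areas = document.querySelectorAll('map[name="town"] area');
          Array.prototype.forEach.call(areas, function (area) {
            area.addEventListener('mouseover', function () {
              info.textContent = area.getAttribute('data-info');
            });
            area.addEventListener('mouseout', function () {
              info.textContent = 'Hover over a building';
            });
          });
        </script>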

  • Are backlinks transitive when old URL is forwarded to new URL?

    - by JVerstry
    Say there are two companies operating in the same industry: www.companya.com and www.companyb.com. Company A has acquired some backlinks over time. Company B acquires Company A and decides to use only its own URL. It properly forwards all URLs from www.companya.com to www.companyb.com. Will Company B benefit from www.companya.com's backlinks through the redirection? Is it worth the effort from a backlink perspective only?

  • Issues With IIS Hosting Two Domains From Same Folder [closed]

    - by Bob Mc
    I have two different domain names that resolve to the same ASP.Net site. Both domains are hosted on the same server, which runs Windows Server 2003 and IIS6. The sites are differentiated in IIS Manager using host headers. However, both of the sites point to the same folder on the local drive for the site's page files. I am occasionally experiencing an ASP.Net error that says "The state information is invalid for this page and might be corrupted." I'm the site developer so I've addressed all the relevant code-related causes for this issue. However, I was wondering whether having two domains/sites sharing the same folder for an ASP.Net application might be causing this intermittent error. Also, is this generally a bad practice? Should I make separate, duplicate folders for each of the domains? Seems like that can become a maintenance headache.

  • Any reason not to buy SSL certificates from a reseller?

    - by andrewtweber
    Is there any reason to buy an SSL certificate directly from the host (for $49, for example), instead of through a well-known reseller, for only $10.95? Specifically in my case, I just need to encrypt user data as they fill out forms. Nothing too critical, only usernames and passwords. I don't think my users actually care about the brand name or "trust" level behind the certificate, I'm just looking out for them.

  • Options for different domain and hosting

    - by Carl
    The situation: I have a hosting service (one.com) on which I have installed a wordpress.org site in a subdirectory 'wordpress': myhost.com/wordpress/ (myhost.com is actually my own domain, but it already has content and I don't want wordpress/ to appear in the root of that domain). I want to use a second domain for this site. Thinking I would be able to forward to the WordPress site without problems, I registered the domain at GoDaddy.com: mydomain.com.

    What I want: When my visitors type in mydomain.com, I want them to see the contents of myhost.com/wordpress/, and the same for all subpages (mydomain.com/a/subpage fetches from myhost.com/wordpress/a/subpage). Just a redirect isn't enough; I want my visitors to see only mydomain.com as their domain.

    Some notes:
    - If I set up forwarding with URL masking at GoDaddy, they just serve a full-page frame pointing to myhost.com/wordpress/. This isn't good enough for me, since mydomain.com will always show up in the address bar, also for subpages (I want mydomain.com/a/subpage to show in the address bar for a subpage).
    - I believe this could in principle be done with a .htaccess file with URL rewriting, but I have no hosting with GoDaddy so I can't upload such a file there. Hosting with GoDaddy is very expensive (of course), so I don't want to do that.
    - I don't think I can use DNS settings; the host of mydomain.com says they don't allow anyone else to point to their name servers.
    - If possible, I wouldn't want to re-install the WordPress site; it would take quite some time. I'd prefer to keep it at myhost.com/wordpress/ (if possible).
    - Anything involving transferring the domain is supposed to take 5-7 working days. I would need my site up and running earlier than that, so I'd like to avoid it if possible.

    Am I locked in? As it seems, I am rather locked in with GoDaddy. I can't use the domain with .htaccess since I can't upload such a file (and won't pay for hosting by GoDaddy). I can't use any of their forwarding options since none of them do what I want (one just forwards, the one that masks the URL does it with frames). Would you agree?

    Possible solutions:
    1) Transfer the domain to a hosting service with reasonable pricing, as opposed to GoDaddy (I'd probably use one.com, the same host as for myhost.com, in that case), and there either re-install WordPress on the new account, or use .htaccess with URL rewriting on the new account to fetch the contents from myhost.com/wordpress/ (a sketch of such a rule follows below). Can this be set up to work with subpages as well, so that visitors never see "myhost.com/wordpress", just "mydomain.com"? I.e., mydomain.com/a/subpage/ would fetch from myhost.com/wordpress/a/subpage/?
    2) This might be a long shot, but: find some (preferably free) hosting that allows pointing to their name servers, make DNS settings at GoDaddy so that my domain appears at that site, and there put a .htaccess file with URL rewriting to forward to myhost.com/wordpress/. Could this be possible? What services could I use in that case? As I see it, this would be the only way to avoid both transferring the domain (taking 5-7 working days) and re-installing the WordPress site.

    Sorry for the long question. All info and ideas are welcome.
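
    A minimal sketch of the kind of .htaccess rewrite rule meant in solution 1, assuming the account that mydomain.com points to allows mod_rewrite and mod_proxy (the domains are the question's placeholders, and whether the [P] proxy flag is permitted depends entirely on the host):

        # .htaccess on the account that mydomain.com points to
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?mydomain\.com$ [NC]
        # [P] proxies the request, so the visitor keeps seeing mydomain.com in the address bar
        RewriteRule ^(.*)$ http://myhost.com/wordpress/$1 [P,L]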

  • Cheap server stress testing

    - by acrosman
    The IT department of the nonprofit organization I work for recently got a new virtual server running CentOS (with Apache and PHP 5), which is supposed to host our website. During the process of setting up the server I discovered that the slightest use of the new machine caused major performance problems (I couldn't extract tarballs without bringing it to a halt). After several weeks of tech support casting about in the dark, it now appears to be working fine, but I'm still nervous about moving the main site there. I have no budget to work with (so no software or services that cost money), although due to recent cutbacks I have several older desktops that I could use if it helps. The site doesn't need to withstand massive amounts of traffic (it's a Drupal site with just a few thousand visitors a day), but I would like to put it through its paces a bit before moving the main site over. What are some cheap tools I can use to get a sense of whether the server can withstand even low levels of traffic? I'm not looking to test the site itself yet, just the fundamental operation of the server.

  • Tracking users' behaviour - with or without Google Analytics

    - by Ilian Iliev
    If I understand correctly the following point from the GA Terms of Service:

        PRIVACY. You will not (and will not allow any third party to) use the Service to track or collect personally identifiable information of Internet users, nor will You (or will You allow any third party to) associate any data gathered from Your website(s) (or such third parties' website(s)) with any personally identifying information from any source as part of Your use (or such third parties' use) of the Service. You will have and abide by an appropriate privacy policy and will comply with all applicable laws relating to the collection of information from visitors to Your websites. You must post a privacy policy and that policy must provide notice of your use of a cookie that collects anonymous traffic data.

    then you are not allowed to use custom variables that identify the visitor (for example website username, e-mail, id etc.). So the question is: how can I track a specific user's behaviour (for example, the actions that every single logged-in user performs)?
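
    For reference, a minimal sketch of the classic (ga.js) custom-variable call being discussed, assuming the page is given an opaque, server-generated hash rather than a real username or e-mail (the property ID and hash value are placeholders):

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXXX-1']);   // placeholder property ID
        // 'u_7f3a9c12' is assumed to be a salted hash produced on the server,
        // not an e-mail address or username, so no personally identifiable
        // information is sent along with the hit.
        _gaq.push(['_setCustomVar', 1, 'member', 'u_7f3a9c12', 1]);  // slot 1, visitor-level scope
        _gaq.push(['_trackPageview']);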

  • Is it OK to have two sitemaps on one website?

    - by user615041
    Do I have to have a sitemap page linked from my index page for bots to read it, or can I just have it anywhere on my server? I have a phpBB/WordPress integration and I need two sitemap mods, one for each (or I need to have them somehow integrated together into one XML sitemap). Is this possible? What's my best option? I would have the phpBB one at something like http://www.example.com/phpbb/sitemap.html and the WordPress one at something like http://www.example.com/wordpress/sitemap.html, and then I would submit both, but not put the links in my footer, to avoid confusing anyone; the sitemaps would strictly be for search engines. Is this a good idea? What are your thoughts?

  • 80,000 hits on irrelevant topic in 1-2 days [closed]

    - by John
    On 3rd of Nov 2012 somebody started this thread: http://forums.hostgator.com/advice-needed-new-account-t214566.html?t=214566 with the subject "Advice needed on new account". I visited it on the 5th, and when I saw the total views I was stunned to see a figure of 80,000 (while other threads had views from 50-300). I even made a post using the name rag_gupta. Then I simply typed into Google in IE (I normally work in Firefox): Advice needed on new account -- and yes, this same page was in the no. 1 position. Subsequently I tried to create a similarly worded article on HubPages to see how it would fare. Unfortunately it was not allowed to be published. Then I published it on blogger.com, but it got hardly any hits. Considering only the first two posts in the thread, what could have driven Google to rank it so highly?

  • Feedburner is Displaying an Inactive Email Address When Logged In

    - by Cynthia
    I have set up a Feedburner subscription through my Google account login. For some reason Feedburner displays an old, inactive email address in the top right when I'm logged into Feedburner. This worries me since I don't have access to that email account: if for any reason I need to move my Feedburner feed, I may need access to that email account, which is impossible. My email address is correct in every other Google account, profile, etc., and has been updated to a working and correct email address. How can I change this? Do I need to be concerned?

  • Awesome WordPress modular theme builder?

    - by Matt M.
    I came across a really neat modular theme for WordPress a while ago, but I can't remember the name so I'm wondering if somebody can help out. It was a paid theme with a dedicated company behind it if I'm not mistaken. The theme allowed you to change the colors of all the content areas, define widgets, and resize content areas. I'm really interested in being able to resize columns and such at the touch of a button, getting too old for the usual CSS micromanagement that goes on with brittle themes.

  • Hosting a website from a dynamic IP

    - by nick
    I recently upgraded my internet connection to the point that it is much faster and more reliable than my current web host. I would like to move my current domain to be hosted at home, but my IP address is dynamic. As far as I know, I only get a new IP when I restart my modem and/or router (which is almost never) or when Cable One (my ISP) pushes out a firmware update (rarely). There are a few ways I can see of doing this:
    1) convince my ISP to give me a static IP;
    2) assign my router my current IP to force a static IP (which might work?);
    3) set my DNS record to my current IP address and update it on the rare occasions that it changes (a sketch of such an update script follows below).
    Obviously I'm hoping that the first one works, but I don't want to pay a lot of extra money (if that's what it takes) to get a static IP address. Has anyone had any luck with something like this?
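
    A minimal sketch of what option 3 could look like as a small Node.js script run from cron; api.ipify.org is a real public "what is my IP" service, but the update URL and token here are hypothetical placeholders for whatever API the DNS provider actually offers:

        // check-ip.js - update a DNS record when the public IP changes (sketch)
        var https = require('https');
        var fs = require('fs');

        var LAST_IP_FILE = '/var/tmp/last-public-ip';
        // Hypothetical provider endpoint -- replace with the DNS host's real API.
        var UPDATE_URL = 'https://dns.example-provider.com/update?host=mydomain.com&token=SECRET&ip=';

        function fetch(url, callback) {
          https.get(url, function (res) {
            var body = '';
            res.on('data', function (chunk) { body += chunk; });
            res.on('end', function () { callback(body.trim()); });
          });
        }

        fetch('https://api.ipify.org', function (currentIp) {
          var lastIp = fs.existsSync(LAST_IP_FILE) ? fs.readFileSync(LAST_IP_FILE, 'utf8').trim() : '';
          if (currentIp !== lastIp) {
            fetch(UPDATE_URL + currentIp, function () {
              fs.writeFileSync(LAST_IP_FILE, currentIp);
              console.log('DNS record updated to ' + currentIp);
            });
          }
        });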

  • What are some potential issues in blocking all incoming requests from the Amazon cloud?

    - by ElHaix
    Recently I, along with the rest of the world, have seen a significant increase in what appears to be scraping from Amazon AWS-related sources. So, simply put, I blocked all incoming requests from the Amazon cloud for our hosted application. I know that some good services/bots are now hosted on the cloud, and I'm wondering if certain IP addresses should be allowed through, as they may gather data that would in the end benefit our site's SEO rankings.
    UPDATE: I added a feature to block requests from the following hosts:
    - Amazon
    - SoftLayer
    - ServerDeals
    - GigAvenue
    Since then, I have seen my network traffic decrease (monitored by network-out bytes). Average operation is around 10,000,000 bytes; on the monitoring graph you can see where last week I was not blocking and then started blocking. I've since removed the blocks and will see what the outcome is.
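
    For reference, AWS publishes its address ranges as JSON at https://ip-ranges.amazonaws.com/ip-ranges.json. A minimal Node.js sketch of the kind of check being described (the request IP shown is just an example value):

        // aws-block-check.js - is a given IPv4 address inside a published AWS range? (sketch)
        var https = require('https');

        function ipToInt(ip) {
          return ip.split('.').reduce(function (acc, octet) {
            return (acc << 8) + parseInt(octet, 10);
          }, 0) >>> 0;
        }

        function inCidr(ip, cidr) {
          var parts = cidr.split('/');
          var bits = parseInt(parts[1], 10);
          var mask = bits === 0 ? 0 : (~0 << (32 - bits)) >>> 0;
          return ((ipToInt(ip) & mask) >>> 0) === ((ipToInt(parts[0]) & mask) >>> 0);
        }

        https.get('https://ip-ranges.amazonaws.com/ip-ranges.json', function (res) {
          var body = '';
          res.on('data', function (chunk) { body += chunk; });
          res.on('end', function () {
            var prefixes = JSON.parse(body).prefixes;   // IPv4 prefixes
            var requestIp = '54.239.28.85';             // example IP to test
            var blocked = prefixes.some(function (p) {
              return inCidr(requestIp, p.ip_prefix);
            });
            console.log(requestIp + (blocked ? ' is' : ' is not') + ' in a published AWS range');
          });
        });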

  • How do I remove only some values of a URL parameter in Google Analytics?

    - by Iain Hallam
    I'm using Google Analytics on a DokuWiki site, which uses a URL parameter to decide what to do with the current page: /page is equivalent to /page?do=show.
    1) I want to see some of these "modes", but mostly I'd like them counted as views of the bare page URL itself. The following are the only ones I want to see separately:
       /page?do=login
       /page?do=backlinks
       /page?do=revisions
       /page?do=subscribe
    How do I collapse the unwanted modes onto the page itself (/page)?
    2) Some modes do something that should really not have a page attached, such as:
       /page1?do=sitemap
       /page2?do=sitemap
    How do I get these to show up without the page part (/?do=sitemap)?
    3) What do I do with the search mode? Can I remove the page part from this too, and still find out which page people used the search function on?
       /page?do=search&id=query+text
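
    A minimal sketch of one way to express this client-side with classic (ga.js) Analytics, by sending a normalised virtual path to _trackPageview instead of the raw URL; the rules below are just a direct translation of points 1-3, not DokuWiki-specific code, and the property ID is a placeholder:

        // Build a virtual pageview path from location.pathname and the ?do= parameter.
        function normalisedPath() {
          var path = window.location.pathname;
          var match = window.location.search.match(/[?&]do=([^&]*)/);
          var mode = match ? match[1] : 'show';

          if (mode === 'show') return path;                           // plain page view
          if (['login', 'backlinks', 'revisions', 'subscribe'].indexOf(mode) !== -1) {
            return path + '?do=' + mode;                              // modes kept separately (1)
          }
          if (mode === 'sitemap') return '/?do=sitemap';              // drop the page part (2)
          if (mode === 'search') return path + '?do=search';          // keep the page, drop the query text (3)
          return path;                                                // everything else collapses to the page
        }

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXXX-1']);                    // placeholder property ID
        _gaq.push(['_trackPageview', normalisedPath()]);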

  • Would it be possible to build a client portal on Squarespace 6?

    - by aBathologist
    I'm helping a family member set up a site which will need to include a secure client portal, providing access to documents and a simple database. I have been encouraging them to go with a more established, open-source CMS like Drupal or Joomla, whose capability in this area is evident. However, they have a strong preference for Squarespace. Does anyone know if it would be possible to accomplish this with the new developer platform for Squarespace 6? I've spent well over an hour searching Google, the Squarespace site and Stack Exchange, but can't seem to find any clear answer to this question. I'm grateful for any insight you all can provide.

  • rel="Canonical": Ranking Benefits ? & specifying for PDF?

    - by Miak
    I think I understand the basic case for using rel="canonical": to tell Google which is the preferred URI when the same page/content can be accessed via more than one URI. This helps you avoid duplicate-content penalties. But what else does it do? Does it also affect search ranking, i.e. will the page I specify as the canonical be ranked higher than the others (all else being equal)? And in the case of PDF documents, I understand that you can now specify rel="canonical" for them too, using HTTP headers (i.e. in .htaccess). Again, this would obviously help avoid duplicate-content penalties if the PDF content is the same as the HTML page or if it can be accessed in more than one place. But does it affect ranking? Or are there any other benefits to doing this?
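
    For reference, a minimal sketch of the HTTP-header approach mentioned for PDFs, as an Apache/.htaccess fragment; the file name and target URL are placeholders, and mod_headers has to be available for the Header directive to work:

        <Files "white-paper.pdf">
          Header add Link "<http://www.example.com/white-paper.html>; rel=\"canonical\""
        </Files>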

  • Google Analytics - include filter not working

    - by gerl
    I just added an include filter this morning for my domain (test.org). I have:
    Custom Filter > Include > Request URI:
    ^/test-a/46212$|^/test-a/46212|^/test-a/46315
    Now when I go to Content > Site Content > All Pages, I see stats for pages that I didn't include in my filter. For example, I see /somethingelse. I only want to see stats for /test-a/46212 and whatever else is in my filter. Please let me know what I'm doing wrong.
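
    As a quick sanity check of what the alternation pattern itself matches, a small JavaScript sketch (the sample URIs are made up; Analytics filter patterns are not exactly JavaScript regexes, but anchors and alternation behave the same way here):

        var pattern = /^\/test-a\/46212$|^\/test-a\/46212|^\/test-a\/46315/;
        ['/test-a/46212', '/test-a/46212?page=2', '/test-a/46315', '/somethingelse']
          .forEach(function (uri) {
            console.log(uri, pattern.test(uri));   // true, true, true, false
          });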

  • Is it possible to have multiple subdomains point to the same Blogger blog?

    - by cclark
    For our application we want to have a status page which is hosted outside the rest of our infrastructure, so that in case there are issues in our data center we can post updates and our users will be able to access them. We registered a blog on Blogger and set it up with xyzstatus.blogspot.com and status.xyz.com. Everything seems to work fine. We need to perform some maintenance at our data center which will sever all connectivity, so we're unable to have a redirect using nginx or Apache. We'd like to do this with a short-TTL CNAME DNS entry. Ideally www.xyz.com and app.xyz.com could be CNAMEd to status.xyz.com. When I set up the CNAME and go to that URL I get a Google broken-robot 404 page. I figure I must need to let Google know it should associate traffic to www.xyz.com and app.xyz.com with the blog served up by status.xyz.com, but I can't see anywhere to do this in Blogger. Does anyone know if this is possible?

  • Rendering citations and references in HTML using PHP/Perl/Python/

    - by Nick
    Is there a PHP/Perl/Python/... library for picking citations out of an HTML file and rendering a nice list of references at the bottom, like in Wikipedia? I'm developing a website with heavily-sourced content, and I'd really like to have automatically-generated lists of formatted references, like in Wikipedia. (Check out their philosophy page, and see how the superscript numbered citations interact with the references at the bottom. This is all dynamically generated, automatically ordered & linked.) They do it really well: the citations are linked to the references (which are backlinked to the citations), when you click on one of the links, the target is highlighted, etc. I'm tempted to build the site on MediaWiki just for this one feature, but it seems like overkill. Do I have any options?
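
    The behaviour described (numbered superscript citations linked to a reference list at the bottom, with backlinks) is small enough to sketch in plain JavaScript; the markup convention used here (<span class="cite" data-ref="..." data-text="...">) is made up for the example and not part of any particular library:

        <p>Socrates was a classical Greek philosopher.<span class="cite"
           data-ref="guthrie1971" data-text="Guthrie, W.K.C. (1971). Socrates."></span></p>
        <ol id="references"></ol>

        <script>
          // Replace each .cite span with a numbered superscript link and build
          // the reference list at the bottom, with a backlink to the citation.
          var cites = document.querySelectorAll('span.cite');
          var refList = document.getElementById('references');
          var numbers = {};   // data-ref -> assigned reference number

          Array.prototype.forEach.call(cites, function (cite) {
            var key = cite.getAttribute('data-ref');
            if (!(key in numbers)) {
              numbers[key] = refList.children.length + 1;
              var item = document.createElement('li');
              item.id = 'ref-' + key;
              item.innerHTML = cite.getAttribute('data-text') +
                ' <a href="#cite-' + key + '">^</a>';
              refList.appendChild(item);
            }
            cite.innerHTML = '<sup id="cite-' + key + '">' +
              '<a href="#ref-' + key + '">[' + numbers[key] + ']</a></sup>';
          });
        </script>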

  • For an inexperienced VPS administrator, is Nginx a suitable alternative to Apache?

    - by James
    I couldn't think of the best way to set the title, so if somebody wants to edit it to something more appropriate, I'd be grateful ;) I'm what I would consider to be an inexperienced user/administrator when it comes to running my VPS. I can get by with a few CLI commands, I can set up Webmin and I can set up Yum repos, but beyond the very basic stuff, I'm out of my depth. So far, I'm running Apache. I don't know it particularly well, but I can get by with editing httpd.conf if I'm told what to edit. I've heard good things about Nginx and that it's not as resource-hungry as Apache. I'd like to give it a go, but I can't find any information about its suitability for administrators like me, with little experience of sysadmin or web server config. Webmin now has support for Nginx, so getting it installed and running probably won't be too much of a problem. What I'm wondering is, from a site administrator perspective, is running Nginx as transparent as running Apache? I.e., at the moment, I can just throw up WordPress and Drupal sites without having much to worry about or having to make any config changes to Apache. Would Nginx be as transparent?

  • jQuery slideToggle expands and collapses straight away [migrated]

    - by Floran
    Please see here: http://www.wunderwedding.com/weddingvenues/search
    On the left side there's a filter, 'show more cities...'. When it's clicked I want to show more cities, but right now when it's clicked the box expands and IMMEDIATELY collapses again. I don't know what I'm doing wrong. This is the code for show/hide:

        $("#toggle_cities").click(function () {
            if ($("#facets_city").is(":visible")) {
                $("#toggle_cities").text('toon meer steden...');
            } else {
                $("#toggle_cities").text('toon minder steden...');
            }
            $("#facets_city").slideToggle("slow");
        });

  • Google Analytics: tracking subdomains for a profile defined for a subdomain

    - by Alex G
    Hope you can help. We have set up a single property under our Google Analytics account. That property's default URL is set to subdomain1.example.com. We would now like to track multiple subdomains of example.com under the same property. Seems easy enough: we just need to add _gaq.push(['_setDomainName', 'example.com']); to our tracking code, right? But my question is: does it matter if a) we don't need to track www.example.com (this is tracked under a separate account and property) and b) the default URL for our property is set to subdomain1.example.com? Will either of these have any impact on data collection?
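
    For reference, a minimal sketch of where that line sits in the classic (ga.js) asynchronous snippet, using the question's example domain (the property ID is a placeholder):

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXXX-1']);        // placeholder property ID
        _gaq.push(['_setDomainName', 'example.com']);     // share the cookie across subdomain1, subdomain2, ...
        _gaq.push(['_trackPageview']);

        (function () {
          var ga = document.createElement('script');
          ga.type = 'text/javascript';
          ga.async = true;
          ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
          var s = document.getElementsByTagName('script')[0];
          s.parentNode.insertBefore(ga, s);
        })();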

  • My Sites Were Hacked. What To Do?

    - by Vad
    I host multiple domains with this very popular hosting provider, and I just went into one of my sites and... I see a black page with the message "Hacked by...". I checked, and all my sites with the provider are showing this same page. Inside the file system I can see that the hacker placed default.* and index.* files with this message, so the hacker overwrote all index pages and placed new pages, and that is under every, I say again, every folder. Cleaning this up will be a horrible job. What to do (right now I am awaiting a restore of the files from the hosting provider)? How to prevent this? Whom to blame?

  • How to change my website's appearance in a Facebook wall post?

    - by Lode
    When I post a website link in a Facebook wall post, Facebook fetches some content (title, text and an image) from the website to show to readers. Is there a way I can adjust or propose which content Facebook uses or prefers? I found someone saying to use <meta property="og:image" content="image.jpg">, but this doesn't seem to have any effect. Maybe Facebook caches these results for a while?
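
    For reference, the tag in the question is part of the Open Graph markup Facebook reads from the page's <head>; a minimal sketch with placeholder values (note that og:image is normally given as an absolute URL):

        <meta property="og:title"       content="Page title as it should appear on Facebook">
        <meta property="og:description" content="Short summary text for the wall post">
        <meta property="og:image"       content="http://www.example.com/images/share.jpg">
        <meta property="og:url"         content="http://www.example.com/this-page/">
        <meta property="og:type"        content="website">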
