Search Results

Search found 14789 results on 592 pages for 'pro backup'.


  • Trade-off: lower the number of URLs in the sitemap from 43k to 23k, or only update sitemap.xml weekly?

    - by Tobias
    We rewrote the sitemap creation process. The sitemap now contains 43,000 URLs, 20k more than before, and URLs change daily. The script that builds the complete sitemap takes more than 30 hours, so we cannot run it every day. Let's say that increasing the speed of the script is not possible. What should I do? A: Stay with the 23k URLs and update the sitemap daily. B: Increase the number of URLs to 43k and update it weekly.
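
    One option the question doesn't list is a sitemap index: split the 43k URLs across several smaller sitemap files and regenerate each night only the files whose URLs actually changed, leaving the stable ones on a weekly rebuild. A minimal sketch of the index file, with hypothetical file names and dates:

      <?xml version="1.0" encoding="UTF-8"?>
      <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <!-- stable URLs, rebuilt weekly -->
        <sitemap>
          <loc>http://www.example.com/sitemap-stable-1.xml.gz</loc>
          <lastmod>2012-06-01</lastmod>
        </sitemap>
        <!-- URLs that change daily, rebuilt every night -->
        <sitemap>
          <loc>http://www.example.com/sitemap-daily.xml.gz</loc>
          <lastmod>2012-06-07</lastmod>
        </sitemap>
      </sitemapindex>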

    Read the article

  • Analyze Drupal and WordPress sites' CPU load on a shared server

    - by Tedi
    Our hosting company is complaining that both our Drupal and WordPress websites, running on a shared server, are consuming too many CPU resources. The traffic for each site is no more than 100 users per day and, at first glance, we don't have very many plugins/add-ons. Is there any tool or resource to analyse what is causing the high CPU load? Thanks. Update: We decided to suspend our accounts while the problem was being debugged, but our host (Site5) still said they saw unacceptable activity on our sites, so we had to move to a dedicated server. We asked them several times to provide more information, and they always came back saying we had to purchase a higher-tier account. We finally decided to move to another hosting service.

    Read the article

  • Symlink is using both locations?

    - by Tiago Rossi
    I've done some research and didn't find any answers, so I decided to ask here. For context, the /dev/sda2 disk of my WHM/cPanel web server got 100% full. The /var/ folder is on /dev/sda2, and I found that the cause of the issue is the /var/lib/mysql folder. To fix it I need to move the /var/lib/mysql folder from /dev/sda2 to /home/, where I have plenty of disk space. So I ran these commands:
    service mysql stop
    cp -r -p /var/lib/mysql/ /home/databasesmysql/
    mv /var/lib/mysql /var/lib/mysql.backup/
    ln -s /home/databasesmysql/ /var/lib/mysql
    service mysql start
    OK, now to check whether it's running from the new location, I renamed /var/lib/mysql to /var/lib/mysql.backup and MySQL stopped working. Also, when I rename the /home/databasesmysql/ folder, MySQL also stops working. I don't know what's happening; is the symlink using both locations? Thanks very much.
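
    For reference, a minimal sketch of the same move done by changing MySQL's data directory instead of symlinking over /var/lib/mysql (paths are taken from the question; the /etc/my.cnf location is an assumption, and cPanel or AppArmor specifics may differ):

      service mysql stop
      mkdir -p /home/databasesmysql
      # copy the data, preserving ownership, permissions and timestamps
      cp -a /var/lib/mysql/. /home/databasesmysql/
      # keep the old directory as a backup rather than symlinking over it
      mv /var/lib/mysql /var/lib/mysql.backup

      # then point MySQL at the new directory in /etc/my.cnf (assumed location):
      #   [mysqld]
      #   datadir=/home/databasesmysql

      service mysql start

    With datadir pointing at the real directory there is no symlink for mysqld (or a cPanel update) to resolve, which sidesteps the "which location is it using?" confusion.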

    Read the article

  • How to effectively use an overseas SEO team?

    - by Dan Gayle
    My company is currently under contract with a 20+ person team in the Philippines, previously used for comment linking and guest-blogging spun-content articles. This is a practice we're stopping, but we don't want to sever ties with the team because they work hard, they're really cheap, and they produce excellent accounting and reporting of their actions. What are the ways we can best put them to use as a link-generating or content-generating resource? Their English is fair, but not of high enough quality to use them for any direct content creation. Thanks

    Read the article

  • Implications of automatically allowing "open" third-party domain aliasing to one of my subdomains

    - by Giovanni
    I have a domain, let's call it www.mydomain.com, where I have a portal with an active community of users. In this portal, users cooperate, wiki-style, to build some "kind of software". These software applications can then be run by accessing "public.mydomain.com/softwarename". I then want to let my users run these applications from their own subdomains. I know I can do that by automatically modifying the .htaccess file; this is not a problem. I want to let these users create DNS aliases so they can access one specific subdomain. So if a user "pippo" who owns "www.pippo.com" wants to run the software HelloWorld from his own subdomain, he has to:
    1. Register on my site.
    2. Create his own subdomain on his own site, run.pippo.com.
    3. From his DNS control panel, create a CNAME record for "run.pippo.com" pointing to "public.mydomain.com".
    4. Type http://run.pippo.com/HelloWorld in a browser.
    When the software (which physically runs on my server) is called, it first checks that the originating domain is a trusted one; I don't do any other kind of check restricting software execution. From an SEO perspective, I care about Google's indexing of www.mydomain.com but not about indexing of public.mydomain.com. What are the possible security implications of doing this for my site? Is there a better way to do this, or existing software that already does this that I could use?
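
    For illustration, here is one way the "trusted domain" check could be expressed at the web-server level; a sketch in Apache mod_rewrite for public.mydomain.com, where in practice the allowed host names would be generated from the registered users (run.pippo.com is just the example from the question):

      RewriteEngine On
      # allow only the canonical host and registered aliases; reject anything else
      RewriteCond %{HTTP_HOST} !^public\.mydomain\.com$ [NC]
      RewriteCond %{HTTP_HOST} !^run\.pippo\.com$ [NC]
      RewriteRule ^ - [F]

    Note that the Host header is supplied by the client: it tells you which alias the request was addressed to, not who controls that domain, so treat it as identification rather than authentication.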

    Read the article

  • How can I host a website on a dynamically-assigned IP address?

    - by nick
    I recently upgraded my internet connection to the point that it is much faster and more reliable than my current web host. I would like to move my domain to be hosted at home, but my IP address is dynamic. As far as I know, I only get a new IP when I restart my modem and/or router (which is almost never) or when Cable One (my ISP) pushes out a firmware update (rarely). There are a few ways I can see to do this:
    1. Convince my ISP to give me a static IP.
    2. Assign my router my current IP to force a static IP (which might work?).
    3. Set my DNS record to my current IP address and update it on the rare occasions that it changes.
    Obviously I'm hoping that the first one works, but I don't want to pay a lot of extra money (if that's what it takes) to get a static IP address. Which of these options will work most reliably?
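
    A sketch of option 3 automated with a cron job, assuming your DNS provider exposes an HTTP update endpoint; the endpoint URL and token below are hypothetical placeholders, not a real API:

      #!/bin/sh
      # run from cron (e.g. every 15 minutes): update the A record when the public IP changes
      CURRENT_IP=$(curl -s https://api.ipify.org)      # public IP lookup service
      LAST_IP_FILE=/var/tmp/last_public_ip

      if [ "$CURRENT_IP" != "$(cat "$LAST_IP_FILE" 2>/dev/null)" ]; then
          # hypothetical provider endpoint and token; substitute your DNS host's real update API
          curl -s "https://dns.example-provider.com/update?host=www.example.com&ip=$CURRENT_IP&token=SECRET"
          echo "$CURRENT_IP" > "$LAST_IP_FILE"
      fi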

    Read the article

  • How should I deal with user agent parsing in logs?

    - by Mr. Jefferson
    My web app project includes logging functionality so we can see where visitors are coming from (referrer URL), what the popular user agents are, what pages are most popular, etc. The log is stored in SQL Server, and when I query the user agents I use a large (almost 100 lines) and growing CASE statement to separate them using string matching (e.g. if the user agent contains the string "Firefox/9", then it's Firefox 9). Is there a better way to do this, so I don't have to continually add to that CASE statement to deal with new browser releases? Also, how should I deal with less common, weird/unknown user agents? I've seen the following in the logs and been unable to find good information online about what they are: WordPress/3.3.1; http://www.facecolony.org Mozilla/4.0 ( http://www.hairirons.org redips; <a href=http://hairirons.org/>chi hair iron</a>) I'd guess they're bots/crawlers, but the sites they point to don't appear to reference web crawlers (or even be available sometimes). I've seen other user agents that aren't familiar to me, but I know they're bots because they include "bot" or "spider" or something similar.
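
    One way to stop the CASE statement from growing is to move the patterns into data; a minimal T-SQL sketch, where the table and column names (UserAgentPattern, RequestLog) are made up for illustration:

      -- known user-agent substrings; give the most specific patterns the lowest Priority
      CREATE TABLE UserAgentPattern (
          Pattern     nvarchar(100) NOT NULL,  -- substring to look for, e.g. 'Firefox/9'
          BrowserName nvarchar(50)  NOT NULL,  -- label to report, e.g. 'Firefox 9'
          Priority    int           NOT NULL   -- lower number wins when several patterns match
      );

      -- classify each logged user agent with the best-matching pattern, if any
      SELECT l.UserAgent,
             COALESCE(p.BrowserName, 'Unknown / probable bot') AS Browser
      FROM   RequestLog AS l
      OUTER APPLY (SELECT TOP (1) BrowserName
                   FROM   UserAgentPattern
                   WHERE  l.UserAgent LIKE '%' + Pattern + '%'
                   ORDER  BY Priority) AS p;

    New browser releases then become INSERTs into UserAgentPattern rather than edits to the query.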

    Read the article

  • Why is my page ranked *very* low in Google? [closed]

    - by duality_
    I have created a web site mianatura.net that doesn't even rank in the top 100 results in Google for the query "Mia Natura". I have the text Mia Natura in the domain, <title>, <h1>, I have the site in Google Webmaster Tools, the site is crawled (finding 172 results for site:mianatura.net). I have checked my standing manually (going through the SERPs), using What Page of Search Am I on and diyseo. The site is not involved in any dubious link building campaigns (as far as I know). So what's going on?

    Read the article

  • Please explain some of the features of the URL Rewrite module for a newbie

    - by kunjaan
    I am learning to use the IIS URL Rewrite module, and some of the "features" listed on its page are confusing me. It would be great if somebody could explain them to me and give a first-hand account of when you would use each feature. Thanks a lot!
    - Rewriting within the content of specific HTML tags
    - Access to server variables and HTTP headers
    - Rewriting of server variables and HTTP request headers (what are "server variables", and when would you define or redefine them?)
    - Rewriting of HTTP response headers
    - HtmlEncode function (why would you use HtmlEncode on the server?)
    - Reverse proxy rule template
    - Support for IIS kernel-mode and user-mode output caching
    - Failed Request Tracing support
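
    For context, a minimal sketch of an inbound rule in web.config that touches a server variable, assuming URL Rewrite 2.x and that HTTP_X_ORIGINAL_HOST has been added to the module's allowed server variables list (the rule name and URLs are made up):

      <system.webServer>
        <rewrite>
          <rules>
            <rule name="Proxy to backend (sketch)" stopProcessing="true">
              <match url="^api/(.*)" />
              <serverVariables>
                <!-- remember the host the client asked for before the rewrite -->
                <set name="HTTP_X_ORIGINAL_HOST" value="{HTTP_HOST}" />
              </serverVariables>
              <action type="Rewrite" url="http://backend.internal/{R:1}" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>

    Server variables such as HTTP_HOST or REMOTE_ADDR describe the current request; rules can read them in conditions and, as above, set custom ones before handing the request on.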

    Read the article

  • Is there any good hosting for ASP.NET and MySQL?

    - by HAJJAJ
    Hi everyone. I have an account with a hosting company; I built my project in ASP.NET and used MySQL for the database. The hosting company is not giving me full privileges to create new users or new stored procedures. This is what they told me: "Due to the shared nature of our environment we had to make some modifications to your procedure (namely the definer). We also had to review your procedure to determine if it would be compatible with our environment. While your procedures will work (via phpMyAdmin or some other interface), it is unlikely they will be accessible via the Connector/.NET (ADO.NET) that your application is likely using. This is due to a security restriction with how that connector works in shared environments. http://dev.mysql.com/doc/refman/5.0/en/connector-net-programming-stored.html 'Note: When you call a stored procedure, the command object makes an additional SELECT call to determine the parameters of the stored procedure. You must ensure that the user calling the procedure has the SELECT privilege on the mysql.proc table to enable them to verify the parameters. Failure to do this will result in an error when calling the procedure.' Unfortunately, giving read privileges on the mysql.proc table will give you access to the data of our other customers and that is not an acceptable risk. If your application can only work using stored procedures, then MSSQL will probably be the better option for your site. I apologize for the inconvenience and the wait to have this ticket completed." So, is there any good host that anybody has already used to publish an ASP.NET and MySQL project? This is one of my stored procedures; I think it's simple and will not harm any other users:

    -- --------------------------------------------------------------------------------
    -- Routine DDL
    -- Note: comments before and after the routine body will not be stored by the server
    -- --------------------------------------------------------------------------------
    DELIMITER $$

    CREATE DEFINER=`root`@`localhost` PROCEDURE `SpcategoriesRead`(
        IN PaRactioncode VARCHAR(5),
        IN PaRCatID BIGINT,
        IN PaRSearchText TEXT
    )
    BEGIN
        -- Create a temporary table to hold the rows selected by the action code
        DROP TEMPORARY TABLE IF EXISTS tmp;
        CREATE TEMPORARY TABLE tmp (
            CatID BIGINT PRIMARY KEY NOT NULL,
            CatTitle TEXT,
            CatDescription TEXT,
            CatTitleAr TEXT,
            CatDescriptionAr TEXT,
            PictureID BIGINT,
            Published BOOLEAN,
            DisplayOrder BIGINT,
            CreatedOn DATE
        );

        IF PaRactioncode = 1 THEN
            -- Retrieve all rows from the database
            INSERT INTO tmp
            SELECT CatID, CatTitle, CatDescription, CatTitleAr, CatDescriptionAr,
                   PictureID, Published, DisplayOrder, CreatedOn
            FROM tbcategories;
        ELSEIF PaRactioncode = 2 THEN
            -- Retrieve a single row by ID
            INSERT INTO tmp
            SELECT CatID, CatTitle, CatDescription, CatTitleAr, CatDescriptionAr,
                   PictureID, Published, DisplayOrder, CreatedOn
            FROM tbcategories
            WHERE CatID = PaRCatID;
        ELSEIF PaRactioncode = 3 THEN
            -- Retrieve published rows in display order
            INSERT INTO tmp
            SELECT CatID, CatTitle, CatDescription, CatTitleAr, CatDescriptionAr,
                   PictureID, Published, DisplayOrder, CreatedOn
            FROM tbcategories
            WHERE Published = 1
            ORDER BY DisplayOrder;
        END IF;

        IF PaRSearchText IS NOT NULL THEN
            SET PaRSearchText = CONCAT('%', PaRSearchText, '%');
            SELECT CatID, CatTitle, CatDescription, CatTitleAr, CatDescriptionAr,
                   PictureID, Published, DisplayOrder, CreatedOn
            FROM tmp
            WHERE CONCAT(CatTitle, CatDescription, CatTitleAr, CatDescriptionAr) LIKE PaRSearchText;
        ELSE
            SELECT CatID, CatTitle, CatDescription, CatTitleAr, CatDescriptionAr,
                   PictureID, Published, DisplayOrder, CreatedOn
            FROM tmp;
        END IF;

        DROP TEMPORARY TABLE IF EXISTS tmp;
    END$$
    DELIMITER ;

    Read the article

  • Website development: where to start? [closed]

    - by hopefulLLl
    Hello everyone. I am a computer science student and I know the C language. I want to learn to make websites but don't know how to go about it. I have learned some HTML and am currently learning CSS from www.w3schools.com. Can anyone tell me what I should learn next, and which languages I need to learn to start making websites? Please also point me to study material if you can. Thanks. Also, how long will it take me to make some nice websites?

    Read the article

  • Should I use rel=index or rel=contents in this instance?

    - by Martin Bean
    I'm creating an MMA website. There's the home page, there's a fighters section whose index page lists fighters in the organisation, and then each fighter has a profile page. The URL structure is like this: / /fighters/ /fighters/john-doe My question is: on the fighter's profile page I want to link to the fighters index page (/fighters). In my HTML page, which link element would be the most appropriate? <link href="/fighters/" rel="index" /> Or: <link href="/fighters/" rel="contents" /> I'm having trouble distinguishing which would be best, and whether rel="index" refers to the index for the whole site or for the current page/section I'm viewing.

    Read the article

  • Why is Google Webmaster Tools crawling invalid URLs and showing 500 errors?

    - by Amos Kane
    Google Webmaster Tools is reporting 12k+ 500 errors. Eeek! None of the URLs are valid; they all contain www.youtube.com. First, why is Google crawling these URLs if they don't exist? I supplied a sitemap, and they are of course not in the sitemap. I don't have a robots.txt blocking anything. I've checked for invalid redirects: none. I've checked for unclosed tags or something that would throw www.youtube.com into the URL by accident: none. In every 'linked from', the referring URL is also a bad URL with www.youtube.com in it. Google reports no malware, and I can't check the server logs because the host won't give me access. Really stuck!! Any ideas appreciated!

    Read the article

  • Shared gallery of activist printing materials

    - by Dave
    What would you say is the best way to do this: we would like to create a section on our activist community's Facebook page and website to share with everyone images and files ready for printing pamphlets, brochures, t-shirts, stickers, etc. Let's say we have some cool slogans for t-shirts; we would like to show them in a gallery and offer for download the original design files a print shop would need to create the t-shirts, and the same for all other kinds of media. We want anyone to be able to just download the files for free and easily create printed materials with them. But besides offering this hybrid between a picture gallery and a downloads manager, we would also like to make it very easy for anyone to upload and share their own files with the community, to make it a true collaboration initiative, whether uploads get posted automatically or we first review and approve them. Cafepress and Spreadshirt let you upload your design and sell your own merchandise; we need something similar, but where people can then download working files for making quality prints and materials. What apps, tools, services or methods are out there with which you think this could best be done? We have some ideas, but we would like to hear some more!

    Read the article

  • Move website from host A to host B without downtime during the DNS change

    - by grigione
    I would like to move my website from host A to host B. I have uploaded a copy of my site to the new host while keeping the old copy in place with the old host. I will need to update the nameservers to point to the new nameservers, and I'll want to change the DNS settings of the domain name to point to new host B. To avoid downtime while the DNS change propagates through the net, can I add the old nameservers and the new nameservers together without causing problems, or must I delete the old nameservers first? What happens to my website when it points to two different sets of nameservers?
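
    One common way to keep the cutover window short is to lower the TTL on the relevant records well before the switch; a minimal sketch of the idea in BIND zone-file syntax, with placeholder names and addresses:

      ; days before the move: drop the TTL so resolvers re-check frequently (300 s = 5 minutes)
      www.example.com.   300   IN  A   203.0.113.10    ; old host A

      ; at cutover: change the address; cached answers expire within the short TTL
      www.example.com.   300   IN  A   198.51.100.20   ; new host B

    Keeping the site live on both hosts during the propagation window is what actually avoids downtime, since either answer a resolver returns still serves a working copy.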

    Read the article

  • A particular website URL suddenly disappeared from Google search results

    - by Ragavendran Ramesh
    I have a website in which a particular page URL was indexed within the first 10 Google search results, but suddenly it disappeared; now the page is not even in the top 100 results. What could be the reason? I have a feeling the page has been spammed by our competitors. Is it possible to prevent that, or to find out whether the page has been spammed or not? Is it possible to determine whether a particular page on a website is spam or malicious?

    Read the article

  • Can third party content on sub-domains harm the main site's search rankings?

    - by dror
    I have a site that is a "portal" or "directory" for service providers. We opened a page for every service provider on our site, but now we get a lot of requests from those providers who want sites of their own. We want to make a full site for every service provider, but would rather put them on subdomain URLs. (They don't mind; it's OK for them.) So, my site is www.example.com and their sites will be: provider.example.com Now I have two questions: Can the content on the provider sites harm my site's SEO? If one of those subdomains is punished by Google because the owner does "black hat SEO", how will it affect the root domain? Can it cause the root domain to be punished?

    Read the article

  • Nginx routing script for Node.js and WordPress

    - by Nilay Parikh
    We are moving our blogs and site from WordPress to Node.js and are ready to move into production. However, I'm not able to figure out how to implement routing from the front server (Nginx) to Node.js (the preferred web instance) such that, during the transition period, if the data has not yet been synced into the Node.js website (so Node.js throws a 404), we fall back (via reverse proxy) to WordPress and serve the page from there. Q1. Is this approach good for the scenario, or can anyone suggest a better one? Q2. Should Node.js act as the reverse proxy itself (using bouncy: https://github.com/substack/bouncy or a similar package) in the event of a fallback, or should I stick with Nginx doing so via the FastCGI approach? Both Node.js and WordPress are on a single server.
    First scenario: User -> Nginx -> NodeJS (8080); if the resource is available, Node.js serves it directly; if not, Node.js reverse-queries WordPress and serves its content.
    Second scenario: User -> Nginx -> NodeJS (8080); if the resource is available, Node.js serves it directly; if not, Node.js returns a 404 to Nginx and an Nginx rule falls back to WordPress (FastCGI PHP).
    Later we plan to phase out WordPress and PHP from the server environment completely. I'd like to see any examples of Nginx or Varnish configs and/or Node.js scripts that you have, for me to refer to. Thanks.
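
    A minimal Nginx sketch of the second scenario, assuming the Node.js app listens on 127.0.0.1:8080 and WordPress runs through PHP-FPM on a local socket (the docroot and socket paths are assumptions):

      server {
          listen 80;
          server_name example.com;

          root /var/www/wordpress;            # WordPress docroot (assumed path)

          location / {
              # try Node.js first; if it answers 404, fall back to WordPress
              proxy_pass http://127.0.0.1:8080;
              proxy_set_header Host $host;
              proxy_intercept_errors on;      # let error_page act on upstream errors
              error_page 404 = @wordpress;
          }

          location @wordpress {
              # standard WordPress front controller via PHP-FPM
              include fastcgi_params;
              fastcgi_param SCRIPT_FILENAME $document_root/index.php;
              fastcgi_pass unix:/var/run/php-fpm.sock;   # assumed socket path
          }
      }

    Keeping the fallback in Nginx also means nothing has to change later except deleting the @wordpress block once the migration is finished.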

    Read the article

  • Why is "googlehosted.com" in the DNS records for our website after signing up for DDOS protection?

    - by Blake Nic
    Recently we had to get some DDoS protection for our website because of the large attacks we were seeing after gaining a bit of popularity. We handed over our domain and hosting information to our DDoS protection provider. It works perfectly, but I have a question. In our DNS records we have the Host, the Answer and the Type. The Host has our domain name. The Answer is this: SOMETEXTXXXX.dv.googlehosted.com. When I copy and paste it into my browser it gives me a 404 error, yet our website still loads and functions as it should. I don't understand why it is needed. I asked the provider about this and they said it is a method for DDoS protection and that the other IPs are the reverse proxy (the other IPs give a 404 error too). Can anyone expand on this more, please? How does all of this tie together so that the browser knows where to send the visitor, with all these reverse proxies involved? Here is an image for reference:

    Read the article

  • Visitors have old website cached in their browsers

    - by RussianBlue
    My client's new website is example.com; the old website is example.co.uk. I've re-pointed the A records to the new website (so as to leave the emails alone) and put in 301 redirects from old pages to new pages. But my client is upset, as he (and, he thinks, many of his clients) has the old website cached in his browser and won't know how to clear the browser cache. Is there anything I can do to overcome this? If not, after roughly how long will browsers finally stop using their cached pages, so that I can at least go back to my client and tell him when his clients will start to see the new website?

    Read the article

  • Is there any negative impact with similar page titles and descriptions on similar sites?

    - by ElHaix
    Currently we have Canadian versions of some websites. We are going to create American versions, which are essentially the same in every respect except that the search results are geo-specific to the USA. The format for the results page titles and descriptions will remain the same, i.e. {0} in {1} | Find more {0} etc etc etc... {1}. The search term will most likely be the same on both sites. Will the relative similarity in the page titles and descriptions between the Canadian and USA sites have any negative SEO impact, given that the geo-location would be the most significant difference?

    Read the article

  • Groups page is blank in SharePoint 2010 [migrated]

    - by Murali Ramakrishnan
    Sometimes it's very confusing how SharePoint 2010 group creation works. Here's a scenario we have been facing for a long time with groups in SharePoint 2010. We had a requirement to create two custom groups, followed by creating a custom site, programmatically. In most cases the scenario works as expected, but in roughly 1 out of 100 site creations the group creation fails in an odd way: we can still access the group and the users associated with it programmatically, but from the UI standpoint, if you try to access that group's page from the site permissions page, SharePoint returns a BLANK WHITE page and nothing else. Is this a SharePoint 2010 issue, or has anybody had this problem and fixed it? Kindly share your thoughts.

    Read the article

  • Video monetising and a simple shop: which platform?

    - by fieldman
    I have this client that wants a single-product website. The product is a training video that they want to deliver virtually and, optionally, physically. I usually do all the front-end design and back-end development, but the budget is close to $0 to start with. So I'm looking for a platform like Shopify, or something similar, where a shop/cart can be set up quickly and simply with minimal up-front cost, but which can accommodate some kind of paywall (DRM too?) for the online video, with an option to purchase the physical DVD for an additional cost. Am I approaching this the wrong way altogether? Or do you know of any platform that will accommodate these specs?

    Read the article

  • My Flex 3 Website Doesn't Have Any Keywords Listed in Google's Webmaster Tools

    - by Laxmidi
    Hi, I've got a Flex 3 website. When I look in Google Webmaster Tools under "Your site on the web" > Keywords, none are listed. Does anyone have an all-Flex site that has keywords listed there? The site's been up for about a month and has been indexed by Google. I have keyword meta tags on the site, which, from what I've read, Google ignores. Where does Google come up with the keywords for a site? Any suggestions on what I need to do? Thank you. -Laxmidi www.brainpinata.com

    Read the article

  • Good free CSS Sprite for icons

    - by Saif Bechan
    I am working on a small project where I need some of the basic icons: edit, favorite, delete. You know them. Now I could download them all separately and put them together in a sprite myself, but I was wondering if there are ready-to-download sprites I can use. I am working on an accounting app, so it would be nice if the icons were not too childish; a slightly fancy, business-type icon style would be ideal. Thanks
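
    For reference, a minimal sketch of how a downloaded sprite is usually wired up in CSS; the file name and pixel offsets are made up and depend on the sprite you pick:

      /* one shared background image for all icons */
      .icon {
          display: inline-block;
          width: 16px;
          height: 16px;
          background-image: url("icons-sprite.png");   /* hypothetical sprite file */
          background-repeat: no-repeat;
      }

      /* each icon just shifts the background to its cell in the sprite */
      .icon-edit     { background-position:   0     0; }
      .icon-favorite { background-position: -16px   0; }
      .icon-delete   { background-position: -32px   0; }

    Usage is then, for example, <span class="icon icon-edit"></span>.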

    Read the article
