Search Results

Search found 24378 results on 976 pages for 'pinned site'.


  • Consolidating multiple domain names

    - by Mike
    I have a client that has three separately hosted copies of their website, each on a separate domain name. The websites are all essentially the same, bar a few discrepancies caused by badly managed updates in the past. I will soon be launching a completely new website for them, at which point, all three domain names are to resolve to the same web server. One domain name will become the default domain name that they refer to in all their literature, and the other two will simply be used as catch-alls for old links, bookmarks, and so on. I would like to know what people consider the best route to achieve this. My plan so far is:

    1. Get the new site up and running on the new webserver.
    2. Change the relevant A record of the default domain name to point to the new webserver.
    3. a) Keep the existing hosting accounts in operation. Create a list of 301 redirects from old page names on the old site to new page names on the new site. or
       b) Configure CNAME records for the non-default domain names, each pointing to the new webserver. Create a list of 301 redirects on the new site that redirect from old page names to new page names.

    If my understanding is correct, 3a will help to maintain whatever search engine rankings the sites already have (I know it's not going to be perfect), while at the same time informing search engines that the old domain names are no longer in use. What's a good approach to take here?
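
    A minimal sketch of the redirect piece, assuming the new webserver runs Apache; default-domain.com, old-domain.com and /old-page.html are hypothetical placeholders. The idea is a catch-all virtual host that answers for the old names and 301s everything to the default domain.

        <VirtualHost *:80>
            ServerName old-domain.com
            ServerAlias www.old-domain.com other-old-domain.com www.other-old-domain.com
            # Known old page names map to their new locations first
            Redirect permanent /old-page.html http://default-domain.com/new-page/
            # Anything not matched above falls through to the new home page
            RedirectMatch permanent ^/(.*)$ http://default-domain.com/
        </VirtualHost>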

    Read the article

  • httpd.config Easy Apache WHM CentOS

    - by jessie
    First let me explain how I got into this situation. I run a streaming video site. Videos are about 100-250MB in size, and at any given time there are 500 people on the site, so I guess that would make them static files. Recently my site started getting really slow, and the only way to fix it temporarily was to restart Apache. There was no change in traffic that could have caused this, and my site is not being attacked. My hosting company recommended implementing mpm_mod and suPHP, which they did using EasyApache in WHM. After that everything worked, but a little slowly. I researched around, and my understanding is that the MPM change will do that but be more stable. I was told that installing FastCGI would speed things up just enough. Well, that made everything worse: the site is slow and times out. I used WHM and took FastCGI off, but it's still the same; it seems like nothing changes no matter what I do now. I even did a rollback on the httpd.conf file, but that didn't work. I'm not sure how to fix this, and my hosting network guy won't be able to touch the problem until Tuesday. I have root access.
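
    For reference, the worker and request limits live in the MPM section of httpd.conf; a hedged sketch of a prefork tuning block is below. The directive values are illustrative placeholders, not recommendations for this server, and whether prefork or worker is in use depends on what EasyApache built.

        <IfModule prefork.c>
            StartServers         10
            MinSpareServers      10
            MaxSpareServers      20
            ServerLimit         256
            MaxClients          256
            MaxRequestsPerChild 4000
        </IfModule>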

    Read the article

  • Securing php on a shared apache

    - by Jack
    I'm going to install Apache+PHP on a server where two users, A and B, will deploy their websites. I'm trying to achieve isolation of the users' space for security reasons: that is, no script from site A should be able to read files in site B. To achieve this I installed suPHP. Website files of user A are owned by A:A with perm=700, and those of user B are owned by B:B with perm=700. suPHP works great, but Apache complains about permissions when reading .htaccess. How can I let Apache read .htaccess in every directory of A and B while keeping the isolation between site A and site B? I played with ownership (group = www-data) and permissions (750), but I found no way to preserve the isolation. Any ideas? Maybe by running Apache as root, but in that case are there any drawbacks?
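
    One common scheme, sketched below with hypothetical paths and assuming Apache runs in the www-data group while suPHP runs each site's PHP as its own user, is to keep each user as owner and give the web server group read/traverse rights only:

        # Apache (group www-data) can traverse directories and read .htaccess,
        # but A's scripts run as user A under suPHP and so cannot read B's files.
        chown -R A:www-data /var/www/siteA
        chown -R B:www-data /var/www/siteB
        find /var/www/siteA /var/www/siteB -type d -exec chmod 750 {} \;
        find /var/www/siteA /var/www/siteB -type f -exec chmod 640 {} \;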

    Read the article

  • Naming your website longname.com vs shortcatchy.net vs shortcatchy.info

    - by jskye
    I'm designing a website that will basically be a social network for sharing information. I have the domain $$$$d.net and the same domain as $$$$d.info, where $$$$ is a word (that runs into the d) pertaining to the purpose of the site. The .com of this domain was already taken, but they've got nothing showing - only a "not reached" error in Google - and they don't seem to be trying to sell it either. I also have the long name of the site, $$$$------&&&&&&&&&.com, where the words $$$$ and &&&&&&&&& would contribute relevant SEO to the site. In fact the word $$$$------ would too, if a one-letter spelling mistake is recognised at all by Google, which I doubt but am unsure about. But as a brand name the $$$$------ word still works relevantly. Which do you think is a better choice to use?

    1. The short catchy name with the .info, for relevance to information.
    2. The .net, which is more familiar than .info but maybe slightly less relevant (though I think net as in network still works, because as I said it will be a social networking site).
    3. The long .com domain, which has more SEO plus a pun, albeit on a spelling mistake.

    I know it's kind of a subjective question and also hard to answer without knowing the name (which I've obfuscated because I'm only in the initial design stage), but nevertheless I'm interested in what some of you guys think.

    Read the article

  • Weird IIS with Windows Authentication + IE problem

    - by Paulius Maruška
    I have a website running on IIS and using Windows Authentication. All users that are configured to get access to the site are from an AD domain (not local users). In the properties of the website, I have set the AD domain as the realm. Now, when using Firefox, Safari or Chrome, everything is fine: when the user tries to open the site, he gets the login box, enters simply "username" and "password" (let's pretend that it's an actual login and password :P) and gets into the site. When using IE, however, things get nasty. When the user tries to open the site, he gets the login box. The user enters the "username" and "password" again, but those get rejected! And when the login box pops up the second time, it has the username filled in as "web-server-domain-name\username", which is wrong, because web-server-domain-name is not the domain where the users reside (that's "ad-domain"). I've spent days trying to figure out what's going on... Note that if I manually enter "ad-domain\username" I get accepted into the site without problems. So my guess is that IE sends the wrong username if the domain is not specified. Anyway, IE is the only browser that triggers this behavior! Is it possible to do a server-side fix? Maybe it's possible to somehow auto-map the users to AD users? If it's not solvable server-side, is there a client-side fix for this? Thank you. PS: I'm more of a programmer than a sysadmin, so configuring servers isn't my strong side... :P

    UPDATE: @Evan: Yes, "Digest authentication for Windows domain servers" is also enabled. @Eric: The IIS version is 6.0. The authentication methods enabled are Integrated and Digest; all other methods are disabled. As for the security log: I looked at it when doing a "username" and "password" login in Chrome/Firefox and when doing an "ad-domain\username" and "password" login from IE, and the generated log messages are the same (I see no difference, anyway). When entering "username" and "password" I don't see any errors in the security (or any other) log, so I can't tell what method it's trying to use.

    UPDATE 2: As suggested by Eric in the comments, I played around with Fiddler... While doing so, I noticed that when "username" and "password" are entered in FF and IE, the "Authorization" header value (encrypted) sent by IE is almost twice as long as the one sent by FF. I tried disabling Windows Integrated authentication and leaving only Digest enabled; that fixed the problem (meaning IE used the right realm, just like the other browsers), but it caused a bazillion other problems with my site, because with Digest, user impersonation on the server doesn't work (which causes problems when connecting to the database, etc.). Any ideas?

    Read the article

  • Can a website have too many bindings?

    - by justSteve
    IIS 7.x on Windows Server 2008 Web edition, on a dedicated server. I have a site that's serving a few dozen affiliates, many of which are hitting me via a subdomain of their own root domain, and all of which have a subdomain specific to their account. E.g. my affiliate named 'Acme' hits my site via: myApp.Acme.com (his root, my app) and Acme.MyDomain.com (his account within my root domain). Currently I'm adding each of these as a binding entry in IIS (targeting a discrete IP, not '*'). As I ramp this up to include more affiliates I'm wondering if I should be concerned about how many bindings this site handles. Probably, in Acme's case, I can do without the 'Acme.MyDomain.com' binding because, in reality, all traffic takes place via myApp.Acme.com. Mine is a niche site - very low volume compared to most. At what point do I worry about all those bindings? Thanks
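
    For what it's worth, bindings can also be added from the command line rather than one at a time in the UI, which helps as the affiliate list grows; a sketch using appcmd, with a made-up site name, IP and host header:

        %windir%\system32\inetsrv\appcmd.exe set site /site.name:"MyAffiliateSite" ^
            /+bindings.[protocol='http',bindingInformation='203.0.113.10:80:acme.mydomain.com']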

    Read the article

  • Provisioning SIP Phones over the internet

    - by Jorge Fernandez
    I have a few SIP phones that are located off site and connect to my PBX over the internet to make calls. For some reason one of these phones has become unprovisioned. In my office, phones get provisioned by the server via TFTP. The ones that are off site I pre-provisioned manually before I sent them away (I'm in Florida, the phone is in New Jersey). What's the best way to provision these over the internet? TFTP is very insecure; sending plain-text profiles with the SIP account and password over the internet is out of the question. The phones have been off site for about 6 months without any issues. I'm using Trixbox and Cisco 7940 phones.

    Read the article

  • Links break in IE9 when using Wordpress plugins in a non-WordPress page

    - by mouli
    I have a site that uses SEF URLs and .htaccess RewriteRules to serve up the pages. This has worked fine for several years, until the arrival of IE9. Now it appears that the links are not being rewritten and the site is dead in the water. I have tried different compatibility modes to no avail, I've played with the rewrite rules over and over, and I've tried different doctypes and a few other browser settings. I agree that in theory it cannot be a browser-specific problem if the problem is with the .htaccess file, but this site works in IE8, Firefox and Chrome. I have run the RewriteRule through a validator and it looks fine. The site is www.marlboroughsounds.co.nz, a sample link is http://www.marlboroughsounds.co.nz/walking/freedom-walk-queen-charlotte-track/4dfw, and the rewrite rule that's not working looks like this: RewriteRule ^walking/.*/([a-z0-9_]*)/?$ /walking.php?act_code=$1 [L] The link fails and the server serves up a browser 404 page, not even the custom 404 I have for the site. Any ideas would be much appreciated as I am stumped.

    Read the article

  • Is it possible to use WebMatrix with pure IIS?

    - by Mike Christensen
    I'd like to check out WebMatrix for publishing our site to IIS automatically (right now, I have to zip it up, copy it out, Remote Desktop into the server, unzip it, etc.). However, every example I can find on how to set up WebMatrix involves Azure, or using a .publishsettings file that you'd get from your hosting provider. I'm curious whether I can publish to a normal, everyday IIS server running on Windows Server 2008. So far, all I've done to the IIS server is install Web Deploy, which I believe is the protocol that WebMatrix uses to publish. When I enter the Remote Site Settings screen, I select Enter settings. I select Web Deploy as the protocol, type in my NT domain credentials (I'm an Admin on that server), and put in the site URL for the Site Name and Destination URL. When I click Validate Connection, I get an error. Am I doing something wrong, or is this just not possible to do?
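
    One way to take WebMatrix out of the equation while testing is to drive Web Deploy directly from the command line; a rough sketch follows, with the server name, paths and credentials all hypothetical, assuming Web Deploy V3 and the management service listening on its default port 8172:

        "C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync ^
            -source:contentPath="C:\dev\MySite" ^
            -dest:contentPath="Default Web Site/MySite",computerName="https://myserver:8172/msdeploy.axd?site=Default Web Site",userName="MYDOMAIN\admin",password="hypothetical-password",authType="Basic" ^
            -allowUntrusted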

    Read the article

  • Moving a Drupal site between Linux servers, best practice to avoid file-ownership problems

    - by zero
    I want to port over a Drupal Commons 6x24 site from a local LAMP stack to a production webserver. Both systems run openSUSE Linux. How do I do this, and what are the most important steps? How should I handle file ownership? It's important for me to have full control of the file ownership. If I use the wwwrun account, I frequently run into problems, due to a very strict webserver admin. For the long history of looking for fixes and solutions, see this thread, and for even more interesting reading see this very long and impressive thread here. All the troubles I run into have to do with file ownership and permissions. This is my current setup (note: this was just a quick hacked installation, quick and dirty); my main interest is in the general options I have when porting a Drupal site from one Linux box to another:

        linux-vi17:/srv/www/htdocs/com624 # ls -l
        insgesamt 224
        -rwxrwxrwx  1 root www 45285 19. Jan 00:54 CHANGELOG.txt
        -rwxrwxrwx  1 root www   925 19. Jan 00:54 COPYRIGHT.txt
        -rwxrwxrwx  1 root www   206 19. Jan 00:54 cron.php
        drwxrwxrwx  2 root www  4096 19. Jan 00:54 includes
        -rwxrwxrwx  1 root www   923 19. Jan 00:54 index.php
        -rwxrwxrwx  1 root www  1244 19. Jan 00:54 INSTALL.mysql.txt
        -rwxrwxrwx  1 root www  1011 19. Jan 00:54 INSTALL.pgsql.txt
        -rwxrwxrwx  1 root www 47073 19. Jan 00:54 install.php
        -rwxrwxrwx  1 root www 15572 19. Jan 00:54 INSTALL.txt
        -rwxrwxrwx  1 root www 14940 19. Jan 00:54 LICENSE.txt
        -rwxrwxrwx  1 root www  1858 19. Jan 00:54 MAINTAINERS.txt
        drwxrwxrwx  3 root www  4096 19. Jan 00:54 misc
        drwxrwxrwx 35 root www  4096 19. Jan 00:54 modules
        drwxrwxrwx  4 root www  4096 19. Jan 00:54 profiles
        -rwxrwxrwx  1 root www  1470 19. Jan 00:54 robots.txt
        drwxrwxrwx  2 root www  4096 19. Jan 00:54 scripts
        drwxrwxrwx  4 root www  4096 19. Jan 00:54 sites
        drwxrwxrwx  7 root www  4096 19. Jan 00:54 themes
        -rwxrwxrwx  1 root www 26250 19. Jan 00:54 update.php
        -rwxrwxrwx  1 root www  4864 19. Jan 00:54 UPGRADE.txt
        -rwxrwxrwx  1 root www   294 19. Jan 00:54 xmlrpc.php
        linux-vi17:/srv/www/htdocs/com624 #

    Thanks to BetaRide's answer, here is a quick overview of the drush rsync functionality (http://drush.ws/):

        core-rsync: Rsync the Drupal tree to/from another server using ssh.
        Examples:
          drush rsync @dev @stage            Rsync Drupal root from dev to stage (one of which must be local).
          drush rsync ./ @stage:%files/img   Rsync all files in the current directory to the 'img' directory
                                             in the file storage folder on stage.
        Arguments:
          source         May be rsync path or site alias. See rsync documentation and example.aliases.drushrc.php.
          destination    May be rsync path or site alias. See rsync documentation and example.aliases.drushrc.php.
        Options:
          --mode                 The unary flags to pass to rsync; --mode=rultz implies rsync -rultz. Default is -az.
          --RSYNC-FLAG           Most rsync flags passed to drush sync will be passed on to rsync. See rsync documentation.
          --exclude-conf         Excludes settings.php from being rsynced. Default.
          --include-conf         Allow settings.php to be rsynced.
          --exclude-files        Exclude the files directory.
          --exclude-sites        Exclude all directories in "sites/" except for "sites/all".
          --exclude-other-sites  Exclude all directories in "sites/" except for "sites/all" and the site directory
                                 for the site being synced. Note: if the site directory is different between the
                                 source and destination, use --exclude-sites followed by "drush rsync @from:%site @to:%site".
          --exclude-paths        List of paths to exclude, separated by : (Unix-based systems) or ; (Windows).
          --include-paths        List of paths to include, separated by : (Unix-based systems) or ; (Windows).
        Topics: docs-aliases     Site aliases overview with examples
        Aliases: rsync
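
    As for ownership on the production box, a commonly recommended Drupal scheme is sketched below; "deployuser" is a hypothetical account, and on openSUSE the Apache user/group is typically wwwrun/www:

        # Code owned by the deploy user, readable (not writable) by the web server group
        chown -R deployuser:www /srv/www/htdocs/com624
        find /srv/www/htdocs/com624 -type d -exec chmod 750 {} \;
        find /srv/www/htdocs/com624 -type f -exec chmod 640 {} \;
        # Only the files directory needs to be writable by the web server
        find /srv/www/htdocs/com624/sites/default/files -type d -exec chmod 770 {} \;
        find /srv/www/htdocs/com624/sites/default/files -type f -exec chmod 660 {} \;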

    Read the article

  • What is the best way for an experienced developer to work on a WordPress blog

    - by nanothief
    I'm beginning to work on my first WordPress blog; however, I've noticed most tutorials just have you make modifications (such as theme changes or installing plugins) on the production site. This worries me for a few reasons:

    - No backups
    - No version control
    - If you make a mistake, your production site is affected
    - Developing remotely is slower than local development, especially when tweaking CSS files

    I understand why WordPress works like this: it allows people with no development experience to manage their WordPress installation (or the one provided by their service provider), and it allows you to work on the installation without having SSH access to the server. However, as I am comfortable working with tools like git and ssh, and am using a virtual server for the blog, this isn't very important to me. So I was wondering what techniques experienced developers use when working on a WordPress blog. For example:

    - Do you develop locally, then push the changes to the live site? How do you do this?
    - How do you manage database changes and backups?
    - What do you store under version control (if anything)?
    - If a plugin changes the database, do you somehow track its changes in version control, so you can roll them back if you need to?

    Or maybe I'm just overcomplicating everything, if working on the production site isn't as risky as I think it is. I would appreciate any answers either way.
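
    One possible workflow, sketched below with hypothetical host names, paths and credentials: keep the theme and plugin code in git locally, push it with rsync over SSH, and snapshot the database before each change.

        # Deploy code only; uploads and wp-config.php stay on the server
        rsync -az --delete \
            --exclude wp-config.php \
            --exclude wp-content/uploads/ \
            ./ deploy@blog.example.com:/var/www/blog/

        # Take a dated database backup over ssh before touching plugins
        ssh deploy@blog.example.com \
            "mysqldump -u wpuser -p'hypothetical-password' wpdb | gzip" > wpdb-$(date +%F).sql.gz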

    Read the article

  • How to copy or replicate a complex website to local files and then modify it

    - by Andre Chenier
    I am not good at designing the visual side of a website. I found a website which I'd rate 10 out of 10, because its functionality suits my aims and it also looks very aesthetically pleasing. I know HTML, PHP, MySQL and some degree of CSS; I don't know JS, Ajax or jQuery. So I want to replicate this website (save it completely) locally and then modify it (content, colors, icons, etc.). I saved the site in Chrome and IE, but after opening it from my local folder, I saw an ugly, non-working site. My aim is to understand the functions of the parts that I don't know, for example what will happen when I delete one of its JS files from the page. Since the page is complex, it has lots of CSS and JS files to download, and I don't want to deal with that manually. Is there an alternative, easy way to get the web page completely onto my local machine so that it also works like a charm locally? Regards
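
    One common command-line approach is a wget mirror; a hedged sketch (the URL is a placeholder), noting that this only captures the rendered HTML/CSS/JS that the server sends, never the PHP source or the database behind it:

        # Mirror the site, pull in images/CSS/JS, and rewrite links so it works offline
        wget --mirror --convert-links --adjust-extension --page-requisites --no-parent http://www.example.com/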

    Read the article

  • Active Directory Replication across Sites slow or not working

    - by neildeadman
    I've just inherited (isn't it always the way!) a Windows domain. The domain is spread across 2 sites: Site01 has 3 DCs and Site02 has 2 DCs. If I create a user in either site, the other DCs in that site immediately replicate and show the new user. The new user is not shown in the other site, though. If I manually run the following command, everything syncs and the new user appears: repadmin /syncall issdc01 /APed In the Inter-Site Transports DEFAULTIPSITELINK, the "replicate every" value is set to 180 minutes. I thought this explained it, but on another Windows domain this is set up the same way, yet replication takes place across sites immediately. What can I check to resolve this issue? We are running Windows Server 2008. Results of dcdiag /test:dns show a server that is no longer part of our domain: TEST: Delegations (Del) Error: DNS server: oldserver.win.domain.com IP: [Missing glue A record]
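
    A few read-only commands that are commonly used to narrow this kind of thing down (run on one of the DCs); they only report replication status, so they are safe to try:

        repadmin /replsummary
        repadmin /showrepl issdc01
        repadmin /queue issdc01
        dcdiag /test:replications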

    Read the article

  • Issue with https:// url going to an unknown location

    - by Brandon
    We have a website (ASP.NET/Plesk 9.5.5) that can be accessed just fine through the regular URL (http://example.com). However when accessing the site through https://example.com the site displays the invalid security certificate warning, which is fine since we don't have an SSL certificate. If I add an exception, I'm sent to a completely separate site that is apparently hosting a malware script (I'm still on https://example.com though). Because of this Google has flagged the site as dangerous. I can't find anything in the Plesk panel that would help fix this, and as far as I can tell those files don't exist on our server. How do I tell where the https:// link is sending me? I'm not that familiar with DNS, but is that what is causing this behavior?
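
    One way to see which certificate and server actually answer on port 443, independent of the browser (example.com stands in for the real domain):

        # Show the certificate and server that respond on 443
        openssl s_client -connect example.com:443 -servername example.com
        # Compare where the name actually resolves
        nslookup example.com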

    Read the article

  • How to make Google Chrome omnibar site searches permanent?

    - by tim
    In Google Chrome, when you have visited a site, say wikipedia.com, you can thereafter search that site directly from the omnibar by typing in a few letters and then hitting Tab. However, I noticed that after I clear my cache, Chrome does not remember these search autofills, and I once again have to visit the site manually for it to take effect. Is there a way to make the omnibar site searches permanent, even after doing a full cache clear in Google Chrome? Thanks. Update: I tried the suggestion below, but bookmarking the site just allowed me to autocomplete the address; it did not give me the option to hit Tab and search directly from the omnibar. Any suggestions?

    Read the article

  • Website hosted on IIS is not accessible

    - by Tola Odejayi
    I have two sites set up in IIS on a remote machine RM; one on regular port 80, and the other on port 5773. From my local machine LM, I can access the site on 80, but I cannot access the one on 5773; I get a status code of 502 and an error code of 10060 (A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond) when I try to do this. I can access the 5773 site via IIS when I am logged into RM (i.e. by right clicking on a page on the site and going 'Browse'). I can also access pages on the 5773 site via a browser, again when I am logged into RM. I just can't do the same via a browser when I am logged into LM. I have ensured that port 5773 is open for outgoing traffic on LM. Could the problem be that I also need to ensure that port 5773 is open for inbound traffic on RM?
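
    Since the symptom is a timeout rather than a refusal, the usual suspects are an inbound firewall rule on RM or nothing listening on that port; a few checks and a sample inbound rule are sketched below (the rule name is arbitrary):

        REM On RM: confirm something is listening on 5773
        netstat -ano | findstr :5773
        REM On RM: allow inbound TCP 5773 through Windows Firewall
        netsh advfirewall firewall add rule name="IIS site 5773" dir=in action=allow protocol=TCP localport=5773
        REM From LM: test the connection
        telnet RM 5773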

    Read the article

  • Can I improve my AdWords quality scores with better landing pages?

    - by Eric
    I noticed that I have some keywords in my AdWords that are totally applicable to my site but the quality score of the keyword is 4 or 5. I'd like to get it up higher by creating custom versions of my site's home page (landing page) targeted specifically for people searching on those keywords. So for example, if we pretend my site sells pet food, my current home page has the phrase "dog food." I have a specific AdWords campaign for people searching on cat food (with cat food-specific ads). I'm thinking about changing the URL on those ads to something like http://mysite.com/cat.html, so a different home page comes up with the phrase "cat food." My thinking is that will help Google see that this new landing page is appropriate for the keywords and will raise my quality score for the "cat food" keywords. (Note that none of what I'm doing is shady or misleading; nobody would disagree that all of the keywords and ads I've created are perfect and appropriate for what my site offers.) Question: is what I describe the correct way to raise poor quality scores on keywords, and will it help?

    Read the article

  • Intermediate SSL Certificates on Azure Websites

    - by amhed
    I have successfully configured an Extended Validation certificate on an Azure Website following this article: http://www.windowsazure.com/en-us/documentation/articles/web-sites-configure-ssl-certificate/ The main (non-technical) stakeholder of the web application went to great lengths to validate that our site is secure, and used this site to check the validity of our SSL: http://www.whynopadlock.com/ That check threw the following error: "SSL verification issue (Possibly mis-matched URL or bad intermediate cert.). Details: ERROR: no certificate subject alternative name matches". The certificate is installed using IP-based SSL instead of SNI. This is because some site visitors still use Internet Explorer 8 on Windows XP, which has no support for SNI and throws a security warning. Is my certificate correctly installed? I received three .CRT files from my SSL provider: PrimaryIntermediate.crt, SecondaryIntermediate.crt and EndCertificate.crt. This is how I exported our certificate as a .PFX file for Azure: openssl pkcs12 -export -out myserver.pfx -inkey myserver.key -in myserver.crt
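
    If the intermediate certificates were never bundled into the .PFX, a checker can report exactly this kind of intermediate-certificate complaint; a sketch of re-exporting the PFX with the chain included, assuming the three .CRT files listed above plus the original private key (myserver.key):

        # Bundle both intermediates into one chain file
        cat PrimaryIntermediate.crt SecondaryIntermediate.crt > chain.crt
        # Re-export the PFX with the end-entity certificate plus the chain
        openssl pkcs12 -export -out myserver.pfx -inkey myserver.key -in EndCertificate.crt -certfile chain.crt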

    Read the article

  • Looking for some IIS redirect help/ideas

    - by CoreyT
    Right now we have a site with a LOT of static asp pages, such as www.site.com/123.asp. This is due to how our current site's CMS builds its pages by default. I don't have an exact count, but we have roughly 6000 asp files in the site right now. We are in the middle of a redesign and restructuring of the site, and are looking to migrate to SEO-friendly URLs. The problem we're having is what to do to redirect the old pages to the new friendly URLs. I know how to do redirects, that is not the issue here. The questions I am coming up with right now are listed below.

    1. Is there a limit to the number of redirects in IIS?
    2. Would having even a few thousand redirects affect IIS performance?
    3. My understanding is that we would not be passing along page rank to the new URLs, is that true? (Not a major question, I can ask on more SEO forums if nobody here is sure.)
    4. Would using something like the IIS URL Rewrite 2 module for IIS 7 help us out? Or would I still need to define several thousand unique redirects in it?

    Our server right now is running Server 2003; however, in the redesign I would be open to migrating to Server 2008 R2 if there is a good case for it (i.e. the URL Rewrite module). Thanks for any guidance or help. I have been looking for a good way to do this for a while now and keep coming up with things that sound problematic and bad (such as having 6000 redirects).
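
    For the URL Rewrite 2 route, the usual approach is a rewrite map: one rule plus a key/value table (which could be generated from the CMS database), so the thousands of entries live in configuration data rather than in individual rules. A hedged sketch for web.config, with made-up new page names:

        <system.webServer>
          <rewrite>
            <rewriteMaps>
              <rewriteMap name="LegacyRedirects">
                <add key="/123.asp" value="/widgets/blue-widget" />
                <add key="/124.asp" value="/widgets/red-widget" />
              </rewriteMap>
            </rewriteMaps>
            <rules>
              <rule name="Legacy ASP redirects" stopProcessing="true">
                <match url=".*" />
                <conditions>
                  <add input="{LegacyRedirects:{REQUEST_URI}}" pattern="(.+)" />
                </conditions>
                <action type="Redirect" url="{C:1}" redirectType="Permanent" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>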

    Read the article

  • Can an IP address transfer from one person to another after he disconnects from his ISP, or in any other way?

    - by learner
    I have been checking a website that sells a (health-related) product and trying to find out whether it is a scam site. The site is something.blogspot.in (and not something.blogspot.com, which happens to be a different site altogether), so is it an Indian site? It has a CBox chat box where the owner communicates with customers (or potential ones). The owner shows that his product has worked for people by providing links to a forum (created by him at network54.com) where people have posted positively. One doesn't have to be registered to post there, but the IP address of the poster gets shown along with the post. According to the owner, the IP address is the basis of authenticity. I found that many people had different IP addresses on their different posts. The owner has declared the nationalities of the people who posted, but when I traced their IP addresses with this site, I found that the nationalities provided by the owner were wrong. Is it possible that when a person disconnects from his ISP, another person from another country gets his old IP address?

    Read the article

  • SEO with duplicate content

    - by user16831
    I have a nature photography site with multiple types of photo galleries. Each photo and its associated caption appears in several galleries. For instance, a photo of a goldfinch that was taken on a trip to New Mexico in 2008 will appear in the "goldfinch.php" gallery, in the "finches.php" gallery, and in the "New_Mexico_2008.php" gallery. This duplication is useful for my site visitors - User A may want to see goldfinch photos, whereas User B wants to see photos from New Mexico - but I am concerned about the SEO implications. The typical suggestions for dealing with duplicate content, such as 301 redirects and canonical tags, probably won't work in this case, because the page content is substantially different (ranging from ~1% to ~90% duplication, depending on the specific example chosen). The obvious solution to me would be to edit robots.txt to only allow search engines to crawl one type of gallery - for instance, if they crawled only the galleries organized by species (e.g. goldfinch.php), all the photos on my site would be found exactly once. However, the Google content guidelines recommend against blocking crawler access to duplicate information. Should I go ahead and use robots.txt anyway? Or is there a better solution?
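
    A sketch of the robots.txt idea described above, keeping crawlers in the per-species galleries and out of the overlapping ones (the disallowed file names follow the examples in the question; a real file would list every non-species gallery):

        User-agent: *
        Disallow: /finches.php
        Disallow: /New_Mexico_2008.php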

    Read the article

  • Disaster Recovery Example

    Previously, I used to work for a small internet company that sells dental plans online. Our primary focus concerning disaster prevention and recovery was on our corporate website and private intranet site. We had a multi-phase disaster recovery plan that included data redundancy, load balancing, and off-site monitoring.

    Data redundancy is a key aspect of our disaster recovery plan. The first phase of this is to replicate our data to multiple database servers and schedule daily backups of the databases, which are stored off site. The next phase is the file replication of data amongst our web servers, which are also backed up daily by our colocation provider. In addition to the files located on the servers, files are also stored locally on development machines and again backed up using version control software.

    Load balancing is another key aspect of our disaster recovery plan. Load balancing offers many benefits for our system: better performance, load distribution and increased availability. With our servers behind a load balancer, our system has the ability to accept multiple requests simultaneously because the load is split between multiple servers. Plus, if one server is slow or experiencing a failure, the traffic is diverted amongst the other servers connected to the load balancer, allowing time for the failed server to get back online.

    The final key to our disaster recovery plan is off-site monitoring that notifies all IT staff of any outages or errors on the main website encountered by the monitor. Messages are sent by email, voicemail, and SMS.

    According to Disasterrecovery.org, disaster recovery planning is the way companies successfully manage crises with minimal cost and effort and maximum speed, compared to others that are forced to make decisions out of desperation when disasters occur. In addition, SunGard stated in 2009 that the first step in disaster recovery planning is to analyze company risks and factor in fixed costs for things like hardware, software, staffing and utilities, as well as indirect costs such as floor space, power protection, physical and information security, and management. Availability requirements also need to be determined per application and system, as well as the strategies for recovery.

    Read the article

  • using web proxies - safe to enter passwords?

    - by bergin
    Hi. I wanted to check something on a local site and see how the outside world sees it. However, using a web proxy, I'm not sure that when I enter my credentials the proxy won't record them and give the proxy owner access to my site. Is there another way to see my own site as though I were on the other side?

    Read the article

  • Loading the WordPress admin page blocks further requests

    - by Auxiliary
    I've installed a Wordpress site on a host (actually I moved it from localhost). The site loads and works perfectly, however if I log in with Admin, the admin page doesn't completely load and all further requests to the server are blocked for about 10 or 15 minutes. I can't even ping to the site. Where could this problem be from? Is it firewall related or...? Any help or idea is greatly appreciated.

    Read the article

  • Google Analytics unexplained spike

    - by Dianne
    My client's Google Analytics has shown a spike every day since May 6th (from 0 to 100). It is in a city that he is not optimized for and does very little business in, and the hits are coming in direct to the website. My client is concerned that it has something to do with competitors using his site as a price-shopping device. I can't view the IP to see where the visits are coming from, and his site is not built in PHP, so the usual workaround doesn't apply here. Any thoughts? Could it be a "referring site" situation, and if so, is there a way for me to find out what the referring site is?

    Read the article
