Search Results

Search found 29495 results on 1180 pages for 'cross site scripting'.


  • Secure Your Wireless Router: 8 Things You Can Do Right Now

    - by Chris Hoffman
    A security researcher recently discovered a backdoor in many D-Link routers, allowing anyone to access the router without knowing the username or password. This isn’t the first router security issue and won’t be the last. To protect yourself, you should ensure that your router is configured securely. This is about more than just enabling Wi-Fi encryption and not hosting an open Wi-Fi network.

    Disable Remote Access
    Routers offer a web interface, allowing you to configure them through a browser. The router runs a web server and makes this web page available when you’re on the router’s local network. However, most routers offer a “remote access” feature that allows you to access this web interface from anywhere in the world. Even if you set a username and password, if you have a D-Link router affected by this vulnerability, anyone would be able to log in without any credentials. If you have remote access disabled, you’d be safe from people remotely accessing your router and tampering with it. To do this, open your router’s web interface and look for the “Remote Access,” “Remote Administration,” or “Remote Management” feature. Ensure it’s disabled — it should be disabled by default on most routers, but it’s good to check.

    Update the Firmware
    Like our operating systems, web browsers, and every other piece of software we use, router software isn’t perfect. The router’s firmware — essentially the software running on the router — may have security flaws. Router manufacturers may release firmware updates that fix such security holes, although they quickly discontinue support for most routers and move on to the next models. Unfortunately, most routers don’t have an auto-update feature like Windows and our web browsers do — you have to check your router manufacturer’s website for a firmware update and install it manually via the router’s web interface. Check to be sure your router has the latest available firmware installed.

    Change Default Login Credentials
    Many routers have default login credentials that are fairly obvious, such as the password “admin”. If someone gained access to your router’s web interface through some sort of vulnerability or just by logging onto your Wi-Fi network, it would be easy to log in and tamper with the router’s settings. To avoid this, change the router’s password to a non-default password that an attacker couldn’t easily guess. Some routers even allow you to change the username you use to log into your router.

    Lock Down Wi-Fi Access
    If someone gains access to your Wi-Fi network, they could attempt to tamper with your router — or just do other bad things like snoop on your local file shares or use your connection to download copyrighted content and get you in trouble. Running an open Wi-Fi network can be dangerous. To prevent this, ensure your router’s Wi-Fi is secure. This is pretty simple: Set it to use WPA2 encryption and use a reasonably secure passphrase. Don’t use the weaker WEP encryption or set an obvious passphrase like “password”.

    Disable UPnP
    A variety of UPnP flaws have been found in consumer routers. Tens of millions of consumer routers respond to UPnP requests from the Internet, allowing attackers on the Internet to remotely configure your router. Flash applets in your browser could use UPnP to open ports, making your computer more vulnerable. UPnP is fairly insecure for a variety of reasons. To avoid UPnP-based problems, disable UPnP on your router via its web interface. If you use software that needs ports forwarded — such as a BitTorrent client, game server, or communications program — you’ll have to forward ports on your router without relying on UPnP.

    Log Out of the Router’s Web Interface When You’re Done Configuring It
    Cross-site scripting (XSS) flaws have been found in some routers. A router with such an XSS flaw could be controlled by a malicious web page, allowing the web page to configure settings while you’re logged in. If your router is using its default username and password, it would be easy for the malicious web page to gain access. Even if you changed your router’s password, it would be theoretically possible for a website to use your logged-in session to access your router and modify its settings. To prevent this, just log out of your router when you’re done configuring it — if you can’t do that, you may want to clear your browser cookies. This isn’t something to be too paranoid about, but logging out of your router when you’re done using it is a quick and easy thing to do.

    Change the Router’s Local IP Address
    If you’re really paranoid, you may be able to change your router’s local IP address. For example, if its default address is 192.168.0.1, you could change it to 192.168.0.150. If the router itself were vulnerable and some sort of malicious script in your web browser attempted to exploit a cross-site scripting vulnerability, accessing known-vulnerable routers at their local IP address and tampering with them, the attack would fail. This step isn’t completely necessary, especially since it wouldn’t protect against local attackers — if someone were on your network or software was running on your PC, they’d be able to determine your router’s IP address and connect to it.

    Install Third-Party Firmware
    If you’re really worried about security, you could also install a third-party firmware such as DD-WRT or OpenWRT. You won’t find obscure backdoors added by the router’s manufacturer in these alternative firmwares. Consumer routers are shaping up to be a perfect storm of security problems — they’re not automatically updated with new security patches, they’re connected directly to the Internet, manufacturers quickly stop supporting them, and many consumer routers seem to be full of bad code that leads to UPnP exploits and easy-to-exploit backdoors. It’s smart to take some basic precautions.

    Image Credit: Nuscreen on Flickr

    Read the article

  • Mini keyboard has no home/end keys; how to type them?

    - by Steve Crane
    Some months ago I needed a small keyboard and bought an Okion KM229 without noticing that it has no Home or End key. This makes it tricky to type as I'm so used to using these keys. I haven't yet figured out if there is a key combination that issues Home and End keystrokes. Does anyone have experience of these keyboards and know how to issue those keystrokes? The keyboard is used on a PC running Windows XP. I have used the contact form on the Okion USA web site to ask this question but received no response. Wikipedia suggests that Home and End keystrokes are issued with Fn-Left and Fn-Right on some limited size keyboards. I may have tried those but don't remember for sure. I will try them again though when I am on site again in a week or so. Any other thoughts would be welcomed in the meantime.

    Read the article

  • Tell browsers to cache until last modified date changes?

    - by Chad Johnson
    My web site consists of static HTML files which are usually republished once per day, and sometimes more. I'm using Apache. In the vhost settings for my site, I'd like to tell browsers to cache HTML files indefinitely, until Apache sees that they are modified. So as soon as an HTML file is changed, Apache should immediately begin telling browsers it's changed and send the updated file. As soon as a new file is published, browsers should immediately begin receiving that...they should never receive old versions of files. Maybe ExpiresByType text/html modification and no "plus x days." Is something like this possible?
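    A minimal sketch of one way to do this in the vhost (it assumes mod_expires and mod_headers are enabled, and it works through conditional requests: browsers keep the file but revalidate with If-Modified-Since, and Apache answers 304 Not Modified until the file's modification time changes):

        <IfModule mod_expires.c>
            ExpiresActive On
            # expiry tied to the file's modification time, with nothing added
            ExpiresByType text/html "modification plus 0 seconds"
        </IfModule>
        <IfModule mod_headers.c>
            <FilesMatch "\.html$">
                # force revalidation on every request; 304 responses are cheap
                Header set Cache-Control "must-revalidate, max-age=0"
            </FilesMatch>
        </IfModule>

    Strictly speaking, HTTP has no way to push an invalidation to browsers, so "cache until modified" in practice means cheap conditional GETs rather than zero requests.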

    Read the article

  • VPN within a VM to allow for internet access on the host

    - by David Durrant
    I have a network connection (created under Network and Sharing) that I use to connect to a customer's site. But when I use this to connect to the site, I lose all access to the public internet and can only access customer-specific items. I want to circumvent this issue by creating a VM and then utilizing the VM to connect to the network location and interact within the customer's domain, while leaving my host machine open to the internet. I'm not extremely familiar with networking, but I have a few basic skills. Please let me know if this is possible and what the correct procedures are. I already have a VM created with VirtualBox, and both the host and guest are running Windows 7 x64. I have created duplicate VPNs already, but can only connect successfully on the host machine.

    Read the article

  • Hiding a column from a pivot table without removing it from the chart

    - by Simon
    I have a pivot table with two columns: number of users who visited a website (impressions) and number of users who registered on the site (regs). The rows are for dates. I want to visualize the percentage of users who registered after visiting the site. Thus, I have the number of users for each cell as a value field, displaying it as percentage of impressions. Generating a pivot chart from the table, impressions and regs are plotted over date as a percentage of impressions. This means there is one line at 100% for impressions (always 100% of itself) and the graph for registrations below that. I'd like to remove the line for impressions, but when I set a filter to do so, registrations vanish as well, since the column for impressions is filtered from the pivot chart as well, turning the value field invalid. How can I just show registrations as a percentage of impressions in the chart?

    Read the article

  • Configuring subdomains for a machine (Win2k8) in a lan

    - by RMS
    I am currently setting up a Windows 2008 server to host a website with multiple subdomains, all accessible only within the LAN. Also, there is no Active Directory. What I did is:
    1 - computer name: 'web'
    2 - in IIS, I added a site binding as 'site1.web' to the default web site
    3 - added the DNS role to the server
    4 - added 'web' as a primary zone in the forward lookup zones (default options)
    5 - added a CNAME 'site1'
    From a client machine, in the TCP/IP config I added the IP address of 'web' to the DNS list in addition to the ISP DNS (the client machine's IP is from DHCP). Now browsing to 'http://web' or 'http://site1.web' works correctly. My question is: is it possible, through additional steps on the server, to have the websites accessible without requiring the DNS config on all the client machines? Thanks in advance

    Read the article

  • Ditch cPanel / WHM in favour of manual setup

    - by BWRic
    We currently use cPanel / WHM on a reseller account but are looking at getting a dedicated server. My first thought was to duplicate this setup on the dedicated box to allow us to quickly create new accounts. It'll be a managed server, so they'll have set up the LAMP stack. I'm curious whether I actually need cPanel and WHM. We don't use many of the features from cPanel / WHM, just creating accounts and databases; clients do not have FTP access. I'm no sysadmin and come from a Windows / GUI background, but I have some knowledge of setting up development servers.
    WHM: Creating accounts. I presume this sets up the Apache virtual host, FTP access and DNS settings. I have some knowledge of editing the Apache files to create virtual hosts. Am I correct in thinking that as long as the DNS is pointing to the server IP and the virtual host is configured, the server can serve the (PHP) pages? I'm not sure I need per-site FTP access, as only we will have access, so I could have a single server-wide login to the web root to view all the sites. The company who supply the dedicated hosts would also provide their own DNS management tool, so I don't need the cPanel one.
    MySQL: Creating users and databases. We use cPanel to create the MySQL users and databases. As it's a dedicated box and I can have root access, I think this could be replaced by SQLyog for db management and phpMyAdmin for user management.
    Do I need cPanel, or can I get by editing a few text files for creating the accounts, then use the MySQL tools for databases? Or am I missing something major with how the sites are configured?
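    For the account-creation side, a minimal name-based virtual host is often all that is needed. A sketch (the domain, paths and log locations are purely illustrative, and it assumes the DNS A record already points at the server's IP):

        <VirtualHost *:80>
            ServerName clientsite.example
            ServerAlias www.clientsite.example
            DocumentRoot /var/www/clientsite.example/htdocs
            # per-site logs are just a convention, not a requirement
            ErrorLog /var/log/apache2/clientsite.example-error.log
            CustomLog /var/log/apache2/clientsite.example-access.log combined
        </VirtualHost>

    Databases and users can then be handled with the usual MySQL CREATE/GRANT statements or, as you suggest, a GUI tool such as SQLyog or phpMyAdmin.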

    Read the article

  • Browser/addon which shows what's been downloaded so far regardless of formatting and layout?

    - by Mick
    Every now and then a website becomes super-slow (but not broken) because there are too many people looking at it at the same time. When I try and view such a site, say with Firefox, I can see that it is downloading all sorts of components of the site because of the progress information printed at the bottom of the window and I'm sitting there thinking "If only the browser would show me what it's got so far. I don't care if its a jumbled mess, I just want to see what you've got". Does any browser offer such an option?

    Read the article

  • Rails with phusion passenger and wordpress

    - by Venu
    We had a site developed using Ruby on Rails. It had a website, web services for a mobile app, and an admin panel to manage data. We started using WordPress to manage site content. We have finished development and have to move to production now. This is the current virtual host code that makes WordPress work under the /wordpress URI:
        <Location /wordpress>
            PassengerEnabled off
            <IfModule mod_rewrite.c>
                RewriteEngine On
                RewriteBase /wordpress/
                RewriteCond %{REQUEST_FILENAME} !-f
                RewriteCond %{REQUEST_FILENAME} !-d
                RewriteRule . /wordpress/index.php [L]
            </IfModule>
        </Location>
    I want to make Phusion Passenger work for the /admin and /api URIs, and / to go to WordPress. Can we change the document root based on the URI? Or is there any other better solution?
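    One possible approach, sketched here only (it assumes Passenger's sub-URI deployment directives PassengerBaseURI and PassengerAppRoot, available in newer Passenger versions, and an illustrative Rails app path of /var/www/railsapp): keep DocumentRoot pointed at the WordPress tree and alias the Rails app's public directory under the /admin and /api prefixes.

        # document root stays on WordPress; only these prefixes hit Rails
        Alias /admin /var/www/railsapp/public
        Alias /api /var/www/railsapp/public
        <Location /admin>
            PassengerBaseURI /admin
            PassengerAppRoot /var/www/railsapp
        </Location>
        <Location /api>
            PassengerBaseURI /api
            PassengerAppRoot /var/www/railsapp
        </Location>
        <Directory /var/www/railsapp/public>
            Options -MultiViews
            # use "Require all granted" instead on Apache 2.4
            Allow from all
        </Directory>

    On older Passenger releases the equivalent setup used RailsBaseURI together with a symlink from the document root to the app's public directory.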

    Read the article

  • Any website/software where I can add some text rows on a daily basis

    - by Moorage
    I have a few notes, like a few lines worth remembering, on a daily or weekly basis. I want to write them down, but I should be able to look them up for any back date, monthly or yearly. The date and time should also be stored when I enter the text. Is this possible?
    EDIT: I will explain clearly what I want. Suppose that while working on the internet:
    1) I find "ABC is good for BCD". Now I want to add that text to some online site where I can see it later.
    2) I can add those kinds of things at any time, and on the internet I can see them in tabular form, like clicking to see a monthly, yearly or weekly list.
    3) The other thing is that I should be able to add text as easily as possible, like in Firefox, if I can press some shortcut and enter something and it gets added there, rather than opening the site and then writing.
    To-do lists do that sort of thing, but they don't have a creation date; rather, they filter based on due date.

    Read the article

  • How does Google Analytics aggregate the Count of Visits (Frequency & Recency Report)?

    - by Brian Dant
    Here's my simple understanding of Count of Visits: each person that comes to my site gets one "count" for each visit. They are put into a bucket of people with the same number of total counts -- if you visit twice, you are in the two bucket; if you visit six times, you are in the six bucket. From there, a report (Frequency & Recency) makes a line for each bucket, reaches into the bucket and totals the number of people in it, putting that total in the second column.
    My Question: Will a two-month report automatically put someone into two buckets, and put them on two separate lines in the Count of Visits table? This explanation makes it seem like a two-month-long report will put the same person into a bucket twice, one bucket for each month. The two-month report will then show that person's visits on two different lines, instead of aggregating them.
    Example for Clarification: Bob comes to my site three times in January and seven times in February. I run a report for Jan 1 -- Feb 28. Will Bob be on both the Three Count line and the Seven Count line, or will he be on the Ten Count line?

    Read the article

  • The best way to learn how to extend Orchard

    - by Bertrand Le Roy
    We do have tutorials on the Orchard site, but we can't cover all topics, and recently I've found myself more and more responding to forum questions by pointing people to an existing module that was solving a similar problem to the one the question was about. I really like this way of learning by example and from the expertise of others. This is one of the reasons why we decided that modules would by default come in source code form that we compile dynamically. It makes them easy to understand and easier to modify for your own purposes. Hackability FTW! But how do you crack open a module and look at what's inside? You can do it in two different ways. First, you can just install the module from the gallery, directly from your Orchard instance's admin panel. Once you've done that, you can just look into your Modules directory under the web site. There is now a subfolder with the name of the new module that contains a csproj that you can open in Visual Studio or add to your Orchard solution. Second, you can simply download the package (it's NuGet) and rename it to a .zip extension. NuGet being based on Zip, this will open just fine in Windows Explorer. What you want to dig into is the Content/Modules/[NameOfTheModule] folder, which is where the actual code is. Thanks to Jason Gaylord for the idea for this post.

    Read the article

  • FREE goodies if you are a UK based software house already live on the Windows Azure Platform

    - by Eric Nelson
    In the UK we have seen some fantastic take-up of the Windows Azure Platform, and we have lined up some great stuff in 2011 to help companies fully exploit the Cloud – but we need you to tell us what you are up to! Once you tell us about your plans around Windows Azure, you will get access to FREE benefits including email-based developer support and a free monthly allowance of Windows Azure, SQL Azure and AppFabric from Jan 2011 – and more! (This offer is referred to as Cloud Essentials and is explained here.) And… we will be able to plan the right amount of activity to continue to help early adopters through 2011.
    Step 1: Sign up your company to Microsoft Platform Ready (you will need a Windows Live ID to do this).
    Step 2: Add your applications. For each application, state your intention around Windows Azure (and SQL etc. if you so wish).
    Step 3: Verify your application works on the Windows Azure Platform.
    Step 4 (Optional): Test your application works on the Windows Azure Platform. Download the FREE test tool, test your application with it and upload the successful results.
    Step 5: Revisit the MPR site in early January to get details of Cloud Essentials and other benefits.
    P.S. You might want some background on the “fantastic take-up” bit:
    We helped over 3000 UK companies deploy test applications during the beta phase of Windows Azure.
    We directly trained over 1000 UK developers during 2010.
    We already have over 100 UK applications profiled on the Microsoft Platform Ready site.
    And in a recent survey of UK ISVs you all look pretty excited about the Cloud – 42% already offer their solution on the Cloud or plan to.

    Read the article

  • server host name and server ip address redirect (Debian, Apache)

    - by Matthias Reisner
    I have the following folder structure on my Apache server:
        .../var/www/www.x.tt/htdocs
        .../var/www/www.y.tt/htdocs
    I have defined a virtual host for each, so if I type www.x.tt into my browser I get to the www.x.tt site, and the same for www.y.tt. But now my question: if I type in the server's IP address or the server's host name, I end up in the ../var/www directory, but I want the user to be redirected to the www.x.tt site instead. Do I have to create a new virtual host for this, or is it also possible to just add a .htaccess rule? Thanks!
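    One way to handle this, as a sketch (the ServerName below is just a placeholder; any name no real site uses will do), is to declare a catch-all default virtual host ahead of the others and redirect it. With name-based virtual hosting, the first vhost defined acts as the default for requests, such as those by raw IP or bare hostname, that match no ServerName:

        <VirtualHost *:80>
            # catch-all default; must be listed before www.x.tt and www.y.tt
            ServerName catchall.localdomain
            Redirect permanent / http://www.x.tt/
        </VirtualHost>

    A .htaccess RewriteRule in /var/www could achieve much the same, but a small default vhost keeps the redirect in the server config rather than in the document tree.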

    Read the article

  • Rewrite rule to redirect all subpages to a single page?

    - by user784637
    I have two files, /etc/apache2/sites-available/foo and /etc/apache2/sites-available/foo_maintenance. The rewrite rule I use in /etc/apache2/sites-available/foo is
        <Directory /var/www/public_html>
            Options +FollowSymlinks
            RewriteOptions inherit
            RewriteEngine on
            # RewriteCond %{HTTP_HOST} ^mysite\.com [NC]
            RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        </Directory>
    so that all mysite.com/* requests redirect to www.mysite.com. After I take my site down for maintenance, if a user navigates to a subpage of the site like mysite.com/subdir/something.php, I would like to redirect them to www.mysite.com so the index.html of the maintenance page is displayed. What is the rewrite rule to redirect all traffic from any subpage to www.mysite.com?
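    For the maintenance vhost, a sketch of one possible rule set (it assumes the maintenance page is index.html in that vhost's document root; the extra condition keeps the rule from looping on the index itself):

        RewriteEngine on
        # let the maintenance index itself through
        RewriteCond %{REQUEST_URI} !^/index\.html$
        # send every other request to the maintenance front page
        RewriteRule ^.*$ http://www.mysite.com/ [R=302,L]

    Using a temporary 302 rather than a 301 avoids browsers caching the redirect after the site comes back up.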

    Read the article

  • Refactoring and Open / Closed principle

    - by Giorgio
    I have recently been reading a web site about clean code development (I do not put a link here because it is not in English). One of the principles advertised by this site is the Open/Closed Principle: each software component should be open for extension and closed for modification. E.g., when we have implemented and tested a class, we should only modify it to fix bugs or to add new functionality (e.g. new methods that do not influence the existing ones). The existing functionality and implementation should not be changed. I normally apply this principle by defining an interface I and a corresponding implementation class A. When class A has become stable (implemented and tested), I normally do not modify it too much (possibly, not at all), i.e.:
    If new requirements arrive (e.g. performance, or a totally new implementation of the interface) that require big changes to the code, I write a new implementation B, and keep using A as long as B is not mature. When B is mature, all that is needed is to change how I is instantiated.
    If the new requirements suggest a change to the interface as well, I define a new interface I' and a new implementation A'. So I and A are frozen and remain the implementation for the production system as long as I' and A' are not stable enough to replace them.
    So, in view of these observations, I was a bit surprised that the web page then suggested the use of complex refactorings, "... because it is not possible to write code directly in its final form." Isn't there a contradiction / conflict between enforcing the Open / Closed Principle and suggesting the use of complex refactorings as a best practice? Or is the idea here that one can use complex refactorings during the development of a class A, but when that class has been tested successfully it should be frozen?

    Read the article

  • How to write a blog for SEO purpose

    - by Mathieu Imbert
    I have a photo sharing website which provides very little textual content. Users can add tags to photos and a description, but it creates a lot of duplicate content, because most of the descriptions will be 'wow', 'lol', ... I don't think I should rely on users to build my SEO. I think it would be a great idea to write a blog, and use it to describe the best photos, start contests, explain themes, in short: create original content that search engines will love. Our website's main URL is like www.domain.com, and our new blog is hosted on blog.domain.com. From an SEO perspective, is it a good idea to keep the blog separate from the main site? This has the advantage of leaving the original site unchanged, but will it add any page rank to www.domain.com? If the blog ranks well it will obviously pass some page rank to the original site through links. What do you think is the best option from an SEO perspective? Include the blog in www.domain.com? Or leave it on blog.domain.com?

    Read the article

  • Change the white background in webpages to another color

    - by Bruce Connor
    I'm currently using a dark theme in Firefox. It looks really nice, but many webpages use a plain white background. The resulting contrast is a little unpleasant and sometimes hurts the eye when I switch from a dark tab to a white tab. Is there a way to make Firefox replace white backgrounds everywhere with some other color (light gray, for instance)? It could be a Stylish script, a userChrome.css hack, or anything that works (preferably as light as possible). To make myself clear: after I achieve my objective, the background color whenever I visit the Super User site should be light gray instead of white, and the same should happen to any other site with a white background (Google sites, TechCrunch, etc). Is there a way to do that?

    Read the article

  • Tackling thin content on an images gallery

    - by Ted Wilmont
    We run an images gallery as part of our site; however, we have over 8,000 images and every image has a separate HTML page of its own to display the image caption, related image and comments from users of the site. This seems to be a problem, especially with the Google Panda update, because these pages are technically "thin content". What would be the best way to tackle this? We'd love some feedback and advice regarding this scenario. We have a few options we thought of already but can't decide:
    We could noindex the separate image pages and lose any image search listings we have for the images, in favour of removing these thin pages from the index.
    We could 301 all of the individual image pages back to the image category listing, anchor each image (e.g. #img2122), and include all of the comments and descriptions on the category listing page itself.
    If we were to simply list all of the images and content on the category pages themselves, what's the best method? We could add all of the content in the anchor tags and use jQuery to display them in a box when a user clicks on the image, or we could use Ajax to retrieve the information. However, what's the best Ajax method for SEO?
    Any ideas, suggestions, tips or advice are greatly appreciated, and thank you in advance for any help given.

    Read the article

  • Would it be possible to create an open source software library, entirely developed and moderated by an open community?

    - by Steven Jeuris
    Call it democratic software development, or open source on steroids if you will. I'm not just talking about the possibility of providing a patch which can be approved by the library owner. Think more along the lines of how Stack Exchange works: anyone can post code, and through community moderation it is cleaned up and eventually valid code ends up in the final library. For complex libraries an elaborate system should probably be created, but for a simple library it is my belief this is already possible even within the Stack Exchange platform. Take a library of extension methods for .NET for example. Everybody goes their own way and implements their own subset of what they feel is important, open-source library or not. People want to share their code, but there is no suitable platform for it. extensionmethod.net is the result of answering this call for extension methods, but the framework hopelessly falls short; there is no order or structure at all. You don't know whether an idea is any good until you try it, so I decided to create an Extension Methods proposal on Area 51. I believe that with proper moderation, it could be possible for the site to be more than a Q&A site, and that an actual library (or subsets of it) could be extracted from it. Has anything like this been attempted before? Are there platforms better suited for this?

    Read the article

  • Can't ping self

    - by Paddy
    I have a wireless internet connection set up on my Mac (v10.5.6). I am connected to the internet and everything is running smoothly. I recently discovered a quirky behaviour while setting up the Apache web server. When I typed my dynamic IP (http://117.254.149.11/) into the web browser to visit my site pages, it just timed out. In Terminal I tried pinging localhost and it worked.
        $ ping localhost
        PING localhost (127.0.0.1): 56 data bytes
        64 bytes from 127.0.0.1: icmp_seq=0 ttl=64 time=0.063 ms
        64 bytes from 127.0.0.1: icmp_seq=1 ttl=64 time=0.056 ms
        64 bytes from 127.0.0.1: icmp_seq=2 ttl=64 time=0.044 ms
    But if I pinged my IP it would just time out.
        $ ping 117.254.149.11
        PING 117.254.149.11 (117.254.149.11): 56 data bytes
        ^C
        --- 117.254.149.11 ping statistics ---
        10 packets transmitted, 0 packets received, 100% packet loss
    Pinging any other site works though. I am completely stumped. Any help would be greatly appreciated.

    Read the article

  • Report from OpenWorld Shanghai

    - by jmorourke
    Oracle OpenWorld Shanghai 2013 was held July 22nd – 25th at the International Expo Center in Shanghai, China. The conference drew over 19,000 attendees from 44 countries. In addition, 580 CxOs attended the Executive Edge program, and 430+ partners attended the Oracle Partner Network Exchange. The conference included a number of sessions on Big Data, Business Analytics, Business Intelligence and Enterprise Performance Management delivered by Oracle, our partners and customers. I had the pleasure of attending the conference and delivering three sessions focused on Oracle’s Hyperion Enterprise Performance Management (EPM) applications. Each of my sessions was well attended, and in a few cases standing room only, so there is clearly a lot of interest in the China market in EPM. The EPM and BI demo pods in the DemoGrounds at the conference also received a lot of traffic. In addition to the conference sessions I delivered, I had several meetings with customers and partners in Shanghai. These sessions and meetings made clear the interest that customers in China have in improving their planning, management reporting, financial reporting, and profitability management processes. In fact, with the China Ministry of Finance now standardizing on XBRL for annual reporting across multiple agencies in China, there is a great opportunity here for our disclosure management application. One interesting finding is that the China market may not be ready for cloud-based applications, as many companies are state-owned and have security concerns, so on-premise applications are likely to see continued demand. For more information about the Oracle OpenWorld China 2013 conference, please check the web site: http://www.oracle.com/events/apac/cn/en/openworld/index.html
    And don’t forget, Oracle OpenWorld San Francisco 2013 is just around the corner in September of 2013. Please check the web site for registration and content information: http://www.oracle.com/openworld/index.html

    Read the article

  • Should I split my website into different servers

    - by Nyxynyx
    I have a website where a user uploads photos; the photos get resized and thumbnailed, and stored on the server. At the same time, there are some INSERTs into a MySQL table regarding the photo uploaded (like description, user id etc). The site currently runs off a managed VPS, and I love the support it provides. However, it is expensive to store the many small photos, and the resizing and thumbnailing processes do cause spikes in the app's performance. (Amazon S3 is pretty expensive, especially considering the costs for uploading many small files.)
    Question: Would it be a good idea to move the image processing operations and image storage to another server, an unmanaged dedicated server with a much lower cost/GB, and keep the current VPS for its 24/7 support and for hosting the webapp? Or should I move the entire site to the dedicated server?
    VPS specs: 16 cores 2.4GHz (E5620), 1GB memory, 60GB storage, 3.5TB transfer, $43/mth, managed (24/7)
    Dedicated specs: i3 2130, 2 cores 3.4+ GHz, 16 GB DDR3, 2 x 1TB SATA2 storage, 15 TB transfer, $79/mth, unmanaged (weekday support)
    Software used: Apache, PHP, MySQL, Solr, PostgreSQL, ImageMagick

    Read the article

  • foobar.com working, but www.foobar.com not working?

    - by dpmattingly
    I am setting up a web site for a client. She is using GoDaddy for domain registration, and a hosting company I have never used before. After setting up the nameservers on GoDaddy's side, the address foobar.com (for example) is correctly directing to the new site. However, the address www.foobar.com is redirecting to a 404 page on the hosting company's side. I've been dealing with customer service on the hosting side, and they have told me various things including wait for DNS propagation (which has obviously happened since the 404 page is on their side), and to make sure that the nameservers on GoDaddy's side were entered in lower case instead of upper case (which I know doesn't matter since nameservers are case insensitive). I think I'm getting the runaround from the hosting company, but the client had signed up with them before I came to the project, so if possible I'd like to resolve this issue with them before we start treating it as a loss. Does anybody know what could cause foobar.com to resolve correctly but www.foobar.com to not resolve? How would I best be able to suggest a fix to this through the technical support channels of a hosting company?

    Read the article
