Search Results

  • FTP connects but files aren't visible when browsing

    - by YsoL8
    Hello. If this should be on that other site, please don't shoot me, as I can't remember its name or URL. I have an FTP account in Dreamweaver that connects to the remote site and appears to upload files as normal. But when I browse to the location I can't see any new files or changes to the index page (I've uploaded index.php and connect.php); I'm getting a 404 page. I suspect the host directory is wrong, but looking at the file tree I can't see the folder I'm supposed to be using, so I'm uploading to the apparent site root. Any guidance on this?

    Read the article

  • Hide directory contents from showing when accessing the URL directly

    - by SoLoGHoST
    On my site, if you browse to http://example.com/images/ the contents of the entire directory are listed. How can I stop this from showing up when people browse directly to http://example.com/images/? Can I create an .htaccess file in that directory? Or is there a better way? I really don't want people being able to do this for any directory on the site. What can I do to prevent it? I figure it's either something that has to be done in Apache or a global .htaccess file placed in the public_html folder, perhaps? EDIT: I diverted this using an index.php file, but I still feel that security is an issue here. How can I fix this permanently?
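
    One common approach, as a minimal sketch: a single .htaccess file in public_html switches off automatic directory listings for the whole site (this assumes the server's AllowOverride setting permits Options overrides):

        # public_html/.htaccess
        # Turn off mod_autoindex listings for this directory and everything below it
        Options -Indexes

    With listings disabled, Apache returns 403 Forbidden for a bare directory URL instead of enumerating its contents, so no per-directory index.php stubs are needed.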

    Read the article

  • Can too many 301 redirects cause a DNS error?

    - by Graham
    For a site http://imageocd.com that I just set up, I initially spelled the category "automobiles" as "autimobiles"... I know it's ridiculous. I then set up over 10,000 pages behind that category, e.g. http://imageocd.com/automobiles/hillman-minx-cabrio-pictures-and-wallpapers. So, I set up over 10,000 301 URL redirects to correct the spelling of "automobiles". I just checked my Google Webmasters report and got an error saying: "http://www.imageocd.com/: Googlebot can't access your site. Sep 7, 2012. Over the last 24 hours, Googlebot encountered 2 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 66.7%." Could the overabundance of 301 redirects be causing this? I host 13 sites on this dedicated server and all sites are running fine. I also contacted GoDaddy and they said the server is running fine. Any ideas on what might be going on? Also, I have "canonical" set up for every URL. Could this be part of the error? Thanks.
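
    As an aside, 10,000 individual redirect entries are rarely necessary when the URLs differ only by a fixed prefix. A minimal mod_rewrite sketch (assuming Apache and an .htaccess file at the site root) collapses them into one pattern rule:

        RewriteEngine On
        # Send every page under the misspelled category to its corrected URL
        RewriteRule ^autimobiles/(.*)$ /automobiles/$1 [R=301,L]

    Note, though, that DNS resolution happens before any redirect is served, so a redirect backlog is an unlikely cause of DNS errors; the 66.7% DNS failure rate points at the name servers themselves.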

    Read the article

  • Location-Based redirection and duplication in sub-directories affecting SEO

    - by Joshua
    I currently own the website www.xyz.com. The website has a sub-directory for each of the 3 target countries: .../en-US/ (United States), .../es-MX/ (Mexico), and .../es-DO/ (Dominican Republic). I have two main questions about this setup: 1. Currently, the main domain/root (xyz.com) contains a blank index.php file, but I would like a user to be redirected to one of the sub-directories based on their regional location. What is the best way to accomplish this? I have looked at browser-language-based redirection, but how would I know whether to direct a user to the MX or DO site if the browser language is set to Spanish? Is there a way to detect a user's geographic location? 2. Also, the 3 websites are practically identical, except that they have 3 unique color schemes and the US site is in English while the MX and DO sites are in Spanish. My problem is that I believe Googlebot is penalizing/banning my site because the Spanish text on the MX and DO pages is nearly identical and is thus marked as duplicate/spam. Is there a way to avoid this?
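
    Browser language cannot distinguish Mexico from the Dominican Republic, but a GeoIP lookup on the visitor's address can. A sketch of the root index.php, assuming the PECL geoip extension is installed (a downloadable GeoIP database or lookup service works the same way). For the duplicate-content concern, rel="alternate" hreflang annotations (es-MX vs. es-DO) are the standard way to tell Google the pages are regional variants rather than spam.

        <?php
        // index.php -- sketch only; assumes the PECL geoip extension.
        $country = @geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);

        $targets = array(
            'US' => '/en-US/',
            'MX' => '/es-MX/',
            'DO' => '/es-DO/',
        );

        // Fall back to the US site when the lookup fails or the country is unmapped.
        $path = isset($targets[$country]) ? $targets[$country] : '/en-US/';

        // Use 302, not 301: the destination depends on who is asking.
        header('Location: ' . $path, true, 302);
        exit;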

    Read the article

  • Remote Desktop Problem on Windows Server 2008 R2

    - by lukiffer
    Revised this question to be more concise, consolidating several revisions.
    Symptoms, from a domain-member Windows 7 client:
    - Domain credentials to a domain controller = success
    - Domain credentials to a member server (by hostname or FQDN) = success
    - Domain credentials to a member server (by IP) = fail
    - Local credentials to a member server (by either) = success
    From a non-domain-member Windows 7 client:
    - Domain credentials to a domain controller = success
    - Domain credentials to a member server = fail
    - Local credentials to a member server = success
    (Identical behavior from a Mac RDC 2.1 client.)
    Server configuration details:
    - Windows 2008 R2 Datacenter w/ SP1
    - The domain in question is a subdomain of a Windows 2008 domain (forest root). Root has DCs in both Site A and Site B; the subdomain only has DCs in Site B.
    - RDP is operating normally on all root member servers and DCs.
    - No Remote Desktop settings are defined by GPOs.
    - Network Level Authentication is enabled; all clients are compatible and the certificate exchange/SSL handshake completes successfully.
    - Not catching any errors in the netlogon log.

    Read the article

  • Where can I find ad networks with single-liner ads?

    - by MaX
    I've developed a site that serves pure-HTML weather widgets (and they are great looking too). Just two months in, it is generating 1.25K hits monthly (Google Analytics). Now I want to generate some money from it. I am looking for an affiliate or ad service I can hook up with, but there is a twist: I want a single-liner text ad in a particular location, otherwise the widgets will look rubbish. I also have some unique places on the site for banner ads. Here are the services I've already tried:
    - AdSense: doesn't allow or have such formats.
    - Peefly: provides straight links and works best, but I recorded some clicks (through Google Events) and they didn't show me any; it also introduces the overhead of manually choosing your links.
    - BidVertise: totally rubbish; opens popups and whatnot, makes the site look like spam.
    I am new to this ad stuff, so I have limited knowledge. Suggestions, please? I have one more place in the forecast widget, but I want to start simple. P.S. I also have a MetroUI-like widget in the pipeline, but it's not ready yet.

    Read the article

  • Sharepoint 2010 can't find domain users when granting permissions

    - by quani
    I'm trying to grant permissions to other people to view a SharePoint site, but when granting permissions it uses "Check Names" and claims that any user or group that is part of a domain does not exist. It does this whether I try granting permissions on the team site or in Central Administration. BUT if I try to add someone to Farm Administrators in Central Administration, then all of a sudden it can find all domain users. Why is it finding domain users in that one context but not others? It is supposed to be using NTLM authentication and has Windows configured as the authentication provider (and IIS is configured to use NTLM). What's even stranger is that I enabled Anonymous Access for the team site, which I thought would allow anyone to view it, but others say they can't access it.

    Read the article

  • Is rsync corrupting my RAR?

    - by Mark Henderson
    We have two QNAP devices - one in our datacentre and one off-site. We have hundreds of password-protected RAR files stored on the QNAP that contain virtual machine image snapshots, with approx 20 of them being created each day. We synchronise the two devices using rsync, and it looks like all the files are being rsynced OK - they come over with the same file size, and all the files are present and accounted for. However, when I try to open the RAR files on the remote site, I get: Cannot open \\qnap01\FromDatacentre\Snapshots\DB001SQL1-20110626.rar. I can open the RAR files on the local site just fine, so I assume that something is getting mangled during the rsync procedure. However, the older files (pre 2011-06-20) work just fine; it's something that has only started happening in the last week. There haven't been (as far as I know) any changes to any of the devices, setup or configuration in that time. Obviously something has changed, though. Where should I start investigating?
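
    A starting point, sketched under the assumption of a standard rsync-over-SSH job (paths and hostnames below are examples): force a content comparison rather than rsync's default size-and-mtime check, and checksum one suspect file on both ends to confirm whether the bytes actually differ.

        # Re-copy anything whose content differs, even when size and mtime match
        rsync -av --checksum /share/Snapshots/ admin@qnap02:/share/FromDatacentre/Snapshots/

        # Run on both devices and compare the output; a mismatch proves the copy is corrupt
        md5sum DB001SQL1-20110626.rar

    If the checksums match yet the archive still won't open remotely, the problem is more likely the SMB path or the RAR tool on that side than rsync itself.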

    Read the article

  • How does 301 redirection work across the network, and should I use it if there is a chance we may need to change the resource back to the original URL?

    - by Faust
    I've built a CMS that makes it fairly easy for my client to relocate pages in their site hierarchy. The site has all human-readable, intuitive URLs, so moving a page necessarily means that its URL changes. I am storing records of each resource's past URLs in the data store so that requests for bygone URLs are re-routed to their appropriate successors. I'm warning my clients not to re-arrange the site willy-nilly (for numerous reasons), but nevertheless I suspect there's a chance page moves could get reversed from time to time. So I'm trying to figure out whether 301, 302 or 307 redirects should be used when serving up pages for out-of-date URLs. I understand the value of using 301 for search engine optimization, but my concern is that this system might inadvertently make some pages unavailable to some users. Questions: If the clients move a page at location/URL A to a new location B, users get the redirect from A to B, and then the clients move the page back to A again, how long can I expect any of those users to keep having their requests for A redirected to B - in this case sending them to my friendly 404 page? Is it until an item in their browser history is cleared? Is the redirect somehow cached in routers throughout the internet? How does this work? How long can I expect the 301 redirect to linger out there?
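
    For what it's worth, 301 responses are cached by the browser (and by any caching proxies along the path), not by routers, and a browser may hold a 301 indefinitely unless the response carries Cache-Control headers saying otherwise. When a move is potentially reversible, a temporary redirect sidesteps the problem entirely; a minimal PHP sketch with a placeholder URL:

        <?php
        // 302 tells clients "don't remember this" -- safe for moves that may be undone.
        // Reserve 301 for renames that are final.
        header('Location: http://example.com/new-location/', true, 302);
        exit;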

    Read the article

  • Linux Foxboard network monitor

    - by het.oosten
    I want to use a Foxboard as a simple network monitor for multiple routers (all routers are connected to the internet). The Foxboard is a mini PC with an embedded version of Debian. My idea is to use multiple virtual network devices, like this:
    eth0   192.168.2.10
    eth0:1 192.168.3.10
    eth0:2 192.168.4.10
    I found a nice Python script to ping an external host here (the solution from Ryan Cox): http://stackoverflow.com/questions/316866/ping-a-site-in-python Is it possible to configure Debian to use eth0 when I ping www.site-a.com and eth0:1 when I ping www.site-b.com?
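
    One way to sketch it, assuming the alias addresses above: iputils ping on Linux lets -I take either an interface name or a source address, and passing the alias's address selects the route you want.

        import subprocess

        # Map each target to the source address of the alias that should probe it
        ROUTES = {
            'www.site-a.com': '192.168.2.10',  # eth0
            'www.site-b.com': '192.168.3.10',  # eth0:1
        }

        def ping(host, source):
            """Send three pings from the given source address; True if the host answered."""
            return subprocess.call(
                ['ping', '-c', '3', '-I', source, host],
                stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) == 0

        for host, source in ROUTES.items():
            print(host, 'up' if ping(host, source) else 'DOWN')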

    Read the article

  • How to edit aspx, cshtml and other kinds of files live on an FTP server?

    - by Anirudha
    Originally posted on: http://geekswithblogs.net/anirugu/archive/2013/06/27/how-to-edit-aspx-cshtml-and-other-kind-of-files.aspx Many times we just want to make a small change on a site and we don't want to download the whole project again. In this post I will show you some good ways to do it. People who have Expression Web 4 can do it; I tried it and it works well with aspx files. If you have a site in ASP.NET using the aspx engine, this is a good option. Well, Expression Web is now free (previously paid software). Another good option is Komodo Edit. You can use Komodo Edit and a few plugins to make FTP editing work for you. The problem with these two apps is that they lack syntax highlighting and support for the CSHTML files introduced with MVC 3. For those I suggest you go with WebMatrix. You can use WebMatrix to edit cshtml files online. Remember that WebMatrix doesn't support compiling MVC projects; you need at least Visual Web Developer Express to compile your project. If you are in a hurry, try https://c9.io/ : put in your FTP settings and you are ready to make changes on the live site. If you have anything else in mind, share it here.

    Read the article

  • How do news applications work internally? [on hold]

    - by Vijay
    How do news applications work, other than RSS-feed-based applications? I know some of them take the RSS content from the source site, but sometimes I see those applications show a title, description, date, image, video, etc., even though when I look at the original site's RSS, the image and video are not in the feed. So how does one get that content to show in their application? Some applications even show feeds from magazine sites and newspaper sites. How do these applications work? I am creating an application which will link to different news sites' feeds, categorized (like top news, technology, games, articles, etc.). On the front page it will show the website names; then, on selection of any news site, it will fetch the feed from that website and show it to the user. So I would like to know:
    - Should all data be fetched on user selection, or should data be prefetched?
    - How do I fetch the detailed information from the original item, like that provided in the RSS data? How should I go about it?
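
    The usual trick is that images and video ride along in RSS extensions (media:content, enclosure) rather than the core fields, and when the feed has neither, apps fetch the article page itself and read its Open Graph meta tags (og:image). A small sketch using the third-party Python feedparser library, with a placeholder feed URL:

        import feedparser  # pip install feedparser

        feed = feedparser.parse('http://example.com/rss.xml')

        for entry in feed.entries:
            title = entry.get('title')
            summary = entry.get('summary')      # feedparser's name for <description>
            published = entry.get('published')
            # media:content items, if the feed provides them, each carry a 'url'
            media_urls = [m.get('url') for m in entry.get('media_content', [])]
            print(title, published, media_urls)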

    Read the article

  • Securing internal data accessed by a website on the big, bad internet

    - by aehiilrs
    A close relative of this question on Stack Overflow: When you have a web site in your DMZ that needs to access production data stored on an internal DB, what strategies do you recommend using to lower the risks that come from accessing live data? Is it even considered acceptable to have a connection initiated from the DMZ come inside of your network? An extra detail about the nature of the site that kind of throws a monkey wrench into the machinery is that people using the web site will be competing for "spots" on a first-come, first-serve basis with others using the internal software. Because of this, as close to zero lag time between the two applications as possible is ideal.

    Read the article

  • Slow domain lookup on a .dev domain inside VMware

    - by skelle
    I'm developing on my MacBook, and normally I have a locally running webserver which just works fine. Now I have to use a VMware image where the webserver is running. I set up everything and my dev site is running under site.dev inside VMware. I can connect to the webserver, but EVERY request takes a very long time. I have already read that this is related to IPv6 and the way OS X handles /etc/hosts. There I added 192.168.155.42 site.dev, and I already did this (Resolving to virtual host very slow on Mac OS X Lion), but my lookup still takes ~30 seconds on every request. What can I do to fix this issue?

    Read the article

  • How do spambots work?

    - by rlb.usa
    I have a forum that's getting hit a lot by forum spambots, and of course the best way to defeat something is to know thy enemy. I'll worry about defeating those spambots later; right now I'd like to know more about them. Reading around, I was surprised by the lack of thorough information on the subject (or perhaps by my ineptness at entering the correct search terms for better Google results). I'm interested in learning all about spambots. I've asked on other forums and gotten brush-off answers like "Spambots are always users registering on your site."
    - How do forum spambots work? How do they find the 'new user registration' page? (I'm especially surprised because some forums don't have a dedicated URL for this, e.g. www.forum.com/register.html, but instead use query strings or even other methods invisible to the URL bar.)
    - How do they know what to enter into each 'new user registration' field? How do they determine what is a page they can spam / enter data into and what is not?
    - Do they even 'view' this page at all? If not, then I'd assume they're communicating with the server directly - how is this possible? How do they do it?
    - Can forum spambots break CAPTCHAs? Can they solve logic questions (how?)? Math questions? Do they reverse-engineer client-side anti-bot validation scripts? Server-side scripts?
    - What techniques are still valid to prevent them?
    - Where do spambots come from? Is someone sitting behind the computer snickering as they watch their bot destroy site after site? Or are they snickering as they simply 'release' it onto the internet somehow? Are spambots 'run' by an infected computer somewhere? Do they replicate themselves? etc.
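
    On the "do they even view the page" point: mostly no. A basic bot parses the registration form's HTML once (or ships with presets for known forum software), then replays the POST request directly, which is why it never runs client-side JavaScript checks. A sketch of that replay in Python; the URL and field names are hypothetical:

        import requests  # pip install requests

        # A spambot never renders the page in a browser -- it just submits
        # the form fields it scraped (or guessed) straight to the endpoint.
        resp = requests.post(
            'http://example.com/register.php',
            data={
                'username': 'cheap-watches-2012',
                'email': 'bot@example.net',
                'password': 'hunter2',
                'homepage': 'http://spam.example.org',  # the payload: a backlink
            },
        )
        print(resp.status_code)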

    Read the article

  • Jail user to home directory while still allowing permission to create and delete files/folders

    - by Sevenupcan
    I'm trying to give a client SFTP access to the root directory of their site on my server (Ubuntu 10.10) so they can manage the website themselves. While I have been successful in jailing a user to a directory and giving them SFTP access, they are only allowed to create and delete files in sub-directories (the directories they own), which means I must jail them to the parent directory of their site root. How can I limit them to the root of their site (for example public_html) while still allowing them the ability to create and delete files there? All the tutorials I have read say that root must own the user's home directory, which prevents the user from writing inside it. I'm relatively new to managing my own server, so any advice would be gratefully received. Many thanks.
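
    A common pattern, sketched with hypothetical names: OpenSSH requires the ChrootDirectory itself to be root-owned and not group-writable, so the writable site root lives one level down inside the jail.

        # /etc/ssh/sshd_config
        Match User clientuser
            ChrootDirectory /home/clientuser
            ForceCommand internal-sftp
            AllowTcpForwarding no

        # Shell setup: the jail stays root-owned, the site root inside it is theirs
        #   chown root:root /home/clientuser && chmod 755 /home/clientuser
        #   mkdir -p /home/clientuser/public_html
        #   chown clientuser:clientuser /home/clientuser/public_html

    The user lands at the jail's top level, which is read-only to them, but can create and delete freely inside public_html.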

    Read the article

  • Stop Apache from asking for the SSL passphrase on each restart

    - by acidzombie24
    Using instructions from this site, but varying them just a little, I created a CA using -newca, copied cacert.pem to my computer and imported it as a trusted issuer in IE. I then did -newreq and -sign (note: I run /full/path/CA.sh -cmd and not sh CA.sh -cmd) and moved the cert and key to Apache. I visited the site in IE and from .NET code, and it appears trusted, great (unless I write www. in front, which is expected). But every time I restart Apache I need to type in my passphrase for the site(s?). How can I make it so that I do NOT need to type in the passphrase?
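
    The prompt comes from the private key being encrypted with a passphrase. The usual fix, sketched with example filenames, is to keep an unencrypted copy for Apache and protect it with filesystem permissions (Apache's SSLPassPhraseDialog directive can alternatively supply the passphrase from a program):

        # Write a passphrase-free copy of the key and lock it down
        openssl rsa -in server.key -out server.key.insecure
        chmod 600 server.key.insecure

        # Then point the vhost at it:
        #   SSLCertificateKeyFile /etc/apache2/ssl/server.key.insecure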

    Read the article

  • Default document not working after installing SP1 on Windows 2008 R2 x64

    - by boredgeek
    We have a web site that should only be available to authorized users, so we deny anonymous access to the site. However, we do allow anonymous access to the default page and the login page. When we installed SP1 the behavior of the server changed: now, if a user tries to access the root of the site, say http://mysite.com, she is redirected to the login page rather than the default page. Is there a hotfix to bring back the previous behavior?
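
    One avenue worth testing while looking for a hotfix: express the anonymous carve-out as explicit ASP.NET location overrides for the default and login pages, since after the change the authorization check can run against the rewritten URL rather than the bare root. A hedged web.config sketch with example page names:

        <configuration>
          <system.web>
            <authorization>
              <deny users="?" />   <!-- site-wide: no anonymous users -->
            </authorization>
          </system.web>
          <location path="default.aspx">
            <system.web>
              <authorization>
                <allow users="*" />   <!-- but everyone may load the default page -->
              </authorization>
            </system.web>
          </location>
          <location path="login.aspx">
            <system.web>
              <authorization>
                <allow users="*" />
              </authorization>
            </system.web>
          </location>
        </configuration>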

    Read the article

  • How to auto-update a website mirror with exceptions to certain pages?

    - by tomatosalad
    I'm currently mirroring a website on my server. The site itself is rarely updated, but it is updated often enough that info can become outdated quickly. I mirrored it first with wget, and this worked fine, but I made some changes:
    - The original index.html used frames, but the site also provides a main.html, which is essentially index.html without frames. I deleted index.html and renamed main.html in its place.
    - I did not want to mirror the webchat, blog or forum, so I deleted those files and directories, made directories "blogs", "forum" and "chat", and placed a PHP redirect in each of them, redirecting visitors to the original site.
    I'd like to auto-update the mirror (maybe once every 24-72 hours) but preserve the changes I made. Is this possible? How would I go about doing it? I am completely clueless as to how. Thanks for any and all help! :)
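
    One workable shape for this, sketched with placeholder URLs and paths: a small script that cron can run, telling wget to skip the directories that were replaced with redirect stubs and re-applying the index.html swap after every pull.

        #!/bin/sh
        # mirror.sh -- refresh the mirror, skipping the locally replaced sections
        wget --mirror --no-parent --convert-links \
             -X /chat,/blog,/forum \
             -P /var/www/mirror http://example.com/

        # Re-apply the local change: the frameless main.html serves as the index
        cp /var/www/mirror/example.com/main.html \
           /var/www/mirror/example.com/index.html

    A crontab entry such as 0 4 */2 * * /usr/local/bin/mirror.sh would refresh it every other night; because the stub directories are excluded from the download, the local PHP redirects survive each run.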

    Read the article

  • Serve up syntactic XHTML5 using the text/html MIME type?

    - by cboettig
    I have a site currently written with HTML5 tags. I'd like to be able to parse the site as XML, with support for namespaces, etc., to facilitate programmatic extraction of data. Currently I have <!DOCTYPE html> and <meta charset="utf-8">, which I gather is equivalent in HTML5 to explicitly setting the content type as <meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> for my current setup. In order to serve XML, it sounds like the right thing to do is <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"> <html xmlns="http://www.w3.org/1999/xhtml">. Should I also change my content type to <meta http-equiv="content-type" content="application/xhtml+xml; charset=iso-8859-1" />? Or is that not necessary? What is the advantage of having the content type be "application/xhtml+xml"? What is the disadvantage? (It sounds like it may break Internet Explorer's rendering of the site, but maybe that information is out of date now?) Many thanks!
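
    One point worth making concrete: browsers key XML parsing off the real HTTP Content-Type header, not the <meta> tag, so the switch has to happen server-side. A hedged PHP sketch that only sends application/xhtml+xml to clients that advertise support for it (older Internet Explorer versions, which don't, would otherwise offer the page as a download):

        <?php
        // The meta tag is only a fallback; the HTTP header is authoritative.
        $accept = isset($_SERVER['HTTP_ACCEPT']) ? $_SERVER['HTTP_ACCEPT'] : '';
        if (strpos($accept, 'application/xhtml+xml') !== false) {
            header('Content-Type: application/xhtml+xml; charset=utf-8');
        } else {
            header('Content-Type: text/html; charset=utf-8');
        }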

    Read the article

  • Issues With IIS Hosting Two Domains From Same Folder

    - by Bob Mc
    I have two different domain names that resolve to the same ASP.Net site. Both domains are hosted on the same server, which runs Windows Server 2003 and IIS6. The sites are differentiated in IIS Manager using host headers. However, both of the sites point to the same folder on the local drive for the site's page files. I am occasionally experiencing an ASP.Net error that says "The state information is invalid for this page and might be corrupted." I'm the site developer so I've addressed all the relevant code-related causes for this issue. However, I was wondering whether having two domains/sites sharing the same folder for an ASP.Net application might be causing this intermittent error. Also, is this generally a bad practice? Should I make separate, duplicate folders for each of the domains? Seems like that can become a maintenance headache.
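
    One cause worth ruling out for "the state information is invalid": if the two host headers are configured as two IIS applications over the same folder, each application auto-generates its own view-state keys, and a form rendered by one application but posted back through the other fails validation. Pinning a shared machineKey in web.config removes that variable; the key values below are placeholders, so generate your own random ones:

        <system.web>
          <machineKey
            validationKey="PLACEHOLDER-128-HEX-CHARS"
            decryptionKey="PLACEHOLDER-48-HEX-CHARS"
            validation="SHA1"
            decryption="AES" />
        </system.web>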

    Read the article

  • PHP hosting some info required [closed]

    - by mtk
    I have recently been given control of a newly bought hosting space and the domain account. There is a technical team from the hosting site to help out with problems, but that is a long process: log a ticket, wait a long time, and I don't get the correct answer on the first try. I was wondering if anyone has a helpful guide on how one should go about hosting a site. Any info that must be known w.r.t. cPanel? Any other useful stuff anyone has, or could point me to? Just to give a few of the difficulties:
    - The same PHP code that works well on the local machine gives the error "File not found" on the remote host. The file is indeed present, as I have FTPed all the files correctly.
    - session_start() errors are output to the HTML page with the warning "headers already sent".
    - Many more technical things that work well locally but not on the actual hosting server.
    So, if anyone has any helpful material in this regard, as to what changes are required or what a programmer must be aware of from a hosting perspective, please let me know. Note: I am hosting a PHP site with a MySQL DB in a shared environment.
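
    On the "headers already sent" point specifically: session_start() has to run before any byte of output, and a classic local-vs-host difference is that local setups often have output_buffering enabled in php.ini, which hides the mistake, while shared hosts may not. A minimal sketch:

        <?php
        // Nothing may precede this opening tag -- not even a blank line or a
        // stray space -- otherwise PHP has already begun the response body and
        // the session cookie header can no longer be sent.
        session_start();
        ?>
        <!DOCTYPE html>
        <html><body>Session id: <?php echo htmlspecialchars(session_id()); ?></body></html>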

    Read the article

  • Slow self-hosted WordPress website

    - by Integrati Marketing
    Hi all, we have a great site which had been humming along nicely for about 5 months, and then in May it went from a page load time of 3-5 seconds to an agonising 15+ seconds! The host has been really helpful and has even shifted the site to a new, faster server. Since we don't have the insight or expertise ourselves, we thought we would ask the Serverfault community and see what this crowd of experts could recommend. We appreciate any insight, thank you. The site is here: integrati.com.au Cheers. :)

    Read the article

  • How do I reduce RAM usage on my server?

    - by Abs
    I have recently launched a site that is very popular, but I am having trouble with scalability. My site makes heavy use of FFmpeg, and at peak times RAM usage quickly hits the 2 GB mark and the swap file starts getting used. CPU usage rises too. Users complain that the site is slow, because all the FFmpeg instances run very slowly when so many are running at the same time. Users make use of FFmpeg on my server in real time. Is there anything I can do to ease the server load and stop RAM usage from shooting up? Maybe there is something better than FFmpeg(!). Is the only solution "throwing some cash" at a more powerful server? I have given little information; please ask for more so this problem can be solved.
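
    Before buying hardware, it may be worth capping how many FFmpeg processes can run at once, so extra requests queue instead of swamping RAM. A sketch in Python with illustrative file names; each job is one FFmpeg invocation:

        import subprocess
        from concurrent.futures import ThreadPoolExecutor

        MAX_CONCURRENT = 2  # tune to the cores and RAM the box actually has

        def transcode(args):
            """Run one ffmpeg job; jobs beyond the cap wait in the pool's queue."""
            return subprocess.call(['ffmpeg', '-y', '-i'] + args)

        jobs = [
            ['input1.avi', 'output1.mp4'],
            ['input2.avi', 'output2.mp4'],
            ['input3.avi', 'output3.mp4'],
        ]

        with ThreadPoolExecutor(max_workers=MAX_CONCURRENT) as pool:
            exit_codes = list(pool.map(transcode, jobs))
        print(exit_codes)  # 0 means that conversion succeeded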

    Read the article
