Search Results

Search found 24735 results on 990 pages for 'site ranking'.

Page 127 of 990

  • PHP error logging - can I log all of the site's errors to a single file?

    - by mawg
    Hi, in php.ini I can set the directive "error_log", and it is currently set to "error_log", which means one file of that name in each directory. Do you know ... if I set it to "public_html/error_log" - will I get only one single error log file? Is there any other way to do so? I really just want a single site-wide error file to check, rather than one in each sub-directory. For bonus marks, can I send myself an email each time a new entry is added to the file? Left as an exercise for the reader - can I ignore some "errors" which aren't really errors?
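
    A minimal, hedged sketch of the settings involved - the log path is an example for a typical shared-hosting layout, not your actual one. An absolute path is what collapses the per-directory behaviour into one file:

        ; php.ini (or php_flag/php_value lines in a top-level .htaccess under mod_php)
        log_errors = On
        ; a bare filename means "relative to the current script's directory";
        ; an absolute path gives one site-wide log
        error_log = /home/youruser/logs/php_error.log

    For the email part, a small cron job that mails any lines appended since its last run is the usual low-tech answer; PHP's error_log($message, 1, $address) can also email individual messages, but only for errors you raise yourself.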

    Read the article

  • How can I make a boring project (another WordPress site) interesting?

    - by Christopher Altman
    WordPress is my example, but the question can be generalized to any technology that is not particularly interesting. To me, WordPress takes away the intellectually gratifying pieces of coding. I would rather write a new version of WordPress than write a WordPress theme and glue together some plugins. I am using WP because my company dictates the platform for some of our clients (I do not disagree with the choice from a business perspective, WP is quick and cheap to implement). My question is, how can I make my next WordPress project interesting? I want to advance my understanding of the fundamentals of programming (aka data structures, algorithms, and caching) but do not see how I can achieve this when coding another WP site. I have a fairly tight understanding of front-end technologies and believe I have made WP do things it was never intended to do. Examples are here and here. Solving front-end related problems is not as interesting as coding a full stack application. Any advice will help.

    Read the article

  • Same script working on one site but not on the other!

    - by Tioneb
    Hello, first of all I apologize in advance for this question, which is a bit outside the range of Stack Overflow, but I've spent a day trying to solve this issue and I'm totally stuck. The issue: the search function of my script (PHP) works perfectly fine on one host but not on the other. If you search for something here: edu-cafe.com, you'll get a result, just as it should be. However, try a search on this site, hosted somewhere else: code-reduc.com - exact same script, files and database - and it just hangs. I've asked both the host and the original programmer of the script to look at the issue but they can't seem to find an answer... Obviously the cause of my trouble comes from the host, but I can't find the issue. Any bit of help would be hugely appreciated! PS: part of the script is here: http://codepaste.net/fuymqn Thanks!
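
    When identical code hangs on only one host, a first hedged step is to make that host show its errors and fail fast instead of spinning. The settings, credentials and query below are placeholders (and mysqli is assumed), since the script's real connection code isn't visible here:

        <?php
        // Temporary diagnostics only - remove once the difference between hosts is found.
        ini_set('display_errors', '1');
        error_reporting(E_ALL);
        set_time_limit(20); // turn a silent hang into a visible timeout error

        // Placeholders standing in for the script's real database settings and query.
        $host = 'localhost'; $user = 'dbuser'; $pass = 'secret'; $db = 'mydb';
        $searchSql = 'SELECT 1'; // stand-in for the real search query

        $link = mysqli_connect($host, $user, $pass, $db);
        if (!$link) {
            die('Connect failed: ' . mysqli_connect_error());
        }
        $result = mysqli_query($link, $searchSql);
        if (!$result) {
            die('Query failed: ' . mysqli_error($link));
        }
        echo 'Query ran in ' . round(microtime(true) - $_SERVER['REQUEST_TIME'], 2) . 's';

    If the slow host then reports a timeout on the query, comparing PHP versions, max_execution_time and the MySQL configuration between the two hosts is the next step; differences in DNS resolution of the database hostname are another classic cause of one-host-only hangs.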

    Read the article

  • Site works perfectly in Mozilla but not in IE - is my JS file not compatible with IE?

    - by Bonkers
    I'm working on a site written in PHP/MySQL. We have a form to reserve time on a calendar and it works great in Mozilla and stores the reservation in our database, but in IE you fill out the form, click the "Reserve" button to submit it, and nothing happens. All I can think of is that my JavaScript is not working in IE. I have these lines in my .js file: resLenT = document.getElementById(resLenElem); resLenI = resLenT.selectedIndex; resLen = resLenI + 1; where resLenElem is a drop-down box. These are the only lines I can think of at the moment that might be causing trouble in IE. Does this all sound like I'm on the right track, or am I way off base?
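
    A hedged sketch for narrowing this down in IE - the element ID is a placeholder and the real page's markup isn't visible here, so treat it as a diagnostic scaffold rather than a fix:

        // Surface script errors that IE may otherwise report only as a tiny status-bar icon.
        window.onerror = function (msg, url, line) {
            alert('JS error: ' + msg + ' (' + url + ':' + line + ')');
            return false;
        };

        var resLenElem = 'reservationLength';            // placeholder for the real element ID
        var resLenT = document.getElementById(resLenElem);
        if (resLenT) {
            var resLen = resLenT.selectedIndex + 1;      // same arithmetic as the original lines
            alert('Selected length: ' + resLen);         // quick visual check when testing in IE
        } else {
            alert('No element with id "' + resLenElem + '" - check the markup uses id, not just name.');
        }

    Two other frequent Firefox-works/IE-doesn't culprits from that era: a trailing comma in an object or array literal, which older IE rejects as a syntax error, and a form control named "submit", which breaks form.submit() calls.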

    Read the article

  • How can I add Active Directory security groups to a SharePoint site to control permissions, rather than individual user accounts?

    - by user574811
    SharePoint does integrate Active Directory accounts, of course, but what about security groups? I have a few sites where I'm fairly confident access is going through existing Active Directory (AD) security groups (i.e. only an AD security group has been granted permissions through 'People and Groups'). In another situation, where I created the AD group and granted it permissions to a site, the customers were not able to access it immediately. Eventually I had to fast-track it and add the individuals to People and Groups to keep the project going, but I'm hoping not to have to maintain it that way. Are there any specific requirements for the security group in AD - universal, global, or domain local? And is there any delay between modifying group members in AD and having that take effect in SharePoint?

    Read the article

  • Optimal ASP.Net cache duration for a large site?

    - by HeroicLife
    I've read lots of material on how to do ASP.Net caching but little on the optimal duration that pages should be cached for. Let's say that I have a popular site with 50,000 pages. The content does not change frequently, so I could cache pages for up to an hour if I wanted. The server has 16 GB of RAM, but database connections are limited. How long should pages be cached for? My thinking is that if I set the cache duration too high (let's say 60 minutes), I will fill up memory with a fraction of the total content, which will continually be shuffled in and out of memory. Furthermore, let's say that 10% of the pages are responsible for 90% of traffic. If the popular pages are hit every second, and the unpopular ones every hour, then a 60 second cache would only keep the load-intensive content cached without sacrificing freshness. Should numerous but rarely-accessed content be cached at all?
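
    For reference, a hedged sketch of the per-page directive being discussed - the 60-second figure is the example from the question, not a recommendation, and the VaryBy settings depend on how the 50,000 pages are parameterised:

        <%@ Page Language="C#" %>
        <%@ OutputCache Duration="60" VaryByParam="none" %>

    Output-cache entries also compete with everything else in the ASP.NET cache and can be evicted early under memory pressure, so a long Duration is an upper bound rather than a guarantee. That generally argues for a short duration that keeps the hot 10% of pages warm, and for letting the rarely-hit long tail fall out of cache (or not caching it at all) rather than trying to hold all 50,000 pages in 16 GB.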

    Read the article

  • Windows CE 6.0 loses Windows credentials when viewing a web site that's running on Windows 2008 server

    - by gnomixa
    When a user views a web page (with Integrated Windows Authentication) on a Windows CE 6.0 device, the authentication is lost sporadically. The page being viewed is running on Windows Server 2008. We never had the same issue with Windows Server 2003: the credentials were asked for once and cached for a certain time. My question is: has anything changed in Windows Server 2008 that doesn't pass the credentials to Windows CE the same way? The only variable in this scenario is the web server OS - Windows 2003 vs. Windows 2008. Any help would be appreciated, thanks!

    Read the article

  • Set up a LAN to serve web pages and VoIP, and access the web site from inside the LAN by domain name

    - by Mauricio Arias
    I'd like to know if this will work. I have my domain and I'm serving a web page to the internet with nginx, but if I type my domain into my laptop inside the LAN I get my modem/router's configuration page; I cannot reach the web server unless I type its IP address. I would like to add a BIND server behind the modem/router (with ports 80 and 5060 port-forwarded): if the request is for www.mydomain.com, BIND should resolve it to the nginx server's IP address so the page is served; if it is a VoIP request, it should go to the VoIP server; and if I access the web site from inside the LAN, I'd like to be able to just type mydomain.com. Could I do this with this configuration? Do I need something else? Thanks in advance!
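
    Yes - this is essentially split-horizon DNS: LAN clients use an internal resolver that answers with private addresses, while the outside world keeps resolving the public IP. A minimal, hedged BIND zone sketch for the internal view (all names and addresses are placeholders, and a matching zone statement in named.conf is still needed):

        ; internal zone file for mydomain.com - placeholder addresses
        $TTL 3600
        @       IN  SOA  ns1.mydomain.com. admin.mydomain.com. (
                         2024010101 3600 900 604800 3600 )
        @       IN  NS   ns1.mydomain.com.
        ns1     IN  A    192.168.1.10    ; the BIND box itself
        @       IN  A    192.168.1.20    ; internal nginx server
        www     IN  A    192.168.1.20
        voip    IN  A    192.168.1.30    ; internal VoIP/SIP server

    LAN clients then need to use the BIND box as their resolver (via the router's DHCP settings, or statically). Note that DNS only maps names to addresses - it can't steer traffic by port - so the router's port forwards (80 to nginx, 5060 to the VoIP box) still do that job for outside traffic. If the router supports hairpin/NAT loopback, enabling that instead avoids running internal DNS at all.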

    Read the article

  • Is there a Firefox or Chrome plugin, or a standalone program, for monitoring site usage and search queries?

    - by Leigh Caldwell
    I'm running some research on how people search the web for specific types of information. I'd like to be able to set them up with a laptop and browser and then record a history of what they search for and what sites they visit. A Firefox or Chrome plugin would be ideal, but a standalone program is fine too. It doesn't need to be free, just quick and reliable. It doesn't need to be a general PC monitoring program (though that would be OK too) - it's only Web usage I need to track. I've found a few on the Web but am not sure which ones to trust. Your recommendations would be much appreciated.

    Read the article

  • Using dnsmasq for accessing multiple nameservers assigned by DHCP

    - by Ash
    At my work desktop running openSUSE 11.4, I have a local network which gets its address, domain (work.site) and nameservers (10.100.1.1, 10.100.1.2) through DHCP - these get written into /etc/resolv.conf. I access the internet via the work network, and these two nameservers end up answering any public domain name lookups. I also have a private VPN that I connect to. The nameserver (10.111.1.1) and domain (private.site) for that network rarely change, but currently they're pushed by the OpenVPN client into NetworkManager, and they also get merged into the existing /etc/resolv.conf. My resolv.conf ultimately ends up looking like this:

        search private.site work.site
        nameserver 127.0.0.1
        nameserver 10.111.1.1
        nameserver 10.100.1.1

    As you can see, the second nameserver from my work network was pushed out because of the three-entry limit. That is still fine, but it would be a problem if the remaining nameserver went down for maintenance or something. So I found out that dnsmasq could help me here, and I set up dnsmasq purely as a local DNS resolver, without any DHCP support. This is my /etc/dnsmasq.conf right now:

        resolv-file=/etc/resolv.conf
        server=/private.site/10.111.1.1
        server=/1.111.10.in-addr.arpa/10.111.1.1
        listen-address=127.0.0.1
        bind-interfaces
        log-queries

    I've made dnsmasq take its list of nameservers from /etc/resolv.conf, since NetworkManager seems to be updating that list correctly (up to the maximum of three nameservers). I'm able to resolve the host names in both networks correctly. So these are my questions: Is there a way to make either NetworkManager or dhclient write the list of nameservers somewhere else, which I can then point dnsmasq at via resolv-file? And how do I make dnsmasq use certain nameservers as the default for all queries? Right now I notice that lookups for public domains on the internet are usually sent to both sets of nameservers - the work.site ones as well as private.site's. It would be good if I could limit this to work.site only.
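
    A hedged sketch of one way to get that defaulting behaviour (addresses copied from the question; whether to pin the query order is a separate choice):

        # /etc/dnsmasq.conf - sketch, not a drop-in replacement
        no-resolv                                   # ignore /etc/resolv.conf entirely
        server=10.100.1.1                           # work.site resolvers answer everything by default
        server=10.100.1.2
        server=/private.site/10.111.1.1             # only private.site goes to the VPN resolver
        server=/1.111.10.in-addr.arpa/10.111.1.1    # ...and its reverse zone
        #strict-order                               # uncomment to query servers strictly in this order
        listen-address=127.0.0.1
        bind-interfaces
        log-queries

    With no-resolv and explicit server= lines, dnsmasq no longer depends on what NetworkManager writes to /etc/resolv.conf, which sidesteps the first question; the trade-off is that a change in the work DHCP-assigned nameservers means editing this file. openSUSE's netconfig can reportedly also be told to hand DNS settings to a local forwarder, but that path is distribution-specific and worth verifying before relying on it.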

    Read the article

  • Having XP VM use my host OSX ssh tunnel to connect to a remote site?

    - by Manachi
    I am using Mac OS X and have Windows XP running in VMware Fusion. I'm creating an SSH tunnel from OS X to a remote server, and then trying to have Windows XP use that tunnel (I actually use a program called Proxifier on XP to send my XP MS SQL Server traffic through the tunnel). Note that I can successfully create an SSH tunnel (on port 9333) from PuTTY on XP to the remote host, have SQL Server proxied through that tunnel, and it all works correctly. However, when I try to set up the tunnel on OS X and point Proxifier in XP at the OS X tunnel instead of localhost, it doesn't seem to connect. Here is the OS X command I'm using to create the tunnel: ssh -i /my/key -p 9001 -D 9333 -g me@remotehostname. Then I set Proxifier on XP to point to macosxhostname:9333 (instead of the previous localhost:9333, which worked correctly when using PuTTY). Any suggestions on what I may have missed? The XP firewall is turned off while I set this up.
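
    A couple of hedged checks. The -D option accepts an explicit bind address, which removes any doubt about whether the SOCKS listener is reachable from hosts other than the Mac itself; the hostnames below are the ones from the question:

        # listen on all interfaces of the OS X host rather than relying on -g
        ssh -i /my/key -p 9001 -D 0.0.0.0:9333 me@remotehostname

        # from the XP VM, confirm the listener is reachable before involving Proxifier
        telnet macosxhostname 9333

    Also worth checking: if Fusion is in NAT mode, the VM usually reaches the host via the host's address on the VMware virtual network (the vmnet8 side) rather than its LAN hostname, and the OS X application firewall may need to allow incoming connections for ssh.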

    Read the article

  • How can I edit an individual site's virtualhost using either WHM or cPanel?

    - by user55578
    I've been using Webmin/virtualmin for years. In Webmin, I can edit Apache config files quite easily. For example, if a user wants to change the DocumentRoot because he/she wants to serve up a Ruby on Rails app using Phusion Passenger, I can do that in a few seconds using the Webmin GUI. /etc/apache/sites-available/samplesite.com.conf Is there something similar in WHM/cPanel? How can I edit the VirtualHost (and inside that, the Document Root), using WHM/cPanel?
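
    WHM doesn't expose a free-form vhost editor, but cPanel supports per-vhost include files that survive httpd.conf rebuilds. A hedged sketch, assuming a standard EasyApache layout - the paths and script names below are the conventional ones, but verify them against your cPanel version:

        # /usr/local/apache/conf/userdata/std/2/USERNAME/DOMAIN.TLD/rails.conf
        # (std = non-SSL vhost, 2 = Apache 2.x; contents are injected into that <VirtualHost>)
        DocumentRoot /home/USERNAME/railsapp/public

        # then, as root, pull the include in and regenerate the main config:
        /scripts/ensure_vhost_includes --user=USERNAME
        /scripts/rebuildhttpdconf
        /scripts/restartsrv_httpd

    Because the include is merged near the end of the generated <VirtualHost>, a DocumentRoot set there overrides the one cPanel wrote, which is the usual way to point a cPanel domain at a Rails/Passenger public directory.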

    Read the article

  • Is there a web-based VNC client that I can use from my own web site? [closed]

    - by Eliot Solomon
    I'm interested in being able to control one of my computers remotely, without having to download and install a bunch of software on every machine I want to access my computer from. Is there a VNC client or RDP software that can be embedded in a web page? I realize this seems like it might be a web application based question, but the web application portion would only end up being one small portion of the whole solution. Would VNC really be the way to go for this?

    Read the article

  • How can I use Varnish to generate a robots.txt file, even for subdomains of the same site?

    - by Sam
    I want to generate a robots.txt file using Varnish 2.1, so that both domain.com/robots.txt and subdomain.domain.com/robots.txt are served by Varnish. The robots.txt content must be hardcoded into the default.vcl file. Is that possible? I know Varnish can generate a maintenance page on error; I'm trying to make it generate a robots.txt file in the same way. Can anyone help? This is the error handler I have so far:

        sub vcl_error {
            set obj.http.Content-Type = "text/html; charset=utf-8";
            synthetic {"
                <?xml version="1.0" encoding="utf-8"?>
                <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
                    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
                <html>
                  <head><title>Maintenance in progress</title></head>
                  <body><h1>Maintenance in progress</h1></body>
                </html>
            "};
            return (deliver);
        }
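
    A hedged sketch of the usual Varnish 2.x pattern: intercept the URL in vcl_recv, raise a private status code, and synthesize the body in vcl_error. The 702 code and the robots rules are arbitrary examples:

        sub vcl_recv {
            # req.url has no host part, so this matches /robots.txt on every (sub)domain
            if (req.url == "/robots.txt") {
                error 702 "robots";
            }
        }

        sub vcl_error {
            if (obj.status == 702) {
                set obj.status = 200;
                set obj.http.Content-Type = "text/plain";
                # the synthetic body is emitted verbatim - strip the leading
                # indentation in the real file so the robots lines start flush left
                synthetic {"User-agent: *
                Disallow:
                "};
                return (deliver);
            }
            # the existing maintenance-page handling continues below for real errors
        }

    Varnish concatenates multiple definitions of the same subroutine, so these can sit alongside the existing vcl_recv and vcl_error blocks in default.vcl.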

    Read the article

  • VMWare Newbie - looking for hardware recommendations and help :) [closed]

    - by Dan
    I am looking for some hardware recommendations for an upcoming virtualization project. We are a small company (80 users - 25 in site 1, 55 in site 2) currently using Windows Server 2003, with no VM servers yet. Our AD is set up so that site 1 is the root domain and site 2 is a subdomain/subnet, connected by T1 and VPN for failover. The current DCs also serve as file servers, print servers and antivirus servers. Email is in the cloud. In site 1 we additionally have three member servers: one running IBM WebSphere for a customer-specific app, one running Infor PowerLink (no real heavy load), and another that we use for Visual Studio apps and that also runs DirSync for Exchange Online. No heavy workloads on any of these machines, really. We also have an AS/400 box that runs our ERP/CRM software, which site 2 connects to over the WAN link. In site 2 we also have a SQL machine running on Windows 2000 Server; the database files are not large - less than 5 GB - with a light to medium workload. File servers in each site store less than 500 GB of data and probably won't grow to more than 1 TB in the next five years. I am looking to go to VMware in both sites and virtualize all servers. What recommendations do you have for server and storage hardware? Is it safe to virtualize all of your DCs? Any help or advice would be greatly appreciated. Thanks.

    Read the article

  • Myst 4 still not working on my Mac 10.6 after following instructions from this web site (link provided)

    - by user33675
    I followed the instructions from this link (http://superuser.com/questions/125931/how-do-i-install-myst-4-revelation-on-os-x-10-6) to a T. After I installed the game and tried to play it, I got this message: "Myst 4 Revelation quit unexpectedly". It gave me the chance to reopen it, but every time I tried it did not work. Can anyone help me with this problem? Thanks.

    Read the article

  • How to setup a django site with Cherokee, DynDNS and virtual_env?

    - by e-satis
    I have a Django project running with the dev server, and would like to try running it in a production environment. I wanted to try Cherokee for a change, so I installed it. We don't have a domain name yet, so I set up a DynDNS name along the lines of stuff.gotdns.org. It works fine: I can see the Cherokee welcome page (so red that I first believed I'd got an error :-p). I ran the wizard to create a new virtual server for Django. Now everything is set up, but I get nothing - still the default Cherokee welcome page. What should I do now if I want to go to "http://stuff.gotdns.org" and see my website? And what should I do if I next want to make it available only at "http://project.stuff.gotdns.org"? Important fact: I use virtualenv, so you can't just call Python directly - you have to activate the environment first.
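
    One hedged way to bridge Cherokee and a virtualenv is to run the project under uWSGI using the virtualenv's interpreter and point Cherokee's information source at that socket; the module name, paths and port below are placeholders, and the exact handler wiring depends on what the Cherokee wizard generated. First, a minimal WSGI entry point for an older Django project:

        # project_wsgi.py - minimal WSGI entry point (placeholder settings module name)
        import os
        os.environ['DJANGO_SETTINGS_MODULE'] = 'settings'

        import django.core.handlers.wsgi
        application = django.core.handlers.wsgi.WSGIHandler()

    Then launch it with the virtualenv's own uWSGI, so nothing ever needs to be "activated":

        /path/to/venv/bin/uwsgi --socket 127.0.0.1:3031 \
            --chdir /path/to/project --module project_wsgi:application \
            --home /path/to/venv --master --processes 2

    On the Cherokee side, the virtual server the wizard created also needs its host match set to stuff.gotdns.org (and later project.stuff.gotdns.org), otherwise requests for those names keep falling through to the default welcome-page server.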

    Read the article

  • Building a Social Networking website, should I have separate servers for certain parts of the site? [closed]

    - by Dylan Cross
    I have been working on building a social networking website; I'm pretty committed to this and I think I have something that could work out. I hope to launch it on January 1st, so I have a question about server setup. Facebook serves its photos (and I don't know what else) from separate domain names/servers, and I assume that doing this helps spread the server load out. So I am wondering whether it would make a very big difference in speed if I kept my main server for basically everything, but had another server that the photos are stored on and accessed from, the same way Facebook does it.
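
    A minimal, hedged sketch of the usual split, assuming nginx on a dedicated media host and a separate cookieless domain - every name and path here is a placeholder:

        # nginx on the media host - serves nothing but static photo files
        server {
            listen 80;
            server_name img.example-social.com;   # placeholder cookieless domain

            root /var/www/photos;
            expires 30d;                          # let browsers cache images aggressively
            access_log off;                       # skip logging to reduce disk I/O

            location / {
                try_files $uri =404;
            }
        }

    The application then stores uploads on that host (or on storage both servers can reach) and emits image URLs pointing at the media domain. The gain early on is less about raw speed than about keeping image bandwidth, cookies and disk I/O off the application server - and since the split is easy to add later, launching on a single server and separating photos once traffic justifies it is also a reasonable plan.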

    Read the article

  • How can I disable Kerberos authentication for only the root of my site?

    - by petRUShka
    I have Kerberos-based authentication and I want to disable it for only the root URL, http://mysite.com/, while it keeps working fine on every other page, like http://mysite.com/page1. I have this in my .htaccess:

        AuthType Kerberos
        AuthName "Domain login"
        KrbAuthRealms DOMAIN.COM
        KrbMethodK5Passwd on
        Krb5KeyTab /etc/httpd/httpd.keytab
        require valid-user

    I want to turn it off only for the root URL. As a workaround it would also be possible to turn it off in the virtual host config instead of .htaccess; unfortunately I don't know how to do that either. Part of my vhost.conf:

        <Directory /home/user/www/current/public/>
            Options -MultiViews +FollowSymLinks
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>

    UPD: I'm using Apache/2.2.3 (Linux/SUSE). I tried this version of the .htaccess:

        SetEnvIf Request_URI ^/$ rootdir=1
        Allow from env=rootdir
        Satisfy Any
        AuthType Kerberos
        AuthName "Domain login"
        KrbAuthRealms DOMAIN.COM
        KrbMethodK5Passwd on
        Krb5KeyTab /etc/httpd/httpd.keytab
        require valid-user

    Unfortunately, this config turns the Kerberos authentication off for all URLs. I also tried placing those first three lines after the main block, but that didn't help either.
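
    A hedged sketch of one common Apache 2.2 approach: leave the Kerberos block as the directory-wide default and carve out only the root URL in the vhost config, where Location sections are merged after the .htaccess rules. The earlier .htaccess attempt most likely failed because Satisfy Any applied to the whole directory, so the blanket "Allow from all" satisfied every request, not just /:

        # vhost.conf - placed alongside the existing <Directory> block
        <LocationMatch "^/$">
            Order allow,deny
            Allow from all
            Satisfy Any        # the access rule alone suffices here, so no Kerberos prompt for /
        </LocationMatch>

    Because the Satisfy Any is scoped to the LocationMatch, every other URL still falls through to the "require valid-user" from the .htaccess.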

    Read the article

  • Is there an apache module to slow down site scans?

    - by florin
    I am administering a few web servers. Each night, random hosts from the Internet are probing them for various vulnerabilities in php, phpadmin, horde, mysqladmin, etc. Is there a way (apache plugin?) to slow down the rate of attack? For SSH, I have a rate limiting rule on the firewall, which does not allow more than three connections per minute. But I don't want to rate limit all HTTP access, only the access that returns 404s. Is there such an apache module?
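
    mod_evasive and mod_qos can throttle generic request floods, but to react specifically to requests that return 404s, a hedged fail2ban setup is closer to the ask - fail2ban ships an apache-noscript filter that matches "File does not exist" lines in the error log. The paths and thresholds below are examples:

        # /etc/fail2ban/jail.local - sketch
        [apache-noscript]
        enabled  = true
        port     = http,https
        filter   = apache-noscript
        logpath  = /var/log/apache2/error.log
        maxretry = 6
        findtime = 600
        bantime  = 3600

    This drops a scanner at the firewall after a handful of misses rather than slowing every request down, which also keeps Apache itself out of the rate-limiting business, much like the existing SSH rule.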

    Read the article

  • RRAS Problem routing to central site from RRAS server only?

    - by TomTom
    Given: an office connected to headquarters by an RRAS bridge (two virtual machines using RRAS to route between the two networks). Naming: the office is A, and a-lnk is the RRAS machine in A; headquarters is B, and b-lnk is the RRAS machine there. The VPN works perfectly - machines can ping and work between the sites, domain controllers on both ends are replicating, DFS is working, remote desktop is working. All in all, everything is fine, EXCEPT: a-lnk itself cannot reach any machine in B. This would normally not be troublesome (no one ever does anything on a-lnk), but there are two exceptions: a-lnk is supposed to get its license from a KMS in B, so not being able to reach B means its activation is not renewing; and a-lnk is supposed to pull updates from a WSUS in B, so not being able to reach B means no updates. Given that things work (and security is a minor issue - a-lnk is not reachable from the internet as it is behind NAT hardware anyway), this went unhandled for months; I just want to get this item ticked off now. Anyone have an idea what this is? It definitely is not a "DNS does not work" or "routing in general is bad" item, as any computer in A can connect to any computer in B and the other way around - only the RRAS computer itself seems to do something really awkward. Platform for both: 2008 R2 Standard.
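
    A hedged first check, run on a-lnk itself: an RRAS server's own traffic can leave via a different interface, or miss a route, that the machines it routes for get correctly, so comparing its routing table against a working client in A is usually revealing. The addresses below are placeholders for site B's subnet and the tunnel's next hop:

        rem run in an elevated command prompt on a-lnk
        route print

        rem if site B's subnet is absent (or points out the wrong interface),
        rem add a persistent route via the site-to-site interface's gateway
        route add 10.2.0.0 mask 255.255.0.0 10.0.0.1 -p

        rem then test the actual services by port, not just ping
        rem (KMS listens on TCP 1688; the telnet client may need to be added as a feature)
        telnet kms-server.siteb.local 1688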

    Read the article

  • Backup Exec 10 - Network connection to the remote agent has been lost

    - by jherlitz
    Okay, so I have four remote offices, all running off of a 3 Mb Ethernet connection. Two sites are part of a WAN and two sites use their 3 Mb connections over a site-to-site tunnel. I am using Backup Exec 2010 and have the remote agent installed on all the remote servers. For the past few weeks, the backups for the two sites running over the site-to-site tunnel have been failing with the following error message: "The network connection to the Backup Exec Remote Agent has been lost. Check for network errors." We used to run the site-to-site tunnel over a DSL connection; now we have changed to the 3 Mb Ethernet connection. I still have to find out whether it has been failing ever since we changed, or only recently. Backup Exec support is telling me it is a network issue, but my connection to those servers is solid - we don't have any issues or outages - so I am baffled as to why this continues to fail, and why just those two sites. Any advice?
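
    A hedged way to separate "the tunnel is dropping something" from "the agent is misbehaving" is to test the agent's port and the tunnel's effective MTU directly from the media server around the time a job would run. The hostname is a placeholder, and Backup Exec's remote agent commonly listens on TCP 10000, but confirm the port configured in your environment:

        rem from the media server, against one of the failing remote servers
        telnet remote-server.branch.local 10000

        rem large backup streams are sensitive to fragmentation over VPN tunnels -
        rem probe with don't-fragment set and step the size down until it succeeds
        ping remote-server.branch.local -f -l 1400

    If pings with the don't-fragment bit fail anywhere near the tunnel's expected MTU, lowering the MTU (or enabling MSS clamping) on the tunnel endpoints is a common fix for long transfers that die mid-stream while ordinary traffic looks fine.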

    Read the article

  • Any way to stop people from img "framing" your site?

    - by Yegor
    Someone was trying to get cute with me by "iframing" my search result page via an IMG tag with zero width and zero height, in the hope of burning my server resources. My searches are cached, so it doesn't do much damage, since it's just a static file being served, but I was wondering if there is anything I can do to "fight back". I know you can use a frame breaker had it been an iframe - is there anything to do in the case of an image?
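
    A frame-buster doesn't help here because the browser never runs the page's JavaScript when it is requested as an image source, but a hedged Referer check can refuse to serve the search pages to third-party pages that embed them. The domain and path below are placeholders, and empty Referers are deliberately allowed through so direct visitors and privacy-minded browsers aren't blocked:

        # .htaccess sketch at the document root
        RewriteEngine On
        RewriteCond %{HTTP_REFERER} !^$
        RewriteCond %{HTTP_REFERER} !^https?://(www\.)?yoursite\.example/ [NC]
        RewriteRule ^search - [F]

    Requests triggered by the embedding page's visitors arrive with that page's URL as the Referer, so they get a small 403 instead of a full search response; a determined attacker can still hit the URL directly, but for the drive-by IMG-tag case this (plus the caching already in place) is usually enough.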

    Read the article
