Search Results

Search found 37688 results on 1508 pages for 'site search'.


  • Apache2 mod_python: IOError: Write failed, client closed connection.

    - by llazzaro
    This is the error:

    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XXX.248.60] mod_python (pid=9528, interpreter='realpage.com', phase='PythonHandler', handler='django.core.handlers.modpython'): Application error
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XXX.248.60] ServerName: 'realpage.dom'
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XXX.248.60] DocumentRoot: '/htdocs'
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XXX.248.60] URI: '/'
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XXX.248.60] Location: '/'
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XX.248.60] Directory: None
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XXX.248.60] Filename: '/htdocs'
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XXX.248.60] PathInfo: '/'
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XXX.248.60] Traceback (most recent call last):
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XXX.248.60]   File "/usr/lib/python2.5/site-packages/mod_python/importer.py", line 1537, in HandlerDispatch
        default=default_handler, arg=req, silent=hlist.silent)
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XXX.248.60]   File "/usr/lib/python2.5/site-packages/mod_python/importer.py", line 1229, in _process_target
        result = _execute_target(config, req, object, arg)
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XXX.248.60]   File "/usr/lib/python2.5/site-packages/mod_python/importer.py", line 1128, in _execute_target
        result = object(arg)
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XXX.248.60]   File "/usr/lib/python2.5/site-packages/django/core/handlers/modpython.py", line 228, in handler
        return ModPythonHandler()(req)
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XXX.248.60]   File "/usr/lib/python2.5/site-packages/django/core/handlers/modpython.py", line 220, in __call__
        req.write(chunk)
    [Mon Mar 01 12:19:50 2010] [error] [client XXX.XX.248.60] IOError: Write failed, client closed connection.

    Please tell me what additional information you need in order to find the bug, and how to get it. The error is thrown every time!

    Read the article

  • Windows 7 SP1 not being offered on Windows Update

    - by Ian Boyd
    I have no option to install Windows 7 Service Pack 1 (SP1) on my computer. Why is the option to install Windows 7 SP1 missing from Windows Update? I'm less interested in why the option is missing, and more interested in how to diagnose why the option to install Windows 7 SP1 is being hidden. Following the suggestions in KB2498452 ("You do not have the option of downloading Windows 7 SP1 when you use Windows Update to check for updates"):

    1. Confirm that Windows 7 SP1 is not already installed and that you are not running a prerelease version of Windows 7 SP1. I am not already running SP1, or a pre-release SP1.

    2. Check for pending updates: update 976902 may have to be installed on your computer before Windows 7 SP1 will be offered in Windows Update. I already have 976902 installed.

    3. Verify that an incompatible version of SafeCentral is not installed on your computer: Windows 7 SP1 may not appear in Windows Update if certain versions of SafeCentral are installed. SafeCentral is a security program manufactured by SafeCentral, Inc. I do not have SafeCentral installed (I've never heard of such a thing).

    4. Check whether you have the Intel integrated graphics driver Igdkmd32.sys or Igdkmd64.sys and whether you upgraded the driver. I do not have an Intel GMA.

    5. Make sure that you did not use vLite to customize your Windows 7 installation. I did not use vLite to customize my Windows 7 installation; again, I've never heard of such a thing.

    Update One: Here's proof that I've checked for updates "today" (3/2/2011), and that I'm not being presented the option of installing SP1. (I dispatched an update to Silverlight and a fix for IE9 being hosted in a Direct2D or Direct3D application, so updates themselves do work.)

    Update Two: Tried the Windows Update Troubleshooter; Windows 7 Service Pack 1 is still not available.

    Update Three: Here is the tail end of windowsupdate.log. It speaks of evaluating application rules ("Found 2 updates and 65 categories in search; evaluated appl. rules of 1324 out of 1832 deployed entities"). These must be the rules that say I'm not allowed to see SP1:

    2011-03-03 09:21:08:091 924 db4 AU Triggering AU detection through DetectNow API
    2011-03-03 09:21:08:091 924 db4 AU Triggering Online detection (interactive)
    2011-03-03 09:21:08:091 924 950 AU #############
    2011-03-03 09:21:08:092 924 950 AU ## START ## AU: Search for updates
    2011-03-03 09:21:08:092 924 950 AU #########
    2011-03-03 09:21:08:093 924 950 AU <<## SUBMITTED ## AU: Search for updates [CallId = {8517376A-B8A3-488B-B4D4-67DFC75788C8}]
    2011-03-03 09:21:08:093 924 ca8 Agent *************
    2011-03-03 09:21:08:093 924 ca8 Agent ** START ** Agent: Finding updates [CallerId = AutomaticUpdates]
    2011-03-03 09:21:08:093 924 ca8 Agent *********
    2011-03-03 09:21:08:093 924 ca8 Agent * Online = Yes; Ignore download priority = No
    2011-03-03 09:21:08:093 924 ca8 Agent * Criteria = "IsInstalled=0 and DeploymentAction='Installation' or IsPresent=1 and DeploymentAction='Uninstallation' or IsInstalled=1 and DeploymentAction='Installation' and RebootRequired=1 or IsInstalled=0 and DeploymentAction='Uninstallation' and RebootRequired=1"
    2011-03-03 09:21:08:093 924 ca8 Agent * ServiceID = {7971F918-A847-4430-9279-4A52D1EFE18D} Third party service
    2011-03-03 09:21:08:093 924 ca8 Agent * Search Scope = {Machine}
    2011-03-03 09:21:08:094 924 ca8 Misc Validating signature for C:\Windows\SoftwareDistribution\WuRedir\9482F4B4-E343-43B6-B170-9A65BC822C77\muv4wuredir.cab:
    2011-03-03 09:21:08:097 924 ca8 Misc Microsoft signed: Yes
    2011-03-03 09:21:08:287 924 ca8 Misc Validating signature for C:\Windows\SoftwareDistribution\WuRedir\9482F4B4-E343-43B6-B170-9A65BC822C77\muv4wuredir.cab:
    2011-03-03 09:21:08:289 924 ca8 Misc Microsoft signed: Yes
    2011-03-03 09:21:08:292 924 ca8 Agent Checking for updated auth cab for service 7971f918-a847-4430-9279-4a52d1efe18d at http://download.windowsupdate.com/v9/microsoftupdate/redir/muauth.cab
    2011-03-03 09:21:08:292 924 ca8 Misc Validating signature for C:\Windows\SoftwareDistribution\AuthCabs\authcab.cab:
    2011-03-03 09:21:08:294 924 ca8 Misc Microsoft signed: Yes
    2011-03-03 09:21:08:354 924 ca8 Misc Validating signature for C:\Windows\SoftwareDistribution\AuthCabs\authcab.cab:
    2011-03-03 09:21:08:356 924 ca8 Misc Microsoft signed: Yes
    2011-03-03 09:21:08:356 924 ca8 Setup Checking for agent SelfUpdate
    2011-03-03 09:21:08:356 924 ca8 Setup Client version: Core: 7.3.7600.16385 Aux: 7.3.7600.16385
    2011-03-03 09:21:08:357 924 ca8 Misc Validating signature for C:\Windows\SoftwareDistribution\WuRedir\9482F4B4-E343-43B6-B170-9A65BC822C77\muv4wuredir.cab:
    2011-03-03 09:21:08:359 924 ca8 Misc Microsoft signed: Yes
    2011-03-03 09:21:08:418 924 ca8 Misc Validating signature for C:\Windows\SoftwareDistribution\WuRedir\9482F4B4-E343-43B6-B170-9A65BC822C77\muv4wuredir.cab:
    2011-03-03 09:21:08:420 924 ca8 Misc Microsoft signed: Yes
    2011-03-03 09:21:08:422 924 ca8 Misc Validating signature for C:\Windows\SoftwareDistribution\SelfUpdate\wuident.cab:
    2011-03-03 09:21:08:424 924 ca8 Misc Microsoft signed: Yes
    2011-03-03 09:21:08:655 924 ca8 Misc Validating signature for C:\Windows\SoftwareDistribution\SelfUpdate\wuident.cab:
    2011-03-03 09:21:08:658 924 ca8 Misc Microsoft signed: Yes
    2011-03-03 09:21:08:659 924 ca8 Setup Skipping SelfUpdate check based on the /SKIP directive in wuident
    2011-03-03 09:21:08:659 924 ca8 Setup SelfUpdate check completed. SelfUpdate is NOT required.
    2011-03-03 09:21:08:808 924 ca8 Misc Validating signature for C:\Windows\SoftwareDistribution\WuRedir\7971F918-A847-4430-9279-4A52D1EFE18D\muv4muredir.cab:
    2011-03-03 09:21:08:810 924 ca8 Misc Microsoft signed: Yes
    2011-03-03 09:21:08:872 924 ca8 Misc Validating signature for C:\Windows\SoftwareDistribution\WuRedir\7971F918-A847-4430-9279-4A52D1EFE18D\muv4muredir.cab:
    2011-03-03 09:21:08:874 924 ca8 Misc Microsoft signed: Yes
    2011-03-03 09:21:08:876 924 ca8 PT +++++++++++ PT: Synchronizing server updates +++++++++++
    2011-03-03 09:21:08:877 924 ca8 PT + ServiceId = {7971F918-A847-4430-9279-4A52D1EFE18D}, Server URL = https://www.update.microsoft.com/v6/ClientWebService/client.asmx
    2011-03-03 09:21:13:958 924 ca8 Misc Validating signature for C:\Windows\SoftwareDistribution\WuRedir\7971F918-A847-4430-9279-4A52D1EFE18D\muv4muredir.cab:
    2011-03-03 09:21:13:960 924 ca8 Misc Microsoft signed: Yes
    2011-03-03 09:21:14:083 924 ca8 Misc Validating signature for C:\Windows\SoftwareDistribution\WuRedir\7971F918-A847-4430-9279-4A52D1EFE18D\muv4muredir.cab:
    2011-03-03 09:21:14:085 924 ca8 Misc Microsoft signed: Yes
    2011-03-03 09:21:14:087 924 ca8 PT +++++++++++ PT: Synchronizing extended update info +++++++++++
    2011-03-03 09:21:14:087 924 ca8 PT + ServiceId = {7971F918-A847-4430-9279-4A52D1EFE18D}, Server URL = https://www.update.microsoft.com/v6/ClientWebService/client.asmx
    2011-03-03 09:21:14:395 924 ca8 Agent * Added update {414642E2-5F20-4AD1-AA5A-773061238B5F}.101 to search result
    2011-03-03 09:21:14:395 924 ca8 Agent * Added update {56D5FC3D-9AC8-44F1-A248-8C397A24D02F}.100 to search result
    2011-03-03 09:21:14:395 924 ca8 Agent * Found 2 updates and 65 categories in search; evaluated appl. rules of 1324 out of 1832 deployed entities
    2011-03-03 09:21:14:396 924 ca8 Agent *********
    2011-03-03 09:21:14:396 924 ca8 Agent ** END ** Agent: Finding updates [CallerId = AutomaticUpdates]
    2011-03-03 09:21:14:396 924 ca8 Agent *************
    2011-03-03 09:21:14:404 924 ce0 AU >>## RESUMED ## AU: Search for updates [CallId = {8517376A-B8A3-488B-B4D4-67DFC75788C8}]
    2011-03-03 09:21:14:404 924 ce0 AU # 2 updates detected
    2011-03-03 09:21:14:404 924 ce0 AU #########
    2011-03-03 09:21:14:404 924 ce0 AU ## END ## AU: Search for updates [CallId = {8517376A-B8A3-488B-B4D4-67DFC75788C8}]
    2011-03-03 09:21:14:404 924 ce0 AU #############
    2011-03-03 09:21:14:404 924 ce0 AU Successfully wrote event for AU health state:0
    2011-03-03 09:21:14:405 924 ce0 AU #############
    2011-03-03 09:21:14:405 924 ce0 AU ## START ## AU: Refresh featured updates info
    2011-03-03 09:21:14:405 924 ce0 AU #########
    2011-03-03 09:21:14:405 924 ce0 AU No featured updates available.
    2011-03-03 09:21:14:405 924 ce0 AU #########
    2011-03-03 09:21:14:405 924 ce0 AU ## END ## AU: Refresh featured updates info
    2011-03-03 09:21:14:405 924 ce0 AU #############
    2011-03-03 09:21:14:405 924 ce0 AU No featured updates notifications to show
    2011-03-03 09:21:14:405 924 ce0 AU AU setting next detection timeout to 2011-03-04 08:03:53
    2011-03-03 09:21:14:405 924 ce0 AU Setting AU scheduled install time to 2011-03-04 08:00:00
    2011-03-03 09:21:14:405 924 ce0 AU Successfully wrote event for AU health state:0
    2011-03-03 09:21:14:406 924 ce0 AU Successfully wrote event for AU health state:0
    2011-03-03 09:21:14:407 924 db4 AU Getting featured update notifications. fIncludeDismissed = true
    2011-03-03 09:21:14:408 924 db4 AU No featured updates available.
    2011-03-03 09:21:19:396 924 ca8 Report REPORT EVENT: {633538B3-030E-4CAD-BE6B-33C6ED65AFF1} 2011-03-03 09:21:14:395-0500 1 147 101 {00000000-0000-0000-0000-000000000000} 0 0 AutomaticUpdates Success Software Synchronization Windows Update Client successfully detected 2 updates.
    2011-03-03 09:21:19:396 924 ca8 Report CWERReporter finishing event handling. (00000000)

    I'm less interested in why the option to install Windows 7 SP1 is missing, and more interested in how to diagnose why it is being hidden. The KB article says that SP1 will not be offered if your machine doesn't meet some secret special criteria. How can I discover what those secret criteria are? I presume they are logged somewhere. Nor am I particularly interested in a direct download link; I want to learn here. I want to be able to diagnose (i.e., in the future) why an update is not being offered. I'm a superuser here. Rather than others coming up with a checklist of things to try, I want to be able to come up with the checklist.

    Read the article

  • SharePoint 2007 Event ID 6482

    - by Dave M
    Our two-server SharePoint 2007 SP2 farm has an issue: Event ID 6482 appears in the Application log of the web front end many times a day, often many times a minute. The full error, from Office SharePoint Server, is:

    Event Type: Error
    Event Source: Office SharePoint Server
    Event Category: Office Server Shared Services
    Event ID: 6482
    Date: 11/12/2009
    Time: 3:05:22 PM
    User: N/A
    Computer: XXXXXX
    Description: Application Server Administration job failed for service instance Microsoft.Office.Server.Search.Administration.SearchServiceInstance (36a9b7ef-59aa-4f94-8887-8bf7b56f2f91).
    Reason: Error during encryption or decryption. System error code 0.
    Technical Support Details:
    System.ArgumentException: Error during encryption or decryption. System error code 0.
       at Microsoft.Office.Server.Search.Administration.SearchServiceInstance.SynchronizeDefaultContentSource(IDictionary applications)
       at Microsoft.Office.Server.Search.Administration.SearchServiceInstance.Synchronize()
       at Microsoft.Office.Server.Administration.ApplicationServerJob.ProvisionLocalSharedServiceInstances(Boolean isAdministrationServiceJob)
    For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

    The SharePoint site appears to be functioning normally and Search returns expected results. Any suggestions would be appreciated.

    Read the article

  • Twitter API Voting System

    - by Richard Jones
    So I blatantly got this idea from the MIX 10 event. At MIX they held a rock band talent competition type thing (I'm not quite sure of all the details), but the interesting part for me is how they collected votes. They used Twitter (what else, when you have a few thousand geeks available to you). The basic idea was that you tweeted your vote with a # tag, e.g. "#ROCKBANDVOTE vote Richard". How cool... So the question is, how do you write something to collate and count all the votes? Time to press the magic Visual Studio New Project button. Twitter has a really nice API that can be invoked from .NET. This is the snippet of code that will search for any given phrase, e.g. #ROCKBANDVOTE:

      using System.Net;
      using System.Xml;

      public static XmlDocument GetSearchResults(string searchfor)
      {
          return GetSearchResults(searchfor, "");
      }

      public static XmlDocument GetSearchResults(string searchfor, string sinceid)
      {
          XmlDocument retdoc = new XmlDocument();
          try
          {
              // encode the query so the "#" in a hashtag survives the URL
              string url = "http://search.twitter.com/search.atom?q=" + Uri.EscapeDataString(searchfor);
              if (sinceid.Length > 0)
                  url += "&since_id=" + sinceid;
              HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
              request.Method = "GET"; // search.atom is fetched with a plain GET
              WebResponse res = request.GetResponse();
              retdoc.Load(res.GetResponseStream());
              res.Close();
          }
          catch
          {
              // swallow errors and return an empty document
          }
          return retdoc;
      }

    I've got two overloads that optionally let you pass in the last ID to look for, as well as what you want to search for. Note that Twitter rate-limits the number of requests you can send; see http://apiwiki.twitter.com/Rate-limiting. So realistically I wanted my app to run every hour or so and only pull out results that haven't been received before (hence the overload to pass in the sinceid parameter). I'll post the code when finished that parses the returned XML.
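    The parsing step promised above isn't in the post; as a rough sketch (not the author's final code; the method name and the exact tweet format are assumptions), the returned Atom feed could be tallied like this:

      using System;
      using System.Collections.Generic;
      using System.Xml;

      // Tally votes out of the Atom feed returned by GetSearchResults.
      // Assumes tweets of the form "#ROCKBANDVOTE vote <name>".
      public static Dictionary<string, int> CountVotes(XmlDocument feed)
      {
          var votes = new Dictionary<string, int>(StringComparer.OrdinalIgnoreCase);
          var ns = new XmlNamespaceManager(feed.NameTable);
          ns.AddNamespace("atom", "http://www.w3.org/2005/Atom");

          foreach (XmlNode entry in feed.SelectNodes("//atom:entry", ns))
          {
              string[] words = entry.SelectSingleNode("atom:title", ns).InnerText.Split(' ');
              int i = Array.FindIndex(words, w => w.Equals("vote", StringComparison.OrdinalIgnoreCase));
              if (i >= 0 && i + 1 < words.Length)
              {
                  string candidate = words[i + 1];
                  votes[candidate] = votes.ContainsKey(candidate) ? votes[candidate] + 1 : 1;
              }
          }
          return votes;
      }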

    Read the article

  • Default or Fink Python and lxml under 10.6.8

    - by songdogtech
    Ah, confusion. I'm trying to install a Python library called lxml, as needed by a Python script. I've been through numerous SU questions and answers but haven't been able to make much progress. I run easy_install lxml and get:

    Processing lxml-3.0.1-py2.6-macosx-10.6-universal.egg
    lxml 3.0.1 is already the active version in easy-install.pth
    Using /Library/Python/2.6/site-packages/lxml-3.0.1-py2.6-macosx-10.6-universal.egg
    Processing dependencies for lxml
    Finished processing dependencies for lxml

    but when I run my script, I get:

    File "scraper.py", line 3, in <module>
        import lxml.html
    File "/Library/Python/2.6/site-packages/lxml-3.0.1-py2.6-macosx-10.6-universal.egg/lxml/html/__init__.py", line 42, in <module>
        from lxml import etree
    ImportError: dlopen(/Library/Python/2.6/site-packages/lxml-3.0.1-py2.6-macosx-10.6-universal.egg/lxml/etree.so, 2): Symbol not found: _htmlParseChunk
      Referenced from: /Library/Python/2.6/site-packages/lxml-3.0.1-py2.6-macosx-10.6-universal.egg/lxml/etree.so
      Expected in: flat namespace
      in /Library/Python/2.6/site-packages/lxml-3.0.1-py2.6-macosx-10.6-universal.egg/lxml/etree.so

    I think that maybe I'm not using the correct Python install? I've installed Python with Fink, but should I use OS X's Python? This is in my .profile, which points to the Fink install:

    test -r /sw/bin/init.sh && . /sw/bin/init.sh

    echo $PATH gives me:

    /sw/bin:/sw/sbin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin:/usr/X11/bin:/usr/X11R6/bin

    Should I change that to point to Snow Leopard's Python (which is 2.6.1)? In Library/, there are the lxml libraries I need, it appears, as well as requests [screenshot omitted]. And whereis python gives me /usr/bin/python. What do I do? How do I get Python to use these libraries? And which Python?
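    A hedged first step (the commands are standard, but the paths are assumptions about the usual Fink/Apple layouts): confirm which interpreter actually runs the script, and try the failing import under each candidate:

      # Which python does the shell resolve, and what does it search?
      which python
      python -c 'import sys; print(sys.executable); print(sys.path)'

      # Try the import explicitly under each interpreter:
      /usr/bin/python -c 'import lxml.etree'   # Apple's Python 2.6.1
      /sw/bin/python -c 'import lxml.etree'    # Fink's Python, if installed

    The "_htmlParseChunk" failure would suggest the egg was built against a newer libxml2 than the one dlopen finds at run time; that is a guess, but the one-liners above at least show which interpreter is affected.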

    Read the article

  • Restore content database in SharePoint Server 2007

    - by Boris
    I have a site collection set up at a web app running at port 80. I have made a backup of the site collection content DB using the stsadm.exe tool. Now I want to restore that backup as a new content DB of a different site collection, the one set up at the web app running at port 500. I have done the following:

    1. Created a backup
    2. Created a new web app at port 500 (I did not create a site collection for this web app)
    3. Removed the content DB of that new web app using Central Administration
    4. Ran stsadm.exe -o addcontentdb -url webapp-at-port-500 -databasename

    The command completes successfully, but when I check the Content Database page for that web app, it says that the Number of Sites is 0! Also, when I try to open http://webapp-at-port-500, I get an error saying that the webpage cannot be found. Could anyone please help me, it's driving me crazy. Thanks.
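    One hedged thing to check (an assumption on my part, not something stated in the post): stsadm -o addcontentdb only attaches an existing content database; it does not import a site-collection backup file. If the backup was made with stsadm -o backup, it has to go through stsadm -o restore against the new web app instead, roughly like this (URLs and the file name are placeholders):

      stsadm.exe -o backup -url http://webapp-at-port-80 -filename sitecoll.bak
      stsadm.exe -o restore -url http://webapp-at-port-500 -filename sitecoll.bak -overwrite

    If the backup really is a SQL-level copy of the content database, then -o addcontentdb should point at that restored database's name, and the Number of Sites would then reflect the site collections inside it.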

    Read the article

  • Cyrus on CentOS with SASL/PAM/LDAP

    - by Oscar
    SASL/PAM/LDAP is driving me crazy... that's what I read a lot when googling for problems in this area, and what I experience myself :-S I'm trying to get Cyrus IMAP working for virtual hosting on CentOS with this authorisation backend, and really don't know what's happening. In saslauthd I configured the LDAP search filter to use, but it looks like PAM completely ignores it. Here's what I do for testing (I've done more tests, but all with similar results):

    [root@testserv ~]# imtest -u [email protected] -a [email protected]
    WARNING: no hostname supplied, assuming localhost
    S: * OK [CAPABILITY IMAP4 IMAP4rev1 LITERAL+ ID STARTTLS] testserv. Cyrus IMAP4 v2.3.7-Invoca-RPM-2.3.7-7.el5_6.4 server ready
    C: C01 CAPABILITY
    S: * CAPABILITY IMAP4 IMAP4rev1 LITERAL+ ID STARTTLS ACL RIGHTS=kxte QUOTA MAILBOX-REFERRALS NAMESPACE UIDPLUS NO_ATOMIC_RENAME UNSELECT CHILDREN MULTIAPPEND BINARY SORT SORT=MODSEQ THREAD=ORDEREDSUBJECT THREAD=REFERENCES ANNOTATEMORE CATENATE CONDSTORE IDLE LISTEXT LIST-SUBSCRIBED X-NETSCAPE URLAUTH
    S: C01 OK Completed
    Please enter your password:
    C: L01 LOGIN [email protected] {6}
    S: + go ahead
    C: <omitted>
    S: L01 NO Login failed: authentication failure
    Authentication failed. generic failure
    Security strength factor: 0
    C: Q01 LOGOUT
    * BYE LOGOUT received
    Q01 OK Completed
    Connection closed.

    The LDAP entry does exist (and so does the mailbox in Cyrus):

    [root@testserv ~]# ldapsearch -WxD cn=Manager,o=mydomain,c=com [email protected]
    Enter LDAP Password:
    # extended LDIF
    #
    # LDAPv3
    # base <> with scope subtree
    # filter: [email protected]
    # requesting: ALL
    #

    # myuser, accounts, testserv.mydomain.com, mydomain, com
    dn: uid=myuser,ou=accounts,dc=testserv.mydomain.com,o=mydomain,c=com
    objectClass: top
    objectClass: person
    objectClass: organizationalPerson
    objectClass: inetOrgPerson
    objectClass: posixAccount
    objectClass: shadowAccount
    uidNumber: 16
    uid: myuser
    gidNumber: 5
    givenName: My
    sn: Name
    mail: [email protected]
    cn: My Name
    userPassword:: dYN5ebB0fXhNRn1pZllhRnJX7Uk=
    shadowLastChange: 15176
    homeDirectory: /dev/null

    # search result
    search: 2
    result: 0 Success

    # numResponses: 2
    # numEntries: 1

    This is what I get in /var/log/messages:

    Aug 2 04:00:11 testserv cyrus/imap[12514]: auxpropfunc error invalid parameter supplied
    Aug 2 04:00:19 testserv saslauthd[5926]: do_auth : auth failure: [[email protected]] [service=imap] [realm=testserv.mydomain.com] [mech=pam] [reason=PAM auth error]

    ... and in /var/adm/auth.log:

    Aug 2 04:00:11 testserv cyrus/imap[12514]: auxpropfunc error invalid parameter supplied
    Aug 2 04:00:11 testserv cyrus/imap[12514]: _sasl_plugin_load failed on sasl_auxprop_plug_init for plugin: ldapdb
    Aug 2 04:00:19 testserv saslauthd[5926]: DEBUG: auth_pam: pam_authenticate failed: User not known to the underlying authentication module
    Aug 2 04:00:19 testserv saslauthd[5926]: do_auth : auth failure: [[email protected]] [service=imap] [realm=testserv.mydomain.com] [mech=pam] [reason=PAM auth error]

    (AFAIK I can ignore the auxprop messages.) And /var/log/slapd.log:

    Aug 2 04:00:19 testserv slapd[5968]: conn=61 fd=27 ACCEPT from IP=127.0.0.1:51403 (IP=0.0.0.0:389)
    Aug 2 04:00:19 testserv slapd[5968]: conn=61 op=0 BIND dn="" method=128
    Aug 2 04:00:19 testserv slapd[5968]: conn=61 op=0 RESULT tag=97 err=0 text=
    Aug 2 04:00:19 testserv slapd[5968]: conn=61 op=1 SRCH base="o=mydomain,c=com" scope=2 deref=0 filter="([email protected])"
    Aug 2 04:00:19 testserv slapd[5968]: conn=61 op=1 SEARCH RESULT tag=101 err=0 nentries=0 text=
    Aug 2 04:00:19 testserv slapd[5968]: conn=61 op=2 UNBIND
    Aug 2 04:00:19 testserv slapd[5968]: conn=61 fd=27 closed

    These are the settings in /etc/imapd.conf:

    sasl_mech_list: PLAIN LOGIN
    sasl_pwcheck_method: saslauthd
    ## sasl_auxprop_plugin: sasldb
    sasl_auto_transition: no

    and my SASL config:

    [root@testserv ~]# cat /etc/sysconfig/saslauthd
    # Directory in which to place saslauthd's listening socket, pid file, and so
    # on. This directory must already exist.
    SOCKETDIR=/var/run/saslauthd

    # Mechanism to use when checking passwords. Run "saslauthd -v" to get a list
    # of which mechanism your installation was compiled with the ablity to use.
    MECH=pam

    # Additional flags to pass to saslauthd on the command line. See saslauthd(8)
    # for the list of accepted flags.
    FLAGS="-c -r -O /etc/saslauthd.conf"

    [root@testserv ~]# cat /etc/saslauthd.conf
    ldap_servers: ldap://127.0.0.1/
    ldap_search_base: dc=%d,o=mydomain,c=com
    ldap_auth_method: bind
    #ldap_filter: (|(uid=%u)((&(mail=%u@%d)(accountStatus=active)))
    ldap_filter: (&(mail=%u@%d)(accountStatus=active))
    ldap_debug: 1
    ldap_version: 3

    The accountStatus=active attribute is not in LDAP yet, but that doesn't make a difference, since I don't see it in the filter; that's not the reason for the failure. The weird thing is, I do get an error when I rename or remove /etc/saslauthd.conf, but when the file exists it seems happily ignored... The filter in slapd.log seems to be taken from /etc/ldap.conf. Apart from some timers, that only contains:

    host 127.0.0.1
    base o=mydomain,c=com
    pam_login_attribute mail

    Commenting out the pam_login_attribute results in this filter in slapd.log: filter="([email protected])". pam.d/imap looks like this:

    [root@testserv ~]# cat /etc/pam.d/imap
    auth required pam_ldap.so debug
    account required pam_ldap.so debug
    #auth sufficient pam_unix.so likeauth nullok
    #auth sufficient pam_ldap.so use_first_pass
    #auth required pam_deny.so
    #account sufficient pam_unix.so
    #account sufficient pam_ldap.so

    The commented-out lines are because I don't have the Cyrus admin user in LDAP; that's a Linux user. That works fine when uncommented, but I still need to play around with that a little, and first I want to get IMAP working. Finally, nsswitch:

    [root@testserv ~]# cat /etc/nsswitch.conf
    #
    # /etc/nsswitch.conf
    #
    # An example Name Service Switch config file. This file should be
    # sorted with the most-used services at the beginning.
    #
    # The entry '[NOTFOUND=return]' means that the search for an
    # entry should stop if the search in the previous entry turned
    # up nothing. Note that if the search failed due to some other reason
    # (like no NIS server responding) then the search continues with the
    # next entry.
    #
    # Legal entries are:
    #
    #   nisplus or nis+     Use NIS+ (NIS version 3)
    #   nis or yp           Use NIS (NIS version 2), also called YP
    #   dns                 Use DNS (Domain Name Service)
    #   files               Use the local files
    #   db                  Use the local database (.db) files
    #   compat              Use NIS on compat mode
    #   hesiod              Use Hesiod for user lookups
    #   [NOTFOUND=return]   Stop searching if not found so far
    #
    # To use db, put the "db" in front of "files" for entries you want to be
    # looked up first in the databases
    #
    # Example:
    #passwd: db files nisplus nis
    #shadow: db files nisplus nis
    #group: db files nisplus nis

    passwd: compat ldap
    group: compat ldap
    shadow: compat ldap
    hosts: files dns
    bootparams: nisplus [NOTFOUND=return] files
    ethers: files
    netmasks: files
    networks: files
    protocols: files
    rpc: files
    services: files
    netgroup: nisplus
    publickey: nisplus
    automount: files nisplus
    aliases: files nisplus

    Any info on where to start looking will be greatly appreciated! Thanks in advance.
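    A hedged reading of the logs above (a guess, not a confirmed fix): with MECH=pam, saslauthd delegates the check to PAM, so pam_ldap builds its filter from /etc/ldap.conf (which is why pam_login_attribute shows up in slapd.log) and /etc/saslauthd.conf is indeed never consulted for the filter. Letting saslauthd query LDAP itself would make the ldap_filter take effect:

      # /etc/sysconfig/saslauthd (sketch, assuming this is the root cause)
      MECH=ldap
      FLAGS="-c -O /etc/saslauthd.conf"

    followed by a saslauthd restart. The -r flag (which joins user and realm into a single %u) may no longer be wanted: with %u@%d already in the filter, keeping -r would leave %d empty.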

    Read the article

  • Menu tab completion for recent history in zsh

    - by dat5h
    I am interested in a potential zle widget for zsh. Is there a way to build a widget that mimics the kill-completion selectable menu? Essentially I want to be able to press tab in vi-command-mode (or maybe use !-tab-completion at the shell) and get a list of recent history, or history related to what is already entered at the command line, that lets me scroll through it and select a relevant command to run or compare similar calls. Looking through the manual, I stumbled onto a similar widget that I have mapped like so:

    # tab completion history menu (vicmd)
    autoload -z history-beginning-search-menu
    zle -N history-beginning-search-menu-space-end history-beginning-search-menu
    bindkey -M vicmd "\t" history-beginning-search-menu-space-end
    # emacs binding could be "\e\t"? (I wouldn't know)

    So if I enter vi-command-mode and hit tab when I have entered something like "grep", I get a list of all grep calls in history. It also asks me for the list number and runs the numbered item in history. If I enter a space and then try this, it lists ALL of my history. This is fairly close to what I want, but there are some problems:

    1. It prints the entire list of relevant history without checking the number of lines on the screen, so it can easily blow past the space in the terminal.
    2. When I type numbers to select an item in history, it does not echo the numbers I type, so I may make a mistake and have to start over.
    3. I would love to be able to hook in appearance tweaks.

    I was wondering if a more updated version of this widget exists, or if there is any way to look at the source for kill-completion or history-beginning-search-menu to see if I could think of a way to do it.
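    As a hedged sketch of attacking problem 1 only (nothing official, just a starting point): a custom widget can cap its own output at the terminal height, for example by pushing matching history through tail before displaying it with zle -M:

      # Show history lines matching the current buffer, capped to the screen height.
      history-search-menu() {
        local -a matches
        matches=(${(f)"$(fc -ln 1 | grep -F -- "$BUFFER" | tail -n $(( LINES - 2 )))"})
        zle -M "${(F)matches}"
      }
      zle -N history-search-menu
      bindkey -M vicmd '\t' history-search-menu

    This only displays the list; selection and appearance hooks would still have to be built on top, but it shows the shape a screen-aware replacement could take.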

    Read the article

  • My .tpl file won't update!

    - by Kyle Sevenoaks
    Hi, I am running the site at www.euroworker.no; it's a Linux server, and the site has a backend editor. It's a Smarty/PHP site, and when I try to update a few of the .tpl files (two or three) they don't update. I have tried uploading through FTP, and that doesn't work either. I have no knowledge of how servers work or anything; please help! It runs on the LiveCart system. Thanks!

    Read the article

  • What are the pros (and cons) of using “Sign in with Twitter/Facebook” for a new website?

    - by Paul D. Waite
    Myself and a friend are looking to launch a little forum site. I'm considering using the "Sign in with Facebook/Twitter" APIs, possibly exclusively (à la Lanyrd), for user login. I haven't used either of these before, nor run a site with user logins at all. What are the pros (and cons) of these APIs? Specifically:

    - What benefits do I get as a developer from using them?
    - What drawbacks are there?
    - Do end users actually like/dislike them?
    - Have you experienced any technical/logistical issues with these APIs specifically?

    Here are the pros and cons I've got so far:

    Pros
    - More convenient for the user ("register" with two clicks, sign in with one)
    - Possibly no need to maintain our own login system

    Cons
    - No control over our login process
    - Excludes Facebook/Twitter users who are worried about us having some sort of access to their accounts
    - Users' accounts on our site are compromised if their Facebook/Twitter accounts are compromised

    And if we don't maintain our own alternative login system:
    - Dependency on Facebook/Twitter for our login system
    - Excludes non-Facebook/non-Twitter users from our site

    Read the article

  • Rewrite rule to redirect subdomains to subdirectories in IIS7?

    - by Mark Rogers
    Can someone give me an example rule, or process to create a rewrite rule, to redirect a subdomain to a subdirectory in IIS 7? Basically I want to rewrite http://s1.site.com to http://site.com/s1/, so that the user will see s1.site.com but will actually be hitting site.com/s1/. Specifically I'm a rewrite rookie and I want to be able to fill out the pattern, conditions, and action of the rewrite rule (basically everything) in IIS 7. If anyone could give me a good pattern string, and a Rewrite URL string, that would solve my problem immediately.
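    A hedged sketch of what such a rule could look like in web.config, assuming the IIS URL Rewrite module is installed (host and folder names are the ones from the question; treat this as a starting point, not a tested rule):

      <rewrite>
        <rules>
          <rule name="s1 subdomain to /s1 folder" stopProcessing="true">
            <!-- pattern: capture the requested path, unless it is already under /s1/ -->
            <match url="^(?!s1/)(.*)" />
            <conditions>
              <!-- condition: only requests that arrived at the s1 host -->
              <add input="{HTTP_HOST}" pattern="^s1\.site\.com$" />
            </conditions>
            <!-- action: Rewrite (not Redirect) keeps s1.site.com in the address bar -->
            <action type="Rewrite" url="/s1/{R:1}" />
          </rule>
        </rules>
      </rewrite>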

    Read the article

  • Revamped Google Webmaster Tools

    With a positive surprise, I realized today that Google's Webmaster Tools has had some minor overhauling and provides more details than before. Most obvious are the changes on the dashboard, where the Top Search Queries now provide information about impressions and clickthroughs instead of the rankings shown before. Only the links of the search expressions are missing. It seems that the Top Search Queries were the focus of this update. The section is now spiced with detailed graphs about what happened during selectable periods on your site. Well, it seems that Webmaster Tools now mimics a stripped-down version of Google Analytics... I was very pleased by the details that are offered when you click on a single query term. It is really nice to see the search rankings and your responsible URLs at the same time; before, you had to put two browser instances side by side to achieve this kind of overview. Personally, I like the approach of visualizing statistics the way Google and other providers do. It gives you a quick and informative overview, and enables you to dig further into details about peaks and lows in your visits, page impressions, or clickthroughs.

    Read the article

  • Looking for the best EC2 setup for 3 sites totaling 1.5 million hits monthly

    - by john h.
    I am looking to consolidate our current AWS setup of two large Ubuntu EC2 servers and two large RDS servers for our three websites, which get a total of about 1.5 million hits a month, increasing every month, with the majority of the traffic (1 million) going to one forum site in the group and the rest to an e-commerce site and a small WordPress site. So here is my question/thought: Would it be better for us to combine the two large EC2 servers into just one, and the same with the two RDS servers, so we run all three sites off one large EC2 instance and one RDS instance?

    -or- Should we set up maybe 2-3 smaller EC2 servers, load balanced, and a single RDS?

    -or- Something completely different?

    One concern is that if one site crashes, it takes the others with it. That happened in the past, but I am pretty sure it was because of the forum software and not the server setup. -john

    Read the article

  • How do you find all the links to disavow for a Google reconsideration request? [duplicate]

    - by QF_Developer
    This question already has an answer here: "How to identify spammy domains giving backlinks to my site (to submit in disavow links in WMT)" (2 answers).

    A few months ago I received the following notification in Google Webmaster Tools for a website I look after:

    Unnatural links to your site—impacts links
    Google has detected a pattern of unnatural artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole. Learn more.

    The question here is: should we actively attempt to disavow these links, given that the action is seemingly targeted at just a bunch of keywords? I've downloaded the inbound links sample from Google Webmaster Tools, and so far I've been through the disavow and reconsideration request process six times, each taking 2-3 weeks, only to be supplied just two more links that Google doesn't approve of. At this rate it will take me the rest of my natural life to clean up all these spammy links! Disavowing seems futile, as Google hasn't taken broad action against the website as a whole and (from what I can gather) has already nullified the value of those offending links. Under the quoted statement above, however, is a reconsideration request button that seems to imply I should be actively doing something here?

    UPDATE 14th October: I have since created a small .NET application that you can feed the CSV sample links file from Google Webmaster Tools. What this tool does is crawl all the links and look for specific linking patterns, as per some configurable match strings. I realised that many of the links that Google is taking issue with were created by a rogue SEO firm we hired several years ago. All the links are appended with one of five different descriptions. The application I built uses some regexes to isolate any link sources with these matching appendages and automatically builds the disavow txt file. In the end it had to come down to an algorithm, as manually disavowing links on this scale would take weeks! I will post the app here once I've cleaned it up.
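    As a hedged sketch of the kind of tool described (not the poster's actual app; the file names, CSV layout, and footprint strings are all assumptions), the crawl-match-disavow step could look like this:

      using System;
      using System.IO;
      using System.Linq;

      // Read the Webmaster Tools links CSV (one source URL in the first column),
      // keep the URLs whose pages carry the rogue firm's footprint text, and
      // write the unique hosts in Google's "domain:" disavow syntax.
      class DisavowBuilder
      {
          static readonly string[] Footprints = { "footprint one", "footprint two" }; // placeholders

          static void Main()
          {
              var client = new System.Net.WebClient();
              var domains = File.ReadLines("links.csv")
                  .Skip(1)                                   // header row
                  .Select(line => line.Split(',')[0].Trim())
                  .Where(url => PageHasFootprint(client, url))
                  .Select(url => new Uri(url).Host)
                  .Distinct(StringComparer.OrdinalIgnoreCase);

              File.WriteAllLines("disavow.txt", domains.Select(d => "domain:" + d));
          }

          static bool PageHasFootprint(System.Net.WebClient client, string url)
          {
              try
              {
                  string html = client.DownloadString(url); // "crawl" each linking page
                  return Footprints.Any(f => html.IndexOf(f, StringComparison.OrdinalIgnoreCase) >= 0);
              }
              catch { return false; }                       // dead pages can't be checked
          }
      }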

    Read the article

  • How to use IIS 6.0 redirects?

    - by payling
    I'm attempting to use the reference below to create a redirect for my local site, with no luck:

    http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/6b855a7a-0884-4508-ba95-079f38c77017.mspx?mfr=true

    I want absolute links on my local site that point online to point to my local site instead. For example, when I click the local version of the absolute link http://online.com/products, I'd like it to redirect to http://offline/products. I want to preserve everything after the domain name and append it to the server (local) name, so that when I click a link it redirects to the local site and not the online version. I've tried http://offline$S, but that doesn't append the suffix /products the way I thought it should. What's going on here?

    Read the article

  • MacPorts pHash not showing up in Python

    - by Nitzan Wilnai
    I am having a problem where Python does not show pHash installed, even though I installed it using MacPorts. I made sure I am using the MacPorts version of Python by doing:

    sudo port select --set python python27

    I then installed pHash by doing sudo port install pHash. It installed without any errors, but when I call help('modules'), I do not see pHash listed among the installed packages. Any ideas on why Python is not seeing the pHash installed by MacPorts? Calling port select --list python shows the following:

    Available versions for python:
        none
        python25-apple
        python26-apple
        python27 (active)
        python27-apple

    Printing out sys.path outputs the following (reformatted to make it easier to read here):

    ['/Library/Python/2.7/site-packages/boto-2.9.9-py2.7.egg',
     '/Library/Python/2.7/site-packages/setuptools-0.9.8-py2.7.egg',
     '/Library/Python/2.7/site-packages/pip-1.4.1-py2.7.egg',
     '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python27.zip',
     '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7',
     '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/plat-darwin',
     '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/plat-mac',
     '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/plat-mac/lib-scriptpackages',
     '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-tk',
     '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-old',
     '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload',
     '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages',
     '/Library/Python/2.7/site-packages']

    Can anyone help? Thanks.
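    A hedged diagnostic (the commands are standard MacPorts, but whether the pHash port ships Python bindings at all is exactly the open question): list what the port actually installed and look for anything landing in a site-packages directory:

      # Everything the pHash port installed; no site-packages path here would
      # mean the port only builds the C/C++ library, with no Python module.
      port contents pHash | grep -i -e python -e site-packages

      # What the active MacPorts Python actually sees:
      ls /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages

    If nothing Python-related shows up, the bindings would have to come from elsewhere (for example, building pHash's Python module by hand), since help('modules') can only list what is on sys.path.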

    Read the article

  • DNS: forward part of a managed domain to one host, but sub domain services to another (Google Apps)

    - by Paul Zee
    I was going to post this as a comment against "DNS: Forward domain to another host", but I don't seem able to do that. I'm in a similar situation. I have a domain registered/managed by enom, except with the slight twist that the domain was originally registered with enom through a Google Apps account creation. The domain currently supports a Google Apps site/account. I now want to direct the bare primary domain and www entries to a hosting provider for the website component, but leave the Google Apps setup intact for its services such as calendar, mail, etc. For now, I'm leaving the domain managed by enom. Also note that when I registered my account with the hosting provider, I gave the same domain name as the existing domain (e.g. example.com), so at their end I'm working with the same domain name in cPanel, etc. In my case, the existing enom DNS entries don't have an A record for www.example.com or the bare example.com domain. Instead, there are 4 @ records with the Google Apps IP address, 2 TXT records with what I assume are Google Apps site verification strings/tokens, and a bunch of CNAME records for the various features of Google Apps (mail, calendar, docs, sites, etc.). So, my questions:

    1. How do I point the www.example.com and example.com DNS entries at enom to my web site hosting provider, while leaving the domain managed by enom and the Google Apps services working as they are now (with the obvious exception of Google Sites)?

    2. How do I set up the example.com mail-related DNS records (MX, etc.) at the web site hosting provider, so that outbound email to [email protected] gets correctly sent to the Google Apps mail account and doesn't get trapped inside the pseudo-domain within the hosting provider's servers?
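    For question 1, a hedged sketch of the record set this usually ends up as (the IP is a placeholder; the Google-created records should be left exactly as they are):

      @      A      203.0.113.10    ; hosting provider's web server IP
      www    CNAME  example.com.    ; or a second A record to the same IP
      ; leave the existing Google Apps MX, TXT verification, and service
      ; CNAMEs (mail, calendar, docs, ...) untouched so those keep working

    For question 2, the usual answer is on the hosting provider's side rather than in DNS: in cPanel, set example.com's email routing to a remote (not local) mail exchanger, so the provider's mail server stops treating example.com as local and hands outbound mail to the domain's MX records (i.e. Google) instead. This is a general pattern, not something confirmed for this particular host.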

    Read the article

  • Choosing between a dedicated and virtual dedicated server for my startup

    - by MarathonStudios
    I'm about to launch a startup site I've been working on for some time, and I'm just now looking over hosting plans. The site's main feature is fairly processor-heavy (a lot of text processing), so I probably need something other than shared hosting to ensure I don't get shut down for overusing resources. I would like to spend as little as possible on hosting until the site starts generating income, so under $60/mo is my goal. One caveat is that I need a Windows box for this particular site, so it's harder to get a good deal. For that price, I can either get a bottom-tier dedicated server (2 GB RAM, Pentium 4) or, for a few bucks more per month, a middle-tier VPS (3 GB RAM, a bit more traffic and HD). I had a bad experience with a low-end VPS a few months ago, so making sure that whatever I get can handle the basic traffic of a website, as well as giving me what I need (extra processing power), is essential. Do you have any suggestions as to which way to go, or a certain hosting company you've worked with that you can recommend?

    Read the article

  • Partner Spotlight

    - by rituchhibber
    FADATA

    Fadata officially became a WebCenter Content Specialized partner in the Adriatic region upon the successful completion of the corresponding Oracle specialization tests. "Being recognized by Oracle and customers will greatly help our company in maintaining a high level of implementation services related to Oracle WebCenter Content, as one of the strategic products we are focused on. This certification, which our team is very proud of, will certainly help our company FADATA gain additional advantage, competitiveness, and integrity in implementing Oracle WebCenter Content solutions, both on current and future projects in the region," according to Velimir Corovic and Marjan Nikolic from Fadata (www.fadata.bg).

    FISHBOWL SOLUTIONS

    Google Search Appliance Connector for Oracle WebCenter Content: The Google Search Appliance (GSA) provides fast search for your intranet or website. Fishbowl Solutions provides a connector for the Google Search Appliance that integrates it with the Oracle WebCenter Content Server while retaining the security benefits of Oracle WebCenter Content. For more information and a real customer example, see the Fairview Health Services case study or the webinar recording.

    SIGNUM

    Signum TTE, from Turkey and a Gold member of Oracle PartnerNetwork (OPN), recently announced it has achieved OPN Specialized status for Oracle WebCenter Portal. Signum TTE, which began operations in the IT sector in 2005, is an innovative software solution house focusing on two main areas: services and solutions on Oracle Middleware products and technologies, and its own product, WinDesk Service Management.

    Read the article

  • Triangulation A* (TA*) pathfinding algorithm

    - by hyn
    I need help understanding the Triangulation A* (TA*) algorithm described by Demyen in his paper "Efficient Triangulation-Based Pathfinding", on pages 76-81. He describes how to adapt the regular A* algorithm for triangulation, to search for other, possibly more optimal paths even after the final node is reached/expanded. Regular A* stops when the final node is expanded, but in a triangulated graph the first path found is not always the best one. This is exactly the problem I'm having; it is illustrated on page 78, Figure 5.4 (figure not reproduced here). I understand how to calculate the g and h values presented in the paper (page 80). And I think the search stop condition is:

    if (currentNode.fCost > shortestDistanceFound)
    {
        break; // stop
    }

    where currentNode is the search node popped from the open list (priority queue), which has the lowest f-score, and shortestDistanceFound is the actual distance of the shortest path found so far. But how do I exclude the previously found paths from future searches? Because if I do the search again, it will obviously find the same path. Do I reset the closed list? I need to modify something, but I don't know what it is I need to change. The paper lacks pseudocode, so that would be helpful.
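    A hedged sketch of how that stop condition usually plays out (my reading of the idea, not Demyen's code; the graph representation and names are made up): rather than re-running the search and excluding old paths, a single search keeps going past the goal, records each goal arrival's true length, and ends only when the cheapest remaining f-cost cannot beat it. No closed list is used here, which is one answer to the reset question: nodes may simply be expanded more than once.

      using System;
      using System.Collections.Generic;
      using System.Linq;

      static class AnytimeAStar
      {
          class Node { public int Id; public double G, H; public Node Parent; public double F { get { return G + H; } } }

          // neighbors(id) yields (neighbor, edge cost); the heuristic must be
          // admissible, so F is a lower bound on any path through the node.
          public static double Search(int start, int goal,
                                      Func<int, IEnumerable<Tuple<int, double>>> neighbors,
                                      Func<int, double> heuristic)
          {
              var open = new List<Node> { new Node { Id = start, G = 0, H = heuristic(start) } };
              double shortestDistanceFound = double.PositiveInfinity;

              while (open.Count > 0)
              {
                  Node currentNode = open.OrderBy(n => n.F).First(); // use a heap in real code
                  open.Remove(currentNode);

                  if (currentNode.F >= shortestDistanceFound)
                      break; // nothing left can be shorter: done

                  if (currentNode.Id == goal)
                  {
                      // TA* would measure the true channel length here (funnel
                      // algorithm); plain G stands in for it in this sketch.
                      shortestDistanceFound = Math.Min(shortestDistanceFound, currentNode.G);
                      continue; // keep searching instead of returning immediately
                  }

                  foreach (var edge in neighbors(currentNode.Id))
                      open.Add(new Node { Id = edge.Item1, G = currentNode.G + edge.Item2,
                                          H = heuristic(edge.Item1), Parent = currentNode });
              }
              return shortestDistanceFound;
          }
      }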

    Read the article

  • Is meta description still relevant?

    - by Jeff Atwood
    I received this bit of advice about the meta description tag recently:

    "Meta descriptions are used by Google probably 80% of the time for the snippet. They don't help with rankings, but you should probably use them. You could just auto-generate them from the first part of the question."

    The description tag lives in the header, like so:

    <meta name="Description" content="A brief summary of the content on the page.">

    I'm not sure why we would need this field, as Google seems perfectly capable of showing the relevant search terms in context in the search result pages. For example, when I searched for "c# list performance", the result snippets showed the matching terms in context (screenshot omitted). In other words, where would a meta description summary improve these results? We want the page to show context around the actual search hits, not a random summary we inserted! Google Webmaster Central has this advice:

    "For some sites, like news media sources, generating an accurate and unique description for each page is easy: since each article is hand-written, it takes minimal effort to also add a one-sentence description. For larger database-driven sites, like product aggregators, hand-written descriptions are more difficult. In the latter case, though, programmatic generation of the descriptions can be appropriate and is encouraged -- just make sure that your descriptions are not 'spammy.' Good descriptions are human-readable and diverse, as we talked about in the first point above. The page-specific data we mentioned in the second point is a good candidate for programmatic generation."

    I'm struggling to think of any scenario where I would want the Google-generated summary, that is, actual context from the page for the search terms, to be replaced by a hard-coded meta description summary of the question itself.
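    For what the quoted advice calls auto-generating "from the first part of the question", a hedged sketch (the 160-character budget and the word-boundary trim are assumptions about what a sensible generator would do, not anything Google prescribes):

      using System.Net;
      using System.Text.RegularExpressions;

      static string MetaDescription(string questionHtml)
      {
          string text = Regex.Replace(questionHtml, "<[^>]+>", " "); // strip tags
          text = WebUtility.HtmlDecode(text);
          text = Regex.Replace(text, @"\s+", " ").Trim();            // collapse whitespace

          const int max = 160;                                       // typical snippet budget
          if (text.Length <= max) return text;
          int cut = text.LastIndexOf(' ', max);                      // trim at a word boundary
          return text.Substring(0, cut > 0 ? cut : max) + "...";
      }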

    Read the article

  • How to diagnose "Internet explorer cannot display the webpage"

    - by Colen
    Our web site is working great for 99.99% of our users, but a few people (all of whom use Internet Explorer) are running into an error. Most pages on the site load fine, but for one specific page (the same page for all affected users), all they get is:

    Internet Explorer cannot display the webpage

    It doesn't matter whether the page is accessed over HTTP or HTTPS; it fails to load either way. Every other page on the site, as far as I can tell, works fine for them. Not only that, the same users can load that specific page fine in Firefox. I've checked the web server logs and I can't find any smoking guns there. The site is running IIS on Windows Server 2003. Is there any way to get IE to give the user more information than just "cannot display the webpage"? There's a "More information" button, but all it tells you is to make sure that your DNS servers are working, make sure you're not working offline, and so on. :(

    Read the article
