Search Results

Search found 21372 results on 855 pages for 'google picasa'.


  • CodePlex Daily Summary for Thursday, April 08, 2010

    CodePlex Daily Summary for Thursday, April 08, 2010

    New Projects:
    BackUpAnyWhere: BackUpAnyWhere
    CustomFormbyEndUser: In project development, different users often have different requirements for the same report, and sometimes a user needs to build a report from scratch; previously this was typically done with third-party development tools. With Reporting Services in SQL Server 2005, end users can edit reports themselves without writing code, as long as they understand the data structure. This example uses Adventur...
    DbExecutor - linq based database executor: IEnumerable based database reader. (linq like primitive sql executor)
    DeepZoomRenderingPack: A collection of libraries and plug-ins architecture that turns various files (like PDF, PS, etc.) into a "Visual" representation that the DeepZoom ...
    DotNetNuke Russian Language packs: DNNRussianLP - DotNetNuke Russian Language pack.
    F# Refactor: Designed to bring Code Refactoring capabilities to the F# Language in Visual Studio 2010.
    Invocando WebService e Site HTTP dinamicamente com HTTPWebRequest C#: Invoking an HTTP site and a web service dynamically with HTTPWebRequest, passing the SoapAction and XML envelope. Written in C#. www.biztalkbrasil.c...
    Jitbit WYSWYG BBCode Editor: "Jitbit WYSIWYG-BBCode" is a browser-based JavaScript-powered WYSIWYG BBCode editor
    MRDS Services for Phidgets: MRDS (Microsoft Robotics Developer Studio) Services for Phidgets provides additional services for Phidgets sensors and controllers that are not inc...
    MSBuild Addin: This tool is a simple addin for VisualStudio 2008 used in association with Microsoft MSBuild. It allows you to run MSBuild directly inside Visual S...
    NISHIL-BizTalk Custom Eventlog Functiod: While testing our maps, at times when a map fails we can't trace it because we don't know what the output of the functoids is. Normally in a single ma...
    Northest GNSG: Supinfo B3C Paris Northest University project. Galego, Neveu, Simon, Geissmann.
    Oily: Composite application project for oil parameters. It's developed in C#
    Outlook.Utility: The MSDN article Outlook Customization for Integrating with Enterprise Applications at http://msdn.microsoft.com/en-us/library/Aa479345 has quite a...
    Particle Plot Pivot: Scan select particle physics experiment web sites for plots and generate a Pivot display for easy browsing.
    project tca: project tca - translating chat application.
    Satisfyr: A new way of performing assertions on tests so that they remain agnostic to the underlying test framework, and leverage .NET built-in lambda syntax.
    sejce2008: jce se course wiki and projects links
    SGB Controls: SGB Controls is a set of standard .net controls that include a number of enhancements to make life easier for the developer. These controls incl...
    Syringe: Syringe is a lightweight service container and dependency injection library designed for use with ASP.NET MVC2. Supported features: Dependency inj...
    topicbox: topicbox
    Ur-Index: Ur-Index makes it a lot easier to create onomastic indexes for books in pdf format.
    VietGeeks ZohoDocApis: A .NET Zoho Document APIs library to help developers integrate Zoho Docs easily with their websites
    Webometrics Dashboard: Webometrics Dashboard
    webpress: It is a WebBased CMS and Blog platform.
    WPF Ink Canvas Toolbar: WPF Ink Canvas Toolbar makes it easy for WPF developers to use pen input in TabletPC or UMPC applications. The WPF InkCanvas control has drawing, e...
    WS-TMS: WS GISG HTT TMS

    New Releases:
    BatterySaver: Version 1.0: Fixed battery increase/decrease events not firing. Fixed memory corruption error. Added working set trimming (used very sparingly). Fixed poorly rende...
    Chargify.NET: Chargify.NET 0.65: Added in Transactions, Subscription Re-activation, and finally XML documentation (which has been missing in the previous releases).
    DbExecutor - linq based database executor: DbExecutor ver.1.0.0.1: rename
    DotNetNuke Russian Language packs: Russian Language Pack for DotNetNuke 04.09.02
    Encrypted Notes: Encrypted Notes 1.6.3: This is the latest version of Encrypted Notes (1.6.3), with general improvements. It has an installer that will create a directory 'CPascoe' in My ...
    Invocando WebService e Site HTTP dinamicamente com HTTPWebRequest C#: Código projeto CallSiteHTTP: Code written in C# (.NET 2.0, VS2005). Contains: complete solution (code and executable), configuration XML - Config.xml ...
    Jitbit WYSWYG BBCode Editor: Main package: Contains the JS-file, CSS-file and a sample.
    Live Writer Picasa Plugin: Live Writer Picasa Plugin 1.1.0: Changelog: Communication with Picasa Web Albums is done directly via HTTP now (v1.0.0 used Google's GData .NET Libraries). The plugin can search fo...
    MRDS Services for Phidgets: Phidgets for RDS 2008 R3: First Beta Release. This ZIP file contains a web page called Readme_CodePlex.html that explains how to install the RDS Phidgets services for RDS 200...
    MSBuild Addin: MsBuildAddin-v1.0.0: Initial version
    MSBuild Addin: MsBuildAddin-v1.0.0-src.zip: Initial version
    Outlook.Utility: Outlook.Utility v1: I have used most of the code in previous projects and it seems to be quite stable. Of course the point of open sourcing this is so this project is use...
    Scrum Dashboard: Scrum Dashboard v3 Alpha 1: Scrum Dashboard v3 is targeting .NET 4, TFS 2010 and the brand new Scrum for Team System v3 process templates. Most of the code has been rewritten ...
    SharePoint Labs: SPLab4004A-FRA-Level100: This SharePoint Lab will teach you the 4th best practice you should apply when writing code with the SharePoint API. Lab La...
    SharePoint Labs: SPLab5012A-FRA-Level100: This SharePoint Lab will teach you how to provision a new welcome page (how to change and rename the default.aspx page) on ...
    Shweet: SharePoint 2010 Team Messaging built with Pex: Shweet Source Code: Although the latest version of Pex and Moles used with this project is not available, we thought it would be useful to provide a download to the source.
    Syringe: Syringe 1.0: Features: dependency injection on properties of services in container; dependency injection on constructors of services in container; ASP.Net Mvc ...
    Text to HTML: 0.4.1.0: Version changes: optimized the export code, reducing its size; changed the export icon; added a Select menu t...
    VsTortoise - a TortoiseSVN add-in for Microsoft Visual Studio: VsTortoise Build 23: Build 23 Fix: Executing "Blame" through the Solution Explorer on a file opens TortoiseMerge rather than TortoiseBlame. Build 22 (beta) New: Visua...
    WPF Ink Canvas Toolbar: WPF Ink Canvas Toolbar 1.0: First release - included custom colour selection

    Most Popular Projects: Rawr; WBFS Manager; Microsoft SQL Server Product Samples: Database; ASP.NET Ajax Library; Silverlight Toolkit; AJAX Control Toolkit; Windows Presentation Foundation (WPF); ASP.NET; Microsoft SQL Server Community & Samples; Facebook Developer Toolkit

    Most Active Projects: Graffiti CMS; nopCommerce. Open Source online shop e-commerce solution.; Rawr; Shweet: SharePoint 2010 Team Messaging built with Pex; patterns & practices – Enterprise Library; Acadsys; AutoPoco; Ionics Isapi Rewrite Filter; Ncqrs Framework - The CQRS framework for .NET; Farseer Physics Engine

    Read the article

  • Pros/Cons of switching from Exchange to GMail

    - by Brent
    We are a medium-large non-profit company, with around 1000 staff and volunteers, and have been using MS Exchange (currently 2003) for our mail system for years. I recently attended a Google conference where they were positing that "Cloud computing is the way of the future" and encouraging us to switch from doing our own email with Exchange to using GMail and Google Apps for everything. Additionally, one of our departments has been pushing from inside to do this transition within their own department, if not throughout the entire organization. I can definitely see some benefits, such as: Archive space - we never seem to have the space our users want, and of course, the more we get, the more we have to back up. OS agnosticism - Exchange is definitely built for Windows, and with Mac and Linux users on the rise, these users increasingly demand better tools / support; Google offers this. Better archiving - potential for e-discovery that doesn't exist in a practical way with our current setup. Switching would relieve us of a fair bit of server administration, give more options to our end users, and free up the server resources we are now using for Exchange. Our IT department wants to be perceived as providing up-to-date solutions to technical problems, and this change would definitely provide such an image. Google's infrastructure is obviously much more robust than ours, and they employ some of the world's best security and network experts. However, there are also some serious drawbacks: We would essentially be outsourcing one of our mission-critical systems to a 3rd party. The switch would inevitably involve Google Apps and perhaps more as well; that means we would have a lot more at the mercy of a single (potentially weak) password. (Is there a way to make this more secure using a password plus a physical key of some sort?) Our data would not remain under our roof - or even in our country (Canada). This obviously has pluses on the disaster recovery side, but I think there are potential negatives on the legal side. I can't imagine that somebody as large as Google would be as responsive as we would want with regard to non-critical issues such as tracing missing emails, etc. (not sure how much access we would have to basic mail logs, for instance). Can anyone help me evaluate this decision? What issues am I overlooking? What experiences have you had with this transition (or the opposite - GMail to Exchange)? Can you add to the points I have already outlined?

    Read the article

  • Is there a tool that can test what SSL/TLS cipher suites a particular website offers?

    - by Jeremy Powell
    Is there a tool that can test what SSL/TLS cipher suites a particular website offers? I've tried openssl, but if you examine the output: $ echo -n | openssl s_client -connect www.google.com:443 CONNECTED(00000003) depth=1 /C=ZA/O=Thawte Consulting (Pty) Ltd./CN=Thawte SGC CA verify error:num=20:unable to get local issuer certificate verify return:0 --- Certificate chain 0 s:/C=US/ST=California/L=Mountain View/O=Google Inc/CN=www.google.com i:/C=ZA/O=Thawte Consulting (Pty) Ltd./CN=Thawte SGC CA 1 s:/C=ZA/O=Thawte Consulting (Pty) Ltd./CN=Thawte SGC CA i:/C=US/O=VeriSign, Inc./OU=Class 3 Public Primary Certification Authority --- Server certificate -----BEGIN CERTIFICATE----- MIIDITCCAoqgAwIBAgIQL9+89q6RUm0PmqPfQDQ+mjANBgkqhkiG9w0BAQUFADBM MQswCQYDVQQGEwJaQTElMCMGA1UEChMcVGhhd3RlIENvbnN1bHRpbmcgKFB0eSkg THRkLjEWMBQGA1UEAxMNVGhhd3RlIFNHQyBDQTAeFw0wOTEyMTgwMDAwMDBaFw0x MTEyMTgyMzU5NTlaMGgxCzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpDYWxpZm9ybmlh MRYwFAYDVQQHFA1Nb3VudGFpbiBWaWV3MRMwEQYDVQQKFApHb29nbGUgSW5jMRcw FQYDVQQDFA53d3cuZ29vZ2xlLmNvbTCBnzANBgkqhkiG9w0BAQEFAAOBjQAwgYkC gYEA6PmGD5D6htffvXImttdEAoN4c9kCKO+IRTn7EOh8rqk41XXGOOsKFQebg+jN gtXj9xVoRaELGYW84u+E593y17iYwqG7tcFR39SDAqc9BkJb4SLD3muFXxzW2k6L 05vuuWciKh0R73mkszeK9P4Y/bz5RiNQl/Os/CRGK1w7t0UCAwEAAaOB5zCB5DAM BgNVHRMBAf8EAjAAMDYGA1UdHwQvMC0wK6ApoCeGJWh0dHA6Ly9jcmwudGhhd3Rl LmNvbS9UaGF3dGVTR0NDQS5jcmwwKAYDVR0lBCEwHwYIKwYBBQUHAwEGCCsGAQUF BwMCBglghkgBhvhCBAEwcgYIKwYBBQUHAQEEZjBkMCIGCCsGAQUFBzABhhZodHRw Oi8vb2NzcC50aGF3dGUuY29tMD4GCCsGAQUFBzAChjJodHRwOi8vd3d3LnRoYXd0 ZS5jb20vcmVwb3NpdG9yeS9UaGF3dGVfU0dDX0NBLmNydDANBgkqhkiG9w0BAQUF AAOBgQCfQ89bxFApsb/isJr/aiEdLRLDLE5a+RLizrmCUi3nHX4adpaQedEkUjh5 u2ONgJd8IyAPkU0Wueru9G2Jysa9zCRo1kNbzipYvzwY4OA8Ys+WAi0oR1A04Se6 z5nRUP8pJcA2NhUzUnC+MY+f6H/nEQyNv4SgQhqAibAxWEEHXw== -----END CERTIFICATE----- subject=/C=US/ST=California/L=Mountain View/O=Google Inc/CN=www.google.com issuer=/C=ZA/O=Thawte Consulting (Pty) Ltd./CN=Thawte SGC CA --- No client certificate CA names sent --- SSL handshake has read 1777 bytes and written 316 bytes --- New, TLSv1/SSLv3, Cipher is AES256-SHA Server public key is 1024 bit Compression: NONE Expansion: NONE SSL-Session: Protocol : TLSv1 Cipher : AES256-SHA Session-ID: 748E2B5FEFF9EA065DA2F04A06FBF456502F3E64DF1B4FF054F54817C473270C Session-ID-ctx: Master-Key: C4284AE7D76421F782A822B3780FA9677A726A25E1258160CA30D346D65C5F4049DA3D10A41F3FA4816DD9606197FAE5 Key-Arg : None Start Time: 1266259321 Timeout : 300 (sec) Verify return code: 20 (unable to get local issuer certificate) --- it just shows that the cipher suite is something with AES256-SHA. I know I could grep through the hex dump of the conversation, but I was hoping for something a little more elegant. I would prefer Linux tools, but Windows (or other) would be fine. This question is motivated by the security testing I do for PCI and general penetration testing. Update: GregS points out below that the SSL server picks from the cipher suites of the client. So it seems I would need to test all cipher suites one at a time. I think I can hack something together, but is there a tool that does particularly this?
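
    A quick way to enumerate supported suites without grepping hex dumps is to loop over OpenSSL's own cipher list and attempt a handshake with each suite individually; the sketch below assumes a Unix shell and a reasonably recent openssl binary, and judges acceptance by s_client's exit status, which may need tuning for your version. Dedicated scanners (for example sslscan, or nmap's ssl-enum-ciphers script) do the same thing more thoroughly.

      # Rough sketch: offer one cipher suite at a time and see which ones the server accepts.
      HOST=www.google.com
      PORT=443
      for cipher in $(openssl ciphers 'ALL:eNULL' | tr ':' ' '); do
        if echo -n | openssl s_client -connect "$HOST:$PORT" -cipher "$cipher" >/dev/null 2>&1; then
          echo "Accepted: $cipher"
        else
          echo "Rejected: $cipher"
        fi
      done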

    Read the article

  • Need help merging 2 AHK scripts

    - by Mikey
    i have two functioning scripts that i want to merge into a single AHK File. My problem is that when i combine both scripts, the second script doesnt function or causes an error on script 1. Either way, script 2 ist not functioning at all. Here are some facts: Script 1 = a simple menu script where i want to assign hotkeys to. Script 2 = A small launcher script from a user named Tertius in autohotkey forum. Can someone please look at both codes and help me merge this? The INI File for script 2 looks like this: Keywords.ini npff|Firefox|Firefox gm|Gmail|http://gmail.google.com ;;;;;;;;;;;; BEGIN SCRIPT 2 DetectHiddenWindows, On SetWinDelay, -1 SetKeyDelay, -1 SetBatchLines, -1 GoSub Remin SetTimer, Remin, % 1000 * 60 Loop, read, %A_ScriptDir%\keywords.ini { LineNumber = %A_Index% Loop, parse, A_LoopReadLine, | { if (A_Index == 1) abbrevs%LineNumber% := A_LoopField else if (A_Index == 2) tips%LineNumber% := A_LoopField else if (A_Index == 3) programs%LineNumber% := A_LoopField else if (A_Index == 4) params%LineNumber% := A_LoopField } tosay := abbrevs%LineNumber% } cnt = %LineNumber% Loop { Input, Key, L1 V, % "{LControl}{RControl}{LAlt}{RAlt}{LShift}{RShift}{LWin}{RWin}" . "{AppsKey}{F1}{F2}{F3}{F4}{F5}{F6}{F7}{F8}{F9}{F10}{F11}{F12}{Left}{Right}{Up}{Down}" . "{Home}{End}{PgUp}{PgDn}{Del}{Ins}{BS}{Capslock}{Numlock}{PrintScreen}{Pause}{Escape}" If( ( Asc(Key) = 65 && Asc(Key) <= 90 ) || ( Asc(Key) = 97 && Asc(Key) <= 122 ) ) Word .= Key Else { Word := "" Continue } tipup := false Loop %cnt% { if (Word == abbrevs%A_index%) { tip := tips%A_index% ToolTip %tip% tipup := true } else { if (tipup == false) ToolTip } } } $Tab:: Loop %cnt% { if (Word != "" && Word == abbrevs%A_index%) { Word := "" StringLen, len, abbrevs%A_index% Loop %len% Send {Shift Down}{Left} Send {Shift Up}{BS} ToolTip program := programs%A_index% param := params%A_index% run, %program% %param% return } } Word := "" Send {Tab} Return ~LButton:: ~MButton:: ~RButton:: ~XButton1:: ~XButton2:: Word := "" Tooltip Return Remin: WinMinimize, %A_ScriptFullPath% - AutoHotkey v WinHide, %A_ScriptFullPath% - AutoHotkey v Return ;;;;;;;;;; END SCRIPT 2 ;;;;;;;;;;;;;; BEGIN SCRIPT 1 ;This is a working script that creates a popup menu. ; Create the popup menu by adding some items to it. Menu, MyMenu, Add, FIS 201, MenuHandler Menu, MyMenu, Add ; Add a separator line. Menu, MyMenu, Color, Lime, Single ;Define the Menu Color ; Create another menu destined to become a submenu of the above menu. Menu, Submenu1, Add, Item2, MenuHandler Menu, Submenu1, Add, Item3, MenuHandler Menu, Submenu1, Color, Yellow ;Define the Menu Color ; Create another menu destined to become a submenu of the above menu. Menu, Submenu2, Add, Item1a, MenuHandler Menu, Submenu2, Add, Item2a, MenuHandler Menu, Submenu2, Add, Item3a, MenuHandler Menu, Submenu2, Add, Item4a, MenuHandler Menu, Submenu2, Add, Item5a, MenuHandler Menu, Submenu2, Add, Item6a, MenuHandler Menu, Submenu2, Color, Aqua ;Define the Menu Color ; Create a submenu in the first menu (a right-arrow indicator). When the user selects it, the second menu is displayed. Menu, MyMenu, Add, BKRS 119, :Submenu1 Menu, MyMenu, Add ; Add a separator line below the submenu. Menu, MyMenu, Add, BKRS 201, :Submenu2 Menu, MyMenu, Add ; Add a separator line below the submenu. Menu, MyMenu, Add ; Add a separator line below the submenu. Menu, MyMenu, Add, Google Search, Google ; Add another menu item beneath the submenu. return ; End of script's auto-execute section. Capslock & LButton::Menu, MyMenu, Show ; i.e. 
press the Win-Z hotkey to show the menu. MenuHandler: MsgBox You selected %A_ThisMenuItem% from the menu %A_ThisMenu%. return ;;;;;;;;;;;;;;;;;;;;;;;; ;;;;;;;; Google Search ;;; FORMAT InputBox, OutputVar [, Title, Prompt, HIDE, Width, Height, X, Y, Font, Timeout, Default] Google: InputBox, SearchTerm, Google Search,,,350, 120 if SearchTerm < "" Run http://www.google.de/search?sclient=psy-ab&hl=de&site=&source=hp&q=%SearchTerm%&btnG=Suche return ; Make Window Transparent Space::WinSet, Transparent, 125, A ^!Space UP::WinSet, Transparent, OFF, A return ;;;;;;;;;;; END SCRIPT 1 Help is appreciated. Kind Regards, Mikey

    Read the article

  • Homepage 301 Redirect to SSL Homepage

    - by user33692
    I'm hoping somebody might be able to provide a bit of advice on an issue I am having. I have a site where we implemented a 301 redirect on the homepage from http to https. We have links on the homepage to other parts of the site that are not under SSL (in fact there is only one other page under SSL). When I go to our webmaster account I notice that we are not being provided with any webmaster information (search queries, backlinks) related to our homepage under SSL. I performed a Fetch as Google on the homepage and the information it returned is: HTTP/1.1 301 Moved Permanently Date: Fri, 08 Nov 2013 17:26:24 GMT Server: Apache/2.2.16 (Debian) Location: https://mysite.com/ Vary: Accept-Encoding Content-Encoding: gzip Content-Length: 242 Keep-Alive: timeout=15, max=100 Connection: Keep-Alive Content-Type: text/html; charset=iso-8859-1 <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN"> <html><head> <title>301 Moved Permanently</title> </head><body> <h1>Moved Permanently</h1> <p>The document has moved <a href="https://mysite.com/">here</a>.</p> <hr> <address>Apache/2.2.16 (Debian) Server at mysite.com</address> </body></html> I am worried that Fetch as Google is not getting the correct title tags and meta information from our homepage and that this is hurting our search results. Additionally, I am worried that we need to do something specific with the sitemap to ensure that Google is correctly indexing all our pages and being able to flow from the https to the http without issues. Does anybody have any advice on how we can correctly set this up or be sure that Google is fetching the correct information?
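
    As a first check (a sketch, using the placeholder mysite.com from the question), it can help to reproduce from a shell what a crawler sees: the redirect itself, the final HTTPS response, and whether the title and meta tags are actually served after the hop.

      curl -sI http://mysite.com/     # should show the 301 and Location: https://mysite.com/
      curl -sIL http://mysite.com/    # -L follows the redirect; the last response should be a 200 from the HTTPS page
      curl -sL http://mysite.com/ | grep -iE '<title>|<meta name="description"'   # confirm title/meta are served over HTTPS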

    Read the article

  • How do I do an exact whois search?

    - by brianegge
    When I execute the following whois command on my Ubuntu server, I get all sorts of other domains which contain google.com in the name, but clearly aren't owned by Google. As this appears to be some sort of spam, I won't paste the output here. I'd like to check for exactly the name I typed in. I thought the following would work, but it doesn't. What is the proper way to do an exact match? whois -Hx google.com
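
    The fuzzy matches come from the registry's thin whois server, which treats the query as a substring search. For .com/.net, one option is to query Verisign's whois server directly with its exact-match syntax; this is a sketch, and other TLD registries use different conventions.

      whois -h whois.verisign-grs.com "=google.com"          # "=" asks for an exact match
      whois -h whois.verisign-grs.com "domain google.com"    # restrict the search to domain records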

    Read the article

  • Is this SPF record correct for me?

    - by DT
    I'm completely new to Stack Overflow, so Hi! I need to add an SPF record to my site "main.com" (not the real address) to allow an email publishing company "emailpublishers.com" (not the real address) to send emails on my behalf. However, I'm nervous about adding an SPF record because of the havoc it could wreak if done incorrectly. I use Google Apps. I also use "auxiliary.com" to send mail from "main.com." And, of course, I use "main.com" to send mail as well. "auxiliary.com" doesn't have an SPF record. I used Microsoft's and OpenSPF's wizards to generate the following SPF entry. Does it seem to be correct for me? "v=spf1 a mx ip4:55.55.555.55 mx:alt1.aspmx.l.google.com mx:alt2.aspmx.l.google.com mx:aspmx.l.google.com mx:aspmx2.googlemail.com mx:aspmx3.googlemail.com mx:aspmx4.googlemail.com mx:aspmx5.googlemail.com a:auxiliary.com include:_spf.google.com include:auxiliary.com mx:auxiliary.com include:emailpublishers.com mx:emailpublishers.com ~all" However, my host MediaTemple says in a knowledge base article to use: v=spf1 a:main.com/20 ~all So that added to my confusion. Thanks a lot!
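
    Before publishing, it may be worth checking what each of these names already serves and keeping SPF's 10-DNS-lookup limit in mind, since a record with this many include:/mx:/a: terms can exceed it. A hedged sketch, using the placeholder domains from the question:

      dig +short TXT main.com
      dig +short TXT auxiliary.com
      dig +short TXT emailpublishers.com
      dig +short TXT _spf.google.com    # what the include:_spf.google.com term expands to
      # Each include:, a:, mx: and ptr: mechanism costs a DNS lookup; SPF allows at most 10 in total.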

    Read the article

  • Easy solution to monitoring & blocking connections to non-malicious services, IP's, and tracking companies

    - by binarybunny
    Our family lives in the middle of nowhere, so the only high-speed internet available is Verizon's 3G mobile broadband. We have the highest package available, yet continually go over the 10GB limit and get charged $10 for every 1GB we go over. We run a business from home, so stopping when we hit the limit is not an option. I've found the majority of connections are to Google, Microsoft, Akamai, Facebook, and other web service companies (mainly Google). I know these are harmless connections, but when it costs money for them to monitor our web activity it becomes a serious problem. Here are some things I've done, but I'm sure there's something else that could help before blocking a huge set of IP ranges: stopped using windows (on my machine) use MVPS host file on all computers use firefox on all computers (with don't track me option) ad block plugin on all browsers blocking google updates blocking windows updates block images in browsers (when possible) use comodo (paranoia-level style of blocking..) virus-free computers with ESET NOD32 bought router and installed dd-wrt in attempt to block connections more diligently (and throttle bandwidth if it comes to that) Anything I'm missing? I know Google Analytics is on almost all websites, as well as FB Like buttons, but I would like to be able to stop these connections without blocking use of Google services like Gmail, etc. Any ideas?
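
    On the monitoring side, one possibility (a sketch only; the CIDR ranges below are illustrative, not an authoritative list of Google or Akamai networks) is to add counting-only iptables rules on the dd-wrt router, so you can see which destinations actually consume the quota before deciding what to block:

      iptables -N TRAFFIC_ACCT
      iptables -I FORWARD -j TRAFFIC_ACCT
      iptables -A TRAFFIC_ACCT -d 74.125.0.0/16   # example Google range; a rule with no -j target only counts
      iptables -A TRAFFIC_ACCT -d 23.32.0.0/11    # example Akamai range
      iptables -L TRAFFIC_ACCT -v -n -x           # review packet/byte counters periodically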

    Read the article

  • Blocking 'good' bots in nginx with multiple conditions for certain off-limits URL's where humans can go

    - by Glenn Plas
    After 2 days of searching/trying/failing I decided to post this here, I haven't found any example of someone doing the same nor what I tried seems to be working OK. I'm trying to send a 403 to bots not respecting the robots.txt file (even after downloading it several times). Specifically Googlebot. It will support the following robots.txt definition. User-agent: * Disallow: /*/*/page/ The intent is to allow Google to browse whatever they can find on the site but return a 403 for the following type of request. Googlebot seems to keep on nesting these links eternally adding paging block after block: my_domain.com:80 - 66.x.67.x - - [25/Apr/2012:11:13:54 +0200] "GET /2011/06/ page/3/?/page/2//page/3//page/2//page/3//page/2//page/2//page/4//page/4//pag e/1/&wpmp_switcher=desktop HTTP/1.1" 403 135 "-" "Mozilla/5.0 (compatible; G ooglebot/2.1; +http://www.google.com/bot.html)" It's a wordpress site btw. I don't want those pages to show up, even though after the robots.txt info got through, they stopped for a while only to begin crawling again later. It just never stops .... I do want real people to see this. As you can see, google get a 403 but when I try this myself in a browser I get a 404 back. I want browsers to pass. root@my_domain:# nginx -V nginx version: nginx/1.2.0 I tried different approaches, using a map and plain old nono if's and they both act the same: (under http section) map $http_user_agent $is_bot { default 0; ~crawl|Googlebot|Slurp|spider|bingbot|tracker|click|parser|spider 1; } (under the server section) location ~ /(\d+)/(\d+)/page/ { if ($is_bot) { return 403; # Please respect the robots.txt file ! } } I recently had to polish up my Apache skills for a client where I did about the same thing like this : # Block real Engines , not respecting robots.txt but allowing correct calls to pass # Google RewriteCond %{HTTP_USER_AGENT} ^Mozilla/5\.0\ \(compatible;\ Googlebot/2\.[01];\ \+http://www\.google\.com/bot\.html\)$ [NC,OR] # Bing RewriteCond %{HTTP_USER_AGENT} ^Mozilla/5\.0\ \(compatible;\ bingbot/2\.[01];\ \+http://www\.bing\.com/bingbot\.htm\)$ [NC,OR] # msnbot RewriteCond %{HTTP_USER_AGENT} ^msnbot-media/1\.[01]\ \(\+http://search\.msn\.com/msnbot\.htm\)$ [NC,OR] # Slurp RewriteCond %{HTTP_USER_AGENT} ^Mozilla/5\.0\ \(compatible;\ Yahoo!\ Slurp;\ http://help\.yahoo\.com/help/us/ysearch/slurp\)$ [NC] # block all page searches, the rest may pass RewriteCond %{REQUEST_URI} ^(/[0-9]{4}/[0-9]{2}/page/) [OR] # or with the wpmp_switcher=mobile parameter set RewriteCond %{QUERY_STRING} wpmp_switcher=mobile # ISSUE 403 / SERVE ERRORDOCUMENT RewriteRule .* - [F,L] # End if match This does a bit more than I asked nginx to do but it's about the same principle, I'm having a hard time figuring this out for nginx. So my question would be, why would nginx serve my browser a 404 ? Why isn't it passing, The regex isn't matching for my UA: "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.30 Safari/536.5" There are tons of example to block based on UA alone, and that's easy. It also looks like the matchin location is final, e.g. it's not 'falling' through for regular user, I'm pretty certain that this has some correlation with the 404 I get in the browser. As a cherry on top of things, I also want google to disregard the parameter wpmp_switcher=mobile , wpmp_switcher=desktop is fine but I just don't want the same content being crawled multiple times. 
Even though I ended up adding wpmp_switcher=mobile via the Google Webmaster Tools pages (requiring me to sign up...), that also stopped for a while, but today they are back spidering the mobile sections. So in short, I need to find a way for nginx to enforce the robots.txt definitions. Can someone shell out a few minutes of their lives and push me in the right direction, please? I really appreciate ANY response that makes me think harder ;-)
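
    To narrow this down, it can help to reproduce both cases from a shell and see which location block answers; the sketch below uses the paging URL pattern from the question. One hedged observation: since a matching regex location takes precedence over the plain "/" prefix location, human requests for these URLs are also handled by the regex block, and with no index/try_files handling inside it nginx will look for a literal file and return 404, which would be consistent with the browser behaviour described.

      URL="http://my_domain.com/2011/06/page/3/"
      # As Googlebot: the map should set $is_bot and the location should return 403
      curl -s -o /dev/null -w "%{http_code}\n" \
        -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" "$URL"
      # As a normal browser UA: shows what regular visitors get from the same location
      curl -s -o /dev/null -w "%{http_code}\n" \
        -A "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.30 Safari/536.5" "$URL"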

    Read the article

  • Different nginx rules based on referrer

    - by juana
    I'm using WordPress with WP Super Cache. I want visitors who come from Google (that includes all country-specific referrers like google.co.in, google.co.uk, etc.) to see uncached content. These are my nginx rules, which are not working the way I want: server { server_name website.com; location / { root /var/www/html/website.com; index index.php; if ($http_referer ~* (www.google.com|www.google.co) ) { rewrite . /index.php break; } if (-f $request_filename) { break; } set $supercache_file ''; set $supercache_uri $request_uri; if ($request_method = POST) { set $supercache_uri ''; } if ($query_string) { set $supercache_uri ''; } if ($http_cookie ~* "comment_author_|wordpress|wp-postpass_" ) { set $supercache_uri ''; } if ($supercache_uri ~ ^(.+)$) { set $supercache_file /wp-content/cache/supercache/$http_host/$1index.html; } if (-f $document_root$supercache_file) { rewrite ^(.*)$ $supercache_file break; } if (!-e $request_filename) { rewrite . /index.php last; } } location ~ \.php$ { fastcgi_pass 127.0.0.1:9000; fastcgi_index index.php; fastcgi_param SCRIPT_FILENAME /var/www/html/website.com$fastcgi_script_name; include fastcgi_params; } } What should I do to achieve my goal?
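
    A sketch for testing the referer branch from a shell (website.com is the placeholder from the question): the first request simulates a visitor arriving from a Google search, the second a direct visit; comparing the responses (for example WP Super Cache's HTML footprint or response headers, if enabled) shows whether the cached or uncached copy was served. Note that $http_referer is only set on the first click through from Google, not on subsequent navigation within the site.

      curl -s -o /dev/null -w "%{http_code} %{size_download}\n" \
        -e "http://www.google.com/search?q=example" http://website.com/some-post/
      curl -s -o /dev/null -w "%{http_code} %{size_download}\n" http://website.com/some-post/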

    Read the article

  • PHP Mail() to Gmail = Spam

    - by grantw
    Recently Gmail has started marking emails sent directly from my server (using php mail()) as spam and I'm having problems trying to find the issue. If I send an exact copy of the same email from my email client it goes to the Gmail inbox. The emails are plain text, around 7 lines long and contain a URL link in plain text. As the emails sent from my client are getting through fine I'm thinking that the content isn't the issue. It would be greatly appreciated if someone could take a look at the the following headers and give me some advice why the email from the server is being marked as spam. Email from Server: Delivered-To: [email protected] Received: by 10.49.98.228 with SMTP id el4csp101784qeb; Thu, 15 Nov 2012 14:58:52 -0800 (PST) Received: by 10.60.27.166 with SMTP id u6mr2296595oeg.86.1353020331940; Thu, 15 Nov 2012 14:58:51 -0800 (PST) Return-Path: [email protected] Received: from dom.domainbrokerage.co.uk (dom.domainbrokerage.co.uk. [174.120.246.138]) by mx.google.com with ESMTPS id df4si17005013obc.50.2012.11.15.14.58.51 (version=TLSv1/SSLv3 cipher=OTHER); Thu, 15 Nov 2012 14:58:51 -0800 (PST) Received-SPF: pass (google.com: domain of [email protected] designates 174.120.246.138 as permitted sender) client-ip=174.120.246.138; Authentication-Results: mx.google.com; spf=pass (google.com: domain of [email protected] designates 174.120.246.138 as permitted sender) [email protected]; dkim=pass [email protected] DKIM-Signature: v=1; a=rsa-sha256; q=dns/txt; c=relaxed/relaxed; d=domainbrokerage.co.uk; s=default; h=Date:Message-Id:Content-Type:Reply-to:From:Subject:To; bh=2RJ9jsEaGcdcgJ1HMJgQG8QNvWevySWXIFRDqdY7EAM=; b=mGebBVOkyUhv94ONL3EabXeTgVznsT1VAwPdVvpOGDdjBtN1FabnuFi8sWbf5KEg5BUJ/h8fQ+9/2nrj+jbtoVLvKXI6L53HOXPjl7atCX9e41GkrOTAPw5ZFp+1lDbZ; Received: from grantw by dom.domainbrokerage.co.uk with local (Exim 4.80) (envelope-from [email protected]) id 1TZ8OZ-0008qC-Gy for [email protected]; Thu, 15 Nov 2012 22:58:51 +0000 To: [email protected] Subject: Offer Accepted X-PHP-Script: www.domainbrokerage.co.uk/admin.php for 95.172.231.27 From: My Name [email protected] Reply-to: [email protected] Content-Type: text/plain; charset=Windows-1251 Message-Id: [email protected] Date: Thu, 15 Nov 2012 22:58:51 +0000 X-AntiAbuse: This header was added to track abuse, please include it with any abuse report X-AntiAbuse: Primary Hostname - dom.domainbrokerage.co.uk X-AntiAbuse: Original Domain - gmail.com X-AntiAbuse: Originator/Caller UID/GID - [500 500] / [47 12] X-AntiAbuse: Sender Address Domain - domainbrokerage.co.uk X-Get-Message-Sender-Via: dom.domainbrokerage.co.uk: authenticated_id: grantw/from_h Email from client: Delivered-To: [email protected] Received: by 10.49.98.228 with SMTP id el4csp101495qeb; Thu, 15 Nov 2012 14:54:49 -0800 (PST) Received: by 10.182.197.8 with SMTP id iq8mr2351185obc.66.1353020089244; Thu, 15 Nov 2012 14:54:49 -0800 (PST) Return-Path: [email protected] Received: from dom.domainbrokerage.co.uk (dom.domainbrokerage.co.uk. 
[174.120.246.138]) by mx.google.com with ESMTPS id ab5si17000486obc.44.2012.11.15.14.54.48 (version=TLSv1/SSLv3 cipher=OTHER); Thu, 15 Nov 2012 14:54:49 -0800 (PST) Received-SPF: pass (google.com: domain of [email protected] designates 174.120.246.138 as permitted sender) client-ip=174.120.246.138; Authentication-Results: mx.google.com; spf=pass (google.com: domain of [email protected] designates 174.120.246.138 as permitted sender) [email protected]; dkim=pass [email protected] DKIM-Signature: v=1; a=rsa-sha256; q=dns/txt; c=relaxed/relaxed; d=domainbrokerage.co.uk; s=default; h=Content-Transfer-Encoding:Content-Type:Subject:To:MIME-Version:From:Date:Message-ID; bh=bKNjm+yTFZQ7HUjO3lKPp9HosUBfFxv9+oqV+NuIkdU=; b=j0T2XNBuENSFG85QWeRdJ2MUgW2BvGROBNL3zvjwOLoFeyHRU3B4M+lt6m1X+OLHfJJqcoR0+GS9p/TWn4jylKCF13xozAOc6ewZ3/4Xj/YUDXuHkzmCMiNxVcGETD7l; Received: from w-27.cust-7941.ip.static.uno.uk.net ([95.172.231.27]:1450 helo=[127.0.0.1]) by dom.domainbrokerage.co.uk with esmtpa (Exim 4.80) (envelope-from [email protected]) id 1TZ8Ke-0001XH-7p for [email protected]; Thu, 15 Nov 2012 22:54:48 +0000 Message-ID: [email protected] Date: Thu, 15 Nov 2012 22:54:50 +0000 From: My Name [email protected] User-Agent: Postbox 3.0.6 (Windows/20121031) MIME-Version: 1.0 To: [email protected] Subject: Offer Accepted Content-Type: text/plain; charset=ISO-8859-1; format=flowed Content-Transfer-Encoding: 7bit X-AntiAbuse: This header was added to track abuse, please include it with any abuse report X-AntiAbuse: Primary Hostname - dom.domainbrokerage.co.uk X-AntiAbuse: Original Domain - gmail.com X-AntiAbuse: Originator/Caller UID/GID - [47 12] / [47 12] X-AntiAbuse: Sender Address Domain - domainbrokerage.co.uk X-Get-Message-Sender-Via: dom.domainbrokerage.co.uk: authenticated_id: [email protected]

    Read the article

  • Is meta description still relevant?

    - by Jeff Atwood
    I received this bit of advice about the meta description tag recently: Meta descriptions are used by Google probably 80% of the time for the snippet. They don’t help with rankings but you should probably use them. You could just auto generate them from the first part of the question. The description tag exists in the header, like so: <meta name="Description" content="A brief summary of the content on the page."> I'm not sure why we would need this field, as Google seems perfectly capable of showing the relevant search terms in context in the search result pages, like so (I searched for c# list performance): In other words, where would a meta description summary improve these results? We want the page to show context around the actual search hits, not a random summary we inserted! Google Webmaster Central has this advice: For some sites, like news media sources, generating an accurate and unique description for each page is easy: since each article is hand-written, it takes minimal effort to also add a one-sentence description. For larger database-driven sites, like product aggregators, hand-written descriptions are more difficult. In the latter case, though, programmatic generation of the descriptions can be appropriate and is encouraged -- just make sure that your descriptions are not "spammy." Good descriptions are human-readable and diverse, as we talked about in the first point above. The page-specific data we mentioned in the second point is a good candidate for programmatic generation. I'm struggling to think of any scenario when I would want the Google-generated summary, that is, actual context from the page for the search terms, to be replaced by a hard-coded meta description summary of the question itself.

    Read the article

  • Solr Autosuggest

    - by rahul
    Hi, I am using Solr (1.4)'s AutoSuggest feature via the TermsComponent. Currently, if I type 'goo', Solr suggests words like 'google'. But I would like to receive suggestions like 'google, google alerts, ...', i.e., suggestions with single and multiple terms. I'm not sure whether I need to use edge n-grams for that (for example, indexing 'google' as 'go', 'oo', 'og', ...), but I think I don't need this, since I don't want partial search. Please let me know if there is any way to do multiple-word suggestions. Thanks in advance.
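
    The TermsComponent can only return terms that exist in the index, so getting phrase suggestions like "google alerts" generally means indexing word shingles (adjacent-word combinations) into the suggest field rather than single tokens. As a sketch, a terms request looks like the following; the /terms handler path and the "suggest" field name are assumptions, not something from the question.

      curl "http://localhost:8983/solr/terms?terms=true&terms.fl=suggest&terms.prefix=goo&terms.limit=10"

    If the suggest field's analyzer includes a ShingleFilterFactory, two-word terms such as "google alerts" become indexed terms and will show up in the prefix results, without needing edge n-grams.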

    Read the article

  • Gmail sends bulk messages sent by postfix to spam - spf, rDNS are set up (headers inside)

    - by snitko
    here are the headers of the blocked messages (actual domain replaced with domain.com, ip address with n.n.n.n and gmail account name with person.account): Delivered-To: [email protected] Received: by 10.216.89.137 with SMTP id c9cs247685wef; Tue, 6 Dec 2011 16:06:37 -0800 (PST) Received: by 10.224.199.134 with SMTP id es6mr14447757qab.2.1323216395590; Tue, 06 Dec 2011 16:06:35 -0800 (PST) Return-Path: <[email protected]> Received: from mail.domain.com (domain.com. [n.n.n.n]) by mx.google.com with ESMTP id b16si7471407qcv.131.2011.12.06.16.06.35; Tue, 06 Dec 2011 16:06:35 -0800 (PST) Received-SPF: pass (google.com: domain of [email protected] designates n.n.n.n as permitted sender) client-ip=n.n.n.n; Authentication-Results: mx.google.com; spf=pass (google.com: domain of [email protected] designates n.n.n.n as permitted sender) [email protected] Received: by mail.domain.com (Postfix, from userid 5001) id 26ADE381E3; Tue, 6 Dec 2011 19:06:35 -0500 (EST) Received: from domain.com (domain.com [127.0.0.1]) by mail.domain.com (Postfix) with ESMTP id 0148638030 for <[email protected]>; Tue, 6 Dec 2011 19:06:31 -0500 (EST) Date: Tue, 06 Dec 2011 19:06:31 -0500 From: DomainApp <[email protected]> Reply-To: [email protected] To: [email protected] Message-ID: <[email protected]> Subject: Roman Snitko says hi Mime-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 7bit X-No-Spam: True Precedence: bulk List-Unsubscribe: [email protected] Messages go to Spam folder on various gmail accounts, so it's not a coincidence. I followed all gmail guides on sending bulk emails from here https://mail.google.com/support/bin/answer.py?hl=en&answer=81126. I also checked my ip-address here http://www.dnsblcheck.co.uk/ and it's NOT on the blacklists. Thus I have two questions: What may be the possible reason for the messages to go to Spam folder? Is there any way to contact Google and ask them what causes this? Update: I have set up openDKIM on my server, everything works, gmail message headers say that dkim=pass, which means everything is set up correctly. Messages still end up in Spam folder.
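
    Since the headers show SPF passing and (after the update) DKIM passing as well, below is a hedged sketch of the DNS checks worth re-running from outside the network; domain.com and n.n.n.n are the placeholders from the question, and "default" is only an assumed DKIM selector - use whatever selector openDKIM actually signs with.

      dig +short TXT domain.com                       # published SPF record
      dig +short TXT default._domainkey.domain.com    # DKIM public key for the signing selector
      dig +short -x n.n.n.n                           # reverse DNS should point back at the sending hostname
      dig +short MX domain.com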

    Read the article

  • C-panel mail goes into spam instead of inbox in gmail

    - by Robin Jain
    I have c-panel vps server. I have created a domain on the same server, but when I send mail through webmail to gmail email id it goes into the spam folder. Note--->Mail ip note blacklisted Spf records enable DKIM enable reverse dns are perfect ====================================================================== Email header Information: Delivered-To: [email protected] Received: by 10.143.93.13 with SMTP id v13csp119806wfl; Fri, 6 Jul 2012 08:01:36 -0700 (PDT) Received: by 10.182.52.42 with SMTP id q10mr26133912obo.46.1341586895571; Fri, 06 Jul 2012 08:01:35 -0700 (PDT) Return-Path: <[email protected]> Received: from lakshyacs-u.securehostdns.com ([50.97.147.134]) by mx.google.com with ESMTPS id fx3si18028369obc.144.2012.07.06.08.01.35 (version=TLSv1/SSLv3 cipher=OTHER); Fri, 06 Jul 2012 08:01:35 -0700 (PDT) Received-SPF: pass (google.com: domain of [email protected] designates 50.97.147.134 as permitted sender) client-ip=50.97.147.134; Authentication-Results: mx.google.com; spf=pass (google.com: domain of [email protected] designates 50.97.147.134 as permitted sender) [email protected] Received: from localhost.localdomain ([127.0.0.1]:39016 helo=harishjoshico.com) by lakshyacs-u.securehostdns.com with esmtpa (Exim 4.77) (envelope-from <[email protected]>) id 1SnA2J-0006Nq-05 for [email protected]; Fri, 06 Jul 2012 20:31:35 +0530 Received: from 223.189.14.213 ([223.189.14.213]) (SquirrelMail authenticated user [email protected]) by harishjoshico.com with HTTP; Fri, 6 Jul 2012 20:31:35 +0530 Message-ID: <[email protected]> Date: Fri, 6 Jul 2012 20:31:35 +0530 Subject: ggglkhl From: [email protected] To: [email protected] User-Agent: SquirrelMail/1.4.22 MIME-Version: 1.0 Content-Type: text/plain;charset=iso-8859-1 Content-Transfer-Encoding: 8bit X-Priority: 3 (Normal) Importance: Normal X-AntiAbuse: This header was added to track abuse, please include it with any abuse report X-AntiAbuse: Primary Hostname - lakshyacs-u.securehostdns.com X-AntiAbuse: Original Domain - gmail.com X-AntiAbuse: Originator/Caller UID/GID - [47 12] / [47 12] X-AntiAbuse: Sender Address Domain - harishjoshico.com jhkhl ================================================================
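
    To double-check the "not blacklisted" claim from the shell, the sending IP (50.97.147.134, from the Received header) can be queried against a few DNSBLs directly; a sketch - a DNSBL lookup reverses the octets and appends the list's zone, and any answer means "listed".

      IP=50.97.147.134
      REV=$(echo "$IP" | awk -F. '{print $4"."$3"."$2"."$1}')
      for bl in zen.spamhaus.org bl.spamcop.net b.barracudacentral.org; do
        printf '%s: ' "$bl"
        dig +short "$REV.$bl"
        echo
      done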

    Read the article

  • Rsync and wildcards

    - by Jay White
    I am trying to back up both the "Last Session" and "Current Session" files for Google Chrome in one command, but using a wildcard doesn't seem to work. I am trying with the following command rsync -e "ssh -i new.key" -r --verbose -tz --stats --progress --delete '/cygdrive/c/Users/jay/AppData/Local/Google/Chrome/User Data/Default/*Session' user@host:"/chrome\ sessions/" and get the following error rsync: link_stat "/cygdrive/c/Users/jay/AppData/Local/Google/Chrome/User Data/Default/*Session" failed: No such file or directory (2) What am I doing wrong?
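
    One likely cause is that the single quotes stop the local shell from expanding the * glob, so rsync looks for a literal file named "*Session". A sketch of two alternatives, keeping the Cygwin paths from the question: quote only the part of the path containing the space, or let rsync do the matching with filters.

      # Leave the * outside the quotes so the shell expands it
      rsync -e "ssh -i new.key" -rtz --verbose --stats --progress --delete \
        "/cygdrive/c/Users/jay/AppData/Local/Google/Chrome/User Data/Default/"*Session \
        "user@host:/chrome\ sessions/"

      # Or have rsync select the files itself
      rsync -e "ssh -i new.key" -rtz --verbose --stats --progress --delete \
        --include='*Session' --exclude='*' \
        "/cygdrive/c/Users/jay/AppData/Local/Google/Chrome/User Data/Default/" \
        "user@host:/chrome\ sessions/"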

    Read the article

  • SPF record for Gmail?

    - by Chris
    I have DNS, with an SPF TXT record, configured for a domain name. The primary user of the domain name now needs to be able to send both from our SMTP servers, and also from her GMail account. I've seen all the information about adding "include:_spf.google.com" to the SPF TXT record, but, as I look into it, it appears that record is outdated. In particular, I had the user send me a test message, and noted that it was: Received: from mail-la0-f50.google.com (mail-la0-f50.google.com [209.85.215.50]) However, _spf.google.com doesn't list that IP address: $ dig +short _spf.google.com txt "v=spf1 ip4:216.239.32.0/19 ip4:64.233.160.0/19 ip4:66.249.80.0/20 ip4:72.14.192.0/18 ip4:209.85.128.0/17 ip4:66.102.0.0/20 ip4:74.125.0.0/16 ip4:64.18.0.0/20 ip4:207.126.144.0/20 ip4:173.194.0.0/16 ?all" (Note that a 209.85.218.0 network is listed, but not 209.85.215.0.) Is there a better way to enable sending from GMail? This user sends to at least one recipient with a strict SPF policy that bounces mail not from a designated host... Many thanks!
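
    One detail worth checking before concluding the record is outdated: the entries are CIDR blocks, and 209.85.128.0/17 spans 209.85.128.0 through 209.85.255.255, which does cover 209.85.215.50 (mail-la0-f50.google.com). A small sketch for inspecting the published data; the _netblocks name is one Google has used for its per-netblock records and may change over time.

      dig +short TXT _spf.google.com
      dig +short TXT _netblocks.google.com   # the raw ip4: ranges behind Google's SPF includes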

    Read the article

  • Multi language site - use of canonical link and link rel="alternate"

    - by julia
    I keep reading everywhere that if you have a multilanguage site, where the same page appears in, say, French and English, then this is considered as duplicate content by google. It is written that using canonical link is the solution, but I do not understand how to use it in this case. Should I: Choose either French URL or English URL to be the canonical (main) one, and where I will place the canonical link? If so, how do I decide which of the two URLs must be canonical? both languages are important to me and I want the content under both languages to be indexed by google and served to the user, depending on the language in which he searches. OR should I place a canonical link on both French and English URLs? If so, then I do not understand the meaning of using the canonical link? In this case would both URLs be indexed, are both of them considered as "important" by google and not duplicates? Also I read that link rel="alternate" can be used to indicate to google that, for example the French URL is the French-language equivalent of the English page. This makes sense and I understand how to use such links, but how are they combined with canonical links? Should I define both the canonical URL AND specify rel="alternate" in both URLs? Could someone help me to clarify this, cause I'm stuck with this and can't seem to find a good-enough explanation in different sources.

    Read the article

  • DNS zone file SPF configuration to support sending mail from multiple servers and gmail

    - by Tauren
    I want to configure SPF on a domain to allow mail to be sent from: the x.com website server (x.com and www.x.com - both at same IP) it's MX servers (smtp.x.com, mx.x.com, mail.x.com) another server that isn't listed as an MX server (somehost.x.com) via gmail using an account that has authenticated use of [email protected] Will this zone file work? If not, what are the problems with it? $ttl 38400 @ IN SOA ns1.x.com. hostmaster.x.com. ( 201003092 ; serial 8H ; refresh 15M ; retry 1W ; expire 1H ) ; minimum @ NS ns1.x.com. @ NS ns2.x.com. @ MX 10 mx.x.com. @ MX 20 smtp.x.com. @ MX 30 mailhost.x.com. ; SPF records @ IN TXT "v=spf1 a mx a:somehost.x.com include:_spf.google.com ~all" mx IN TXT "v=spf1 a -all" smtp IN TXT "v=spf1 a -all" mailhost IN TXT "v=spf1 a -all" Questions: Is _spf.google.com the right thing to include for gmail.com, or is it only for Google Hosted Apps? If only for Google Apps, what should I include to send from gmail.com? If mail shouldn't be sent from anywhere else, is it safe to use -all instead of ~all? Does it make sense to add specific SPF records for each of the mail servers? Any other problems with the zone file? I want to confirm these things before making changes to my zone file. The file has SPF configured basically the same now, just without google.com and somehost, but I want to make sure I won't break things when I change it.

    Read the article

  • Name resolution not working with ipv6 on centos

    - by jolivier
    I just installed CentOS 6.3 on a server to be installed in a data center, but cannot get name resolution / curl to work. I know this is because of it trying to use IPv6, since ping google.com works, and curl -4 google.com works, but not curl google.com. I removed the IPv6 address from the interface and it does not change anything. This is very problematic since most system tools like yum fail at name resolution currently. Browsers like Firefox work because they might be using another tool for name resolution than the one used by curl. I managed to fix this on workstations by completely disabling IPv6 following tutorials like this one / hardcoding name resolution in /etc/hosts. But since I am here configuring a server which will later be installed in a remote data center, I would like not to mess up, understand what is going on, and fix it properly. Besides, I will face the same issue with more servers to come, so I would really appreciate your help in understanding this problem and how to solve it. I would be happy to provide more information if needed to help understand what is going on. The current network configuration is a small enterprise network, with a DNS server (let's call it A) configured once a long time ago. dig google.com and dig -4 google.com are both refused by the A DNS. But this is also true for my workstation on which curl is working (and yes, they both use the same A DNS server). Indeed, this faulty server and my workstation have multiple nameservers in /etc/resolv.conf, and the second one is working fine for both of them, so if I remove A from my resolv.conf everything works fine! Regards, Olivier
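
    As a sketch for narrowing this down: first see which nameserver is refusing which query types, then, rather than disabling IPv6 outright, bias glibc's getaddrinfo() toward IPv4 results via /etc/gai.conf (a documented precedence table); the lines below assume root and a stock CentOS 6 glibc.

      cat /etc/resolv.conf            # list the configured nameservers
      dig A google.com                # repeat per server: dig @<server-ip> A google.com / AAAA google.com
      dig AAAA google.com
      echo 'precedence ::ffff:0:0/96  100' >> /etc/gai.conf    # prefer IPv4 addresses without removing IPv6
      getent ahosts google.com        # IPv4 entries should now be listed first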

    Read the article

  • GMail and Yahoo Mail servers not accepting mails from my slicehost slice

    - by Lakshmanan
    Hi, I have a rails in one of the slices at Slicehost. I've setup postfix (sendmail) to send emails from my rails app. All emails to Google Apps domain (to company setup google hosted paid email id) are getting delivered properly (but to spam folder). But all emails to [email protected], [email protected], .. @hotmail.com are not getting delivered and this is the line from my /var/log/mail.log Dec 21 17:33:56 staging postfix/smtp[32295]: 5EB4810545B: to=<[email protected]>, relay=j.mx.mail.yahoo.com[66.94.237.64]:25, delay=1.6, delays=0.02/0.01/1.5/0, dsn=4.0.0, status=deferred (host j.mx.mail.yahoo.com[66.94.237.64] refused to talk to me: 553 Mail from 173.203.201.186 not allowed - 5.7.1 [BL21] Connections not accepted from IP addresses on Spamhaus PBL; see http://postmaster.yahoo.com/errors/550-bl21.html [550]) and this is what i got for gmail Dec 21 17:29:17 staging postfix/smtp[32216]: 0FA3310545B: to=<[email protected]>, relay=gmail-smtp-in.l.google.com[74.125.65.27]:25, delay=0.59, delays=0.02/0.01/0.09/0.47, dsn=5.7.1, status=bounced (host gmail-smtp-in.l.google.com[74.125.65.27] said: 550-5.7.1 [173.203.201.186] The IP you're using to send mail is not authorized 550-5.7.1 to send email directly to our servers. Please use the SMTP relay at 550-5.7.1 your service provider instead. Learn more at 550 5.7.1 http://mail.google.com/support/bin/answer.py?answer=10336 v49si11176750yhc.16 (in reply to end of DATA command)) Please help. I have very little knowledge about setting dns, servers and stuff.
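
    Both bounces point the same way: Yahoo is rejecting because the IP is on the Spamhaus PBL (a policy list of ranges not expected to deliver mail directly), and Gmail explicitly suggests relaying through the provider's SMTP server. A hedged sketch of the relay route for Postfix follows; the relay host, port and credentials are placeholders, not Slicehost's actual values. The alternative is to keep delivering directly and sort out reverse DNS, SPF and a PBL removal request for 173.203.201.186.

      postconf -e 'relayhost = [smtp.example-relay.com]:587'
      postconf -e 'smtp_sasl_auth_enable = yes'
      postconf -e 'smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd'
      postconf -e 'smtp_sasl_security_options = noanonymous'
      postconf -e 'smtp_tls_security_level = may'
      echo '[smtp.example-relay.com]:587 username:password' > /etc/postfix/sasl_passwd
      postmap /etc/postfix/sasl_passwd && chmod 600 /etc/postfix/sasl_passwd*
      postfix reload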

    Read the article

  • Gerrit ssh key setup on windows server

    - by hotpotato
    I am attempting to configure Google's Gerrit code review web app on a Windows Server 2008 virtual machine on our internal network. We are using Apache Tomcat (6.0.36) to host the web app, have deployed the gerrit.war to Tomcat's webapp folder, and have set up the context.xml, web.xml, etc. for the web app correctly, I believe. However, when I start Tomcat using $CATALINA_HOME/bin/startup.bat I get the following message in the Tomcat logs: Dec 07, 2012 1:03:54 PM org.apache.catalina.core.StandardContext listenerStart SEVERE: Exception sending context initialized event to listener instance of class com.google.gerrit.httpd.WebAppInitializer com.google.inject.CreationException: Guice creation errors: 1) No SSH keys under C:\Gerrit\config\etc while locating com.google.gerrit.sshd.HostKeyProvider at com.google.gerrit.sshd.SshModule.configure(SshModule.java:90) I have created an id_rsa.pub SSH key and placed it in the specified directory, to no avail. I have been googling this for about a week now and can't seem to find any information about the file or format it is expecting... documentation on setting Gerrit up on Windows seems hard to come by! Can anyone provide useful information about how to correctly configure a host SSH key in this context?
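
    The HostKeyProvider error is about a host key pair for Gerrit's embedded SSH daemon, not a user key like id_rsa.pub. A sketch of generating unencrypted host keys into the directory named in the error message; ssh-keygen here is assumed to come from Cygwin or Git for Windows, and the file names (ssh_host_rsa_key / ssh_host_dsa_key) are taken from Gerrit's usual site layout, so treat them as an assumption to verify against your Gerrit version.

      cd /cygdrive/c/Gerrit/config/etc
      ssh-keygen -t rsa -P "" -f ssh_host_rsa_key
      ssh-keygen -t dsa -P "" -f ssh_host_dsa_key

    Running Gerrit's own site init (java -jar gerrit.war init -d <site>) normally generates these host keys automatically, which may be the simpler route if it works in this Tomcat-based setup.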

    Read the article

  • Gmail.com detect mail as spam, but the server is not on any BlackList

    - by Tomer W
    I have an issue with Google (GMail, to be exact). About 1 month ago, we had a security breach, and mail was relayed through our servers. We got listed in almost ALL Black-Lists :( We fixed the problem and requested removal from the Black-Lists, which was granted easily. Currently (over 3 weeks), we are not sending any spam anymore. Furthermore, we got cleared from all the Black-Lists (MxToolBox Black-List Search Result). But GMail still refuses to accept anything from the server, stating '550 Spam'. The following is a telnet attempt to send to GMail: 220 mx.google.com ESMTP g47si45436208eep.123 helo megatec.co.il 250 mx.google.com at your service mail from: <[email protected]> 250 2.1.0 OK g47si45436208eep.123 rcpt to: <[email protected]> 250 2.1.5 OK g47si45436208eep.123 Data 354 Go ahead g47si45436208eep.123 Test123 . 550-5.7.1 [62.219.123.33 11] Our system has detected that this message is 550-5.7.1 likely unsolicited mail. To reduce the amount of spam sent to Gmail, 550-5.7.1 this message has been blocked. Please visit 550-5.7.1 http://support.google.com/mail/bin/answer.py?hl=en&answer=188131 for 550 5.7.1 more information. g47si45436208eep.123 Connection to host lost. I tried filling in the form at Gmail - Report Delivery Problem. I also tried reaching Google by phone, but the message was to go to the link mentioned above. I checked reverse DNS and it is OK... We don't have TLS, but that shouldn't be a problem, should it? Note: we are not a bulk sender. Does anyone have an idea of what could be blocking our IP? Does anyone know who can be contacted in order to resolve this BL listing?

    Read the article

  • pingback / trackback support for a photo sharing website?

    - by cboettig
    Is there a photo sharing service, such as Flickr or Picasa, that will collect the URLs of the locations where the photo has been posted on other blogs (or mentioned in tweets, etc.)? This could be accomplished by posting each photo as a blog entry using WordPress, which would then automatically handle pingbacks, but of course a blog doesn't perform quite like a proper photo service. Perhaps this could be done with a private photo hosting server like ZenPhoto by editing the PHP, but that seems rather involved. Does such a service already exist?

    Read the article
