Search Results

Search found 1799 results on 72 pages for 'yahoo pipes'.


  • How is incoming SMTP mail being delivered despite blocked port

    - by Josh
    I set up an MX mail server, and everything works despite port 25 being blocked. I'm stumped as to why I am able to receive email with this setup, and what the consequences might be if I leave it this way. Here are the details:

    - Connections to SMTP over ports 25 and 587 both reliably connect over my local network.
    - Connections to SMTP over port 25 are blocked from external IPs (the ISP is blocking the port).
    - Connections to submission SMTP over port 587 from external IPs are reliable.
    - Emails sent from gmail, yahoo, and a few other addresses are all being delivered. I haven't found an email provider that fails to deliver mail to my MX.

    So, with port 25 blocked, I am assuming other MTA servers fall back to port 587; otherwise I can't imagine how the mail is received. I know port 25 shouldn't be blocked, but so far it works. Are there mail servers that this will not work with? Where can I find more about how this is working?

    -- edit

    More technical detail, to validate that I'm not missing something silly. Obviously in the transcript below I've replaced my actual domain with example.com.

        # DNS MX record points to the A record.
        $ dig example.com MX +short
        1 example.com
        $ dig example.com A +short
        <Public IP address>

        # From a public server (not my ISP hosting the mail server),
        # we see port 25 is blocked but port 587 is open.
        $ telnet example.com 25
        Trying <public ip>...
        telnet: Unable to connect to remote host: Connection refused

        # Let's try openssl.
        $ openssl s_client -starttls smtp -crlf -connect example.com:25
        connect: Connection refused
        connect:errno=111

        # Again from a public server, we see port 587 is open.
        $ telnet example.com 587
        Trying <public ip>...
        Connected to example.com.
        Escape character is '^]'.
        220 example.com ESMTP Postfix
        ehlo example.com
        250-example.com
        250-PIPELINING
        250-SIZE 10485760
        250-VRFY
        250-ETRN
        250-STARTTLS
        250-ENHANCEDSTATUSCODES
        250-8BITMIME
        250-DSN
        250-BINARYMIME
        250 CHUNKING
        quit
        221 2.0.0 Bye
        Connection closed by foreign host.

    Here is a portion from the mail log when receiving a message from gmail:

        postfix/postscreen[93152]: CONNECT from [209.85.128.49]:48953 to [192.168.0.10]:25
        postfix/postscreen[93152]: PASS NEW [209.85.128.49]:48953
        postfix/smtpd[93160]: connect from mail-qe0-f49.google.com[209.85.128.49]
        postfix/smtpd[93160]: 7A8C31C1AA99: client=mail-qe0-f49.google.com[209.85.128.49]

    The log shows that a connection was made to the local IP on port 25 (I'm not doing any port mapping, so it is port 25 on the public IP too). Seeing this leads me to hypothesize that the ISP block on port 25 only occurs when a connection is made from an IP address that is not known to be a mail server. Any other theories?
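    A quick way to test that hypothesis is to probe both ports from several external vantage points. Below is a minimal sketch in Python, assuming example.com stands in for the real MX hostname; a port the ISP filters for a given source will show up as blocked:

        # Probe TCP reachability of the SMTP and submission ports.
        import socket

        def port_open(host, port, timeout=5):
            """Return True if a TCP connection to host:port succeeds."""
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return True
            except OSError:
                return False

        for port in (25, 587):
            state = "open" if port_open("example.com", port) else "blocked or closed"
            print("port %d: %s" % (port, state))

    Running this from several networks (a VPS, a phone hotspot, the home connection) would show whether the block is source-dependent, which is what the mail log above suggests.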

    Read the article

  • Problems sending SMTP email to large systems such as Gmail

    - by Martel
    I maintain a mail server. Recently, messages sent to valid recipients on gmail, yahoo, and now roadrunner email addresses are bounced with similar messages. Here's one from gmail:

        The message, sent by [email protected], can not be delivered to following recipient(s): *recipient*@gmail.com
        There was a fatal SMTP error. Fatal DNS error: exchanger alt4.gmail-smtp-in.l.google.com. does not exist

        Delivery History Follows:
        [DLVR 000020 19-12-12 14:21:21] Delivering item 5573
        [DLVR 000020 19-12-12 14:21:21] Resolving MX records for domain gmail.com
        [DLVR 000020 19-12-12 14:21:21] Retrieved 5 MX records for domain gmail.com
        [DLVR 000020 19-12-12 14:21:21] Delivering mail to 1 recipient(s) at domain gmail.com using exchanger gmail-smtp-in.l.google.com.
        [DLVR 000020 19-12-12 14:21:33] Host gmail-smtp-in.l.google.com. does not appear to exist...
        [DLVR 000020 19-12-12 14:21:33] Will try next exchanger
        [DLVR 000020 19-12-12 14:21:33] Delivering mail to 1 recipient(s) at domain gmail.com using exchanger alt1.gmail-smtp-in.l.google.com.
        [DLVR 000020 19-12-12 14:21:45] Host alt1.gmail-smtp-in.l.google.com. does not appear to exist...
        [DLVR 000020 19-12-12 14:21:45] Will try next exchanger
        [DLVR 000020 19-12-12 14:21:45] Delivering mail to 1 recipient(s) at domain gmail.com using exchanger alt2.gmail-smtp-in.l.google.com.
        [DLVR 000020 19-12-12 14:21:57] Host alt2.gmail-smtp-in.l.google.com. does not appear to exist...
        [DLVR 000020 19-12-12 14:21:57] Will try next exchanger
        [DLVR 000020 19-12-12 14:21:57] Delivering mail to 1 recipient(s) at domain gmail.com using exchanger alt3.gmail-smtp-in.l.google.com.
        [DLVR 000020 19-12-12 14:22:09] Host alt3.gmail-smtp-in.l.google.com. does not appear to exist...
        [DLVR 000020 19-12-12 14:22:09] Will try next exchanger
        [DLVR 000020 19-12-12 14:22:09] Delivering mail to 1 recipient(s) at domain gmail.com using exchanger alt4.gmail-smtp-in.l.google.com.
        [DLVR 000020 19-12-12 14:22:21] Host alt4.gmail-smtp-in.l.google.com. does not appear to exist...
        [DLVR 000020 19-12-12 14:22:21] Fatal error - host alt4.gmail-smtp-in.l.google.com. does not exist. Will bounce...
        [DLVR 000020 19-12-12 14:22:21] Bouncing to sender using bounce address [email protected]...

    Sometimes these emails get through, other times not. I'm at a loss to explain it.
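    Since every exchanger in that log fails to resolve, the first thing worth ruling out is the mail server's own DNS resolver. A minimal sketch, assuming the third-party dnspython package is available on the box:

        # Resolve gmail.com's MX records and the A record of each exchanger.
        import dns.resolver

        answers = dns.resolver.resolve("gmail.com", "MX")   # dnspython 1.x uses .query()
        for rr in sorted(answers, key=lambda r: r.preference):
            exchanger = str(rr.exchange)
            addresses = [str(a) for a in dns.resolver.resolve(exchanger, "A")]
            print(rr.preference, exchanger, addresses)

    If this works reliably from the same host while the MTA intermittently fails, the suspect becomes the resolver (or a flaky forwarder) the MTA is configured to use, not Google's DNS.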

    Read the article

  • Can't send email through Comcast SMTP to my domains

    - by Midnight Oil
    I am a Comcast customer with 3 computers and 3 computer users in the house. There are 2 fully updated Macs and one PC running Windows 7. We use Mail on the Macs and Outlook on Windows 7. All computer accounts are configured to send mail through port 587 of smtp.comcast.net. I also have two personal domains registered with Network Solutions. For the sake of this discussion, call my domains myOwnDomain1.com and myOwnDomain2.com. I have email addresses at both domains, of the form [email protected] and [email protected].

    Until recently, our email worked as expected. However, sometime between September 13, 2012 and September 19, 2012, we lost the ability to send email through Comcast's SMTP server to the email addresses at my personal domains. If we attempt to send email through Comcast's SMTP to those addresses, the email never arrives. Furthermore, the email clients give no indication of failure; the email just never arrives. The result is the same on all 3 computers and with all accounts on those computers.

    We can successfully send email through Comcast's SMTP from any of our accounts on any of our computers to any email address other than my email addresses at my personal domains! However, I receive email at those domains that is not sent through smtp.comcast.net. For example, I can successfully send email from my gmail and yahoo accounts to my email addresses at my personal domains. Furthermore, I can successfully send email through smtp.myOwnDomain1.com to [email protected] and through smtp.myOwnDomain2.com to [email protected].

    Comcast says the problem must be at Network Solutions. According to Network Solutions, their logs show they are not blocking reception of the email, and our IP address is not flagged as a spam source. They say the email is simply not arriving. Does anyone have any ideas why we can't send email through Comcast's SMTP server to my domains?

    As an odd coincidence, we recently noticed a change in Comcast's SMTP service: there is now a 5 minute delay on all outbound mail. Comcast's SMTP server seems to sit on the mail with a 5 minute timer.
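    One way to take the email clients out of the equation is to drive Comcast's submission port directly and watch the SMTP dialogue; a silent drop after a 250 reply points at the relay, while a 4xx/5xx reply would at least name a reason. A minimal sketch with Python's smtplib (the addresses and credentials below are placeholders, not real accounts):

        import smtplib
        from email.message import EmailMessage

        msg = EmailMessage()
        msg["From"] = "user@comcast.example"        # placeholder sender
        msg["To"] = "me@myOwnDomain1.com"           # the address that never receives mail
        msg["Subject"] = "delivery probe"
        msg.set_content("test")

        with smtplib.SMTP("smtp.comcast.net", 587, timeout=30) as smtp:
            smtp.set_debuglevel(1)                  # dump the full SMTP dialogue to stderr
            smtp.starttls()
            smtp.login("user@comcast.example", "password")  # placeholder credentials
            smtp.send_message(msg)  # raises SMTPRecipientsRefused on rejection

    If the relay answers 250 and the message still never arrives, any queue ID shown in that dialogue is something concrete to hand to Comcast support.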

    Read the article

  • .htaccess template, suggestions needed

    - by purpler
    # Defaults
    AddDefaultCharset UTF-8
    DefaultLanguage en-US
    FileETag None
    Header unset ETag
    ServerSignature Off
    SetEnv TZ Europe/Belgrade

    # Rewrites
    Options +FollowSymLinks
    RewriteEngine On
    RewriteBase /

    # Redirect to WWW
    RewriteCond %{HTTP_HOST} ^serpentineseo.com
    RewriteRule (.*) http://www.serpentineseo.com/$1 [R=301,L]

    # Redirect index to root
    RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*index\.html\ HTTP/
    RewriteRule ^(.*)index\.html$ /$1 [R=301,L]

    # Cache media files:
    ExpiresActive On
    ExpiresDefault A0

    # Month
    <FilesMatch "\.(gif|jpg|jpeg|png|ico|swf|js)$">
        Header set Cache-Control "max-age=2592000, public"
    </FilesMatch>

    # Week
    <FilesMatch "\.(css|pdf)$">
        Header set Cache-Control "max-age=604800"
    </FilesMatch>

    # 10 Min
    <FilesMatch "\.(html|htm|txt)$">
        Header set Cache-Control "max-age=600"
    </FilesMatch>

    # Do not cache
    <FilesMatch "\.(pl|php|cgi|spl|scgi|fcgi)$">
        Header unset Cache-Control
    </FilesMatch>

    # Compress output
    <IfModule mod_deflate.c>
        <FilesMatch "\.(html|js|css)$">
            SetOutputFilter DEFLATE
        </FilesMatch>
    </IfModule>

    # Error Documents
    ErrorDocument 206 /error/206.html
    ErrorDocument 401 /error/401.html
    ErrorDocument 403 /error/403.html
    ErrorDocument 404 /error/404.html
    ErrorDocument 500 /error/500.html

    # Prevent hotlinking
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^http://(www\.)?serpentineseo.com/.*$ [NC]
    RewriteRule \.(gif|jpg|png)$ http://www.serpentineseo.com/images/angryman.png [R,L]

    # Prevent offline browsers
    RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:[email protected] [OR]
    RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
    RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
    RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
    RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
    RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
    RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
    RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
    RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
    RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
    RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
    RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
    RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
    RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
    RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
    RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
    RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
    RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
    RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
    RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
    RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
    RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
    RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
    RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
    RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
    RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
    RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
    RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
    RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
    RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
    RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Zeus
    RewriteRule ^.*$ http://www.google.com [R,L]

    # Protect against DOS attacks by limiting file upload size
    LimitRequestBody 10240000

    # Deny access to sensitive files
    <FilesMatch "\.(htaccess|psd|log)$">
        Order Allow,Deny
        Deny from all
    </FilesMatch>

    Read the article

  • Apache/2.2.20 (Ubuntu 11.10) gzip compression won't work on php pages, content is chunked

    - by FamousInteractive
    I'm running into a problem with a new production server to which I'm transferring projects. The HTML output of the PHP applications isn't compressed by the Apache mod_deflate module. Other resources, such as stylesheet and javascript files, and even html pages served with the same Content-Type (text/html) as the PHP output, are compressed! The projects use the following rules (from HTML5 Boilerplate) in the .htaccess:

        <IfModule mod_deflate.c>
            # Force deflate for mangled headers
            # developer.yahoo.com/blogs/ydn/posts/2010/12/pushing-beyond-gzipping/
            <IfModule mod_setenvif.c>
                <IfModule mod_headers.c>
                    SetEnvIfNoCase ^(Accept-EncodXng|X-cept-Encoding|X{15}|~{15}|-{15})$ ^((gzip|deflate)\s*,?\s*)+|[X~-]{4,13}$ HAVE_Accept-Encoding
                    RequestHeader append Accept-Encoding "gzip,deflate" env=HAVE_Accept-Encoding
                </IfModule>
            </IfModule>

            # HTML, TXT, CSS, JavaScript, JSON, XML, HTC:
            <IfModule filter_module>
                FilterDeclare COMPRESS
                FilterProvider COMPRESS DEFLATE resp=Content-Type $text/html
                FilterProvider COMPRESS DEFLATE resp=Content-Type $text/css
                FilterProvider COMPRESS DEFLATE resp=Content-Type $text/plain
                FilterProvider COMPRESS DEFLATE resp=Content-Type $text/xml
                FilterProvider COMPRESS DEFLATE resp=Content-Type $text/x-component
                FilterProvider COMPRESS DEFLATE resp=Content-Type $application/javascript
                FilterProvider COMPRESS DEFLATE resp=Content-Type $application/json
                FilterProvider COMPRESS DEFLATE resp=Content-Type $application/xml
                FilterProvider COMPRESS DEFLATE resp=Content-Type $application/xhtml+xml
                FilterProvider COMPRESS DEFLATE resp=Content-Type $application/rss+xml
                FilterProvider COMPRESS DEFLATE resp=Content-Type $application/atom+xml
                FilterProvider COMPRESS DEFLATE resp=Content-Type $application/vnd.ms-fontobject
                FilterProvider COMPRESS DEFLATE resp=Content-Type $image/svg+xml
                FilterProvider COMPRESS DEFLATE resp=Content-Type $image/x-icon
                FilterProvider COMPRESS DEFLATE resp=Content-Type $application/x-font-ttf
                FilterProvider COMPRESS DEFLATE resp=Content-Type $font/opentype
                FilterChain COMPRESS
                FilterProtocol COMPRESS DEFLATE change=yes;byteranges=no
            </IfModule>
        </IfModule>

    We have a testing machine that runs the same Apache, OS and PHP versions. On that machine the compression works just fine on the PHP output. I've checked and compared Apache and PHP config files; they are all the same as far as I can tell. I've tried several manners of outputting the content of the PHP, using output buffering or just plain echoing the content. Same thing: no compression. Example response headers of a PHP output:

        HTTP/1.1 200 OK
        Date: Wed, 25 Apr 2012 23:30:59 GMT
        Server: Apache
        Accept-Ranges: bytes
        Expires: Thu, 19 Nov 1981 08:52:00 GMT
        Cache-Control: public
        Pragma: no-cache
        Vary: User-Agent
        Keep-Alive: timeout=5, max=98
        Connection: Keep-Alive
        Transfer-Encoding: chunked
        Content-Type: text/html; charset=utf-8

    Example of response headers on a css file:

        HTTP/1.1 200 OK
        Date: Wed, 25 Apr 2012 23:30:59 GMT
        Server: Apache
        Last-Modified: Mon, 04 Jul 2011 19:12:36 GMT
        Vary: Accept-Encoding,User-Agent
        Content-Encoding: gzip
        Cache-Control: public
        Expires: Fri, 25 May 2012 23:30:59 GMT
        Content-Length: 714
        Keep-Alive: timeout=5, max=100
        Connection: Keep-Alive
        Content-Type: text/css; charset=utf-8

    Does anyone have a clue, or has anyone experienced the same "problem"? Thanks!
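    It can help to rule out the client side by requesting the same PHP page with an explicit Accept-Encoding header and inspecting what comes back; a minimal sketch using only the Python standard library (the URL is a placeholder):

        import urllib.request

        req = urllib.request.Request(
            "http://localhost/index.php",
            headers={"Accept-Encoding": "gzip,deflate"},
        )
        with urllib.request.urlopen(req) as resp:
            print("Content-Encoding :", resp.headers.get("Content-Encoding"))
            print("Transfer-Encoding:", resp.headers.get("Transfer-Encoding"))
            print("Vary             :", resp.headers.get("Vary"))

    Note that the PHP response above says Vary: User-Agent while the css response says Vary: Accept-Encoding,User-Agent; that difference is itself a hint that a different filter path is handling the PHP requests than the one handling the static files.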

    Read the article

  • GMail detecting mail as spam

    - by Petru Toader
    I've been trying for a long time to get our company's mail server to send mail that will get accepted by the GMail spam filter. I have managed to make it work for Yahoo Mail and Hotmail; sadly, GMail is still marking our mails as spam. I have configured DKIM, SPF and DMARC, and verified our mail server IP address against blacklists. I have also pasted here the headers GMail gets when we send a mail:

        Delivered-To: [email protected]
        Received: by 10.42.215.6 with SMTP id hc6csp107427icb; Wed, 20 Aug 2014 07:34:26 -0700 (PDT)
        X-Received: by 10.194.100.34 with SMTP id ev2mr59101019wjb.76.1408545265402; Wed, 20 Aug 2014 07:34:25 -0700 (PDT)
        Return-Path: <[email protected]>
        Received: from mail.phyramid.com (mail.phyramid.com. [178.157.82.23])
            by mx.google.com with ESMTPS id dj10si4827754wib.79.2014.08.20.07.34.24
            for <[email protected]> (version=TLSv1.1 cipher=ECDHE-RSA-RC4-SHA bits=128/128);
            Wed, 20 Aug 2014 07:34:25 -0700 (PDT)
        Received-SPF: pass (google.com: domain of [email protected] designates 178.157.82.23 as permitted sender) client-ip=178.157.82.23;
        Authentication-Results: mx.google.com;
            spf=pass (google.com: domain of [email protected] designates 178.157.82.23 as permitted sender) [email protected];
            dkim=pass [email protected]
        Received: from localhost (localhost [127.0.0.1])
            by mail.phyramid.com (Postfix) with ESMTP id ED2BB2017AC
            for <[email protected]>; Wed, 20 Aug 2014 17:33:23 +0300 (EEST)
        DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/simple; d=phyramid.com;
            h=content-type:content-type:mime-version:x-mailer:subject:subject:message-id:to:from:from:date:date;
            s=dkim; t=1408545197; x=1409409197;
            bh=e04RtoyF7G39lfCvA9LLhTz4nF64siZtN5IYmC18Xsc=;
            b=o+6mO8Uz4Uf1G4U2q6tKUiEy2N2n/5R2VtPPwIvBE5xzK/hEd2sDGMxVzQVgIDCsKQ0Xh+auPaQpxldQ+AEcL2XSZMrk/g0mJONjkpI19I5AwGIJCR1SVvxdecohTn9iRbCHzrGi2wAicfDBzOH6lUBNfh2thri79aubdCYc97U=
        X-Amavis-Modified: Mail body modified (using disclaimer) - mail.phyramid.com
        X-Virus-Scanned: Debian amavisd-new at mail.phyramid.com
        Received: from mail.phyramid.com ([127.0.0.1])
            by localhost (mail.phyramid.com [127.0.0.1]) (amavisd-new, port 10024)
            with ESMTP id 3JcgXZAXeFtX for <[email protected]>;
            Wed, 20 Aug 2014 17:33:17 +0300 (EEST)
        Received: from whiterock.local (unknown [109.98.21.30])
            by mail.phyramid.com (Postfix) with ESMTPSA id 05CAE200280
            for <[email protected]>; Wed, 20 Aug 2014 17:33:15 +0300 (EEST)
        Date: Wed, 20 Aug 2014 17:34:15 +0300
        From: Company Mail <[email protected]>
        To: [email protected]
        Message-ID: <[email protected]>
        Subject: hey there!
        X-Mailer: Airmail (247)
        MIME-Version: 1.0
        Content-Type: text/plain; charset="utf-8"
        Content-Transfer-Encoding: 7bit
        Content-Disposition: inline

        How was your summer?
        ----
        Thanks a lot!
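    For completeness, the published DNS policy records can be checked from any machine; a minimal sketch, again assuming the third-party dnspython package, using the selector s=dkim that is visible in the signature above:

        import dns.resolver

        for name in ("phyramid.com",                   # SPF lives in the domain's TXT record
                     "dkim._domainkey.phyramid.com",   # DKIM public key for selector "dkim"
                     "_dmarc.phyramid.com"):           # DMARC policy
            try:
                for rr in dns.resolver.resolve(name, "TXT"):
                    print(name, "->", b"".join(rr.strings).decode())
            except Exception as exc:
                print(name, "->", exc)

    Since SPF and DKIM both already show pass in Gmail's Authentication-Results header, the remaining levers tend to be content and sender reputation (IP and domain history, volume, user engagement) rather than anything in the headers themselves.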

    Read the article

  • Internet Pings but Does Not Load

    - by t3techcom18
    From what I've been seeing and the research I've done over the past two days, many people have had the same issue throughout the years; however, this is the first time I've encountered it, and many of the specific workarounds or fixes have not worked for me. I've been trying to work through this for 24 hours straight now, but to no avail, so many thanks to those who can help.

    On Monday night I got home from work and surfed the internet for half an hour; everything was fine as always. Just after half an hour, my Internet got very sluggish and then it died completely. I thought it might have been an update I had just put through, which Windows Update said was a critical update for MSE, as the same thing happened a few years ago. Here is what I have tried: a System Restore to two different dates in the past two weeks: nothing. Uninstalled MSE and disabled Windows Defender and the Windows Firewall: nothing. Reset IE options, reset Winsock, flushed DNS, and ran many of the other command prompt resets: nothing. Reset the modem: nothing.

    What DID work, however, was a ping test to Yahoo. The ping test worked, saying all four packets were received, yet nothing else loaded. LAN and CenturyLink said everything worked on their end and that everything was connected properly, and that the speeds were fine. CenturyLink said in their notes that they thought port 80 was blocked. I went and added a firewall rule to allow port 80, but it didn't make any difference whatsoever. I remembered I had a spare modem laying around and I switched them up, both modem and cords: nothing. I then hooked it up to my netbook to see if that would work, as it usually does: the connection didn't work there either.

    Like I said, it's been about 24 hours now and this is increasingly frustrating, as I've tried all suggested solutions (while browsing through 10 pages of search results on my phone) and still nothing. Any suggestions and tricks would be greatly appreciated! Here are my specs:

    Windows 7 32-bit Home Premium
    Intel Core 2 Duo 3.14 GHz
    4 GB Kingston DDR2 RAM
    eVGA nForce 750i SLI
    eVGA GeForce GTX 560 Ti FPB
    ISP: CenturyLink
    No router
    Modem: CenturyLink 660 Series
    Hardwired connection

    PLEASE NOTE: This is the only computer I have (like I said, the netbook solution didn't work), so downloading programs and such is not an option until I get to other computers somewhere else, like right now. Unless someone knows of a way of copying/pasting a file in Windows and then transferring said info to an Android smartphone, this is gonna take a while haha. Patience is requested.
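    When a box pings but won't browse, it helps to test the three layers separately: name resolution, a raw TCP connection to port 80, and a full HTTP request. A minimal sketch (Python standard library only) that could be run on any machine behind the same modem for comparison:

        import socket
        import urllib.request

        host = "www.yahoo.com"   # same target as the ping test above
        try:
            ip = socket.gethostbyname(host)
            print("DNS ok:", ip)
            with socket.create_connection((ip, 80), timeout=10):
                print("TCP 80 ok")
            with urllib.request.urlopen("http://" + host, timeout=10) as resp:
                print("HTTP ok:", resp.status)
        except OSError as exc:
            print("failed at this layer:", exc)

    Ping succeeding while one of these steps fails would show exactly which layer (DNS, TCP, or HTTP) is being dropped, which fits the port-80 theory in CenturyLink's notes.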

    Read the article

  • Consume WCF Service InProcess using Agatha and WCF

    - by REA_ANDREW
    I have been looking into this lately for a specific reason. For some integration tests I want to write, I want to control the types of instances which are used inside the service layer, but I want that control from the test class instance. One of the problems with just referencing the service is that a lot of the time this will, by default, be done inside a different process. I am using StructureMap as my DI of choice, and one of the tools which I am using in line with RhinoMocks is StructureMap.AutoMocking. With StructureMap the main entry point is the ObjectFactory. This is process specific, so if I decide that I want a certain instance of a type to be used inside the service layer, I cannot configure the ObjectFactory from my test class, as that will only apply to the process which it belongs to.

    This is where I started thinking about two things:

    1. Running a WCF service in process
    2. Being able to share mocked instances across processes

    A colleague at work pointed me to a project for the latter, but I thought that it would be a better solution if I could run the WCF service in process. One of the projects which I use when I think about WCF services is AGATHA, and it is the one which I have used to try and get my head around doing this.

    Another asset I have is a book called Programming WCF Services by Juval Lowy; if you have not heard of it or read it, I would definitely recommend it. One of the many topics inside this book is the type of configuration you need to communicate with a service in the same process, and it turns out to be quite simple from a config point of view:

        <system.serviceModel>
          <services>
            <service name="Agatha.ServiceLayer.WCF.WcfRequestProcessor">
              <endpoint address="net.pipe://localhost/MyPipe"
                        binding="netNamedPipeBinding"
                        contract="Agatha.Common.WCF.IWcfRequestProcessor"/>
            </service>
          </services>
          <client>
            <endpoint name="MyEndpoint"
                      address="net.pipe://localhost/MyPipe"
                      binding="netNamedPipeBinding"
                      contract="Agatha.Common.WCF.IWcfRequestProcessor"/>
          </client>
        </system.serviceModel>

    You can see here that I am referencing the Agatha object and contract, but also that my binding and address use named pipes. This is sort of the "magic" which makes it happen in the same process.

    Next I need to open the service prior to calling the methods on a proxy, which I also need. My initial attempt at the proxy did not use any Agatha-specific coding, and one of the pains I found was that you obviously need to give your proxy the known types which the serializer can be aware of. So we need to add to the known types of the proxy programmatically. I came across the following blog post which showed me how easy it was: http://bloggingabout.net/blogs/vagif/archive/2009/05/18/how-to-programmatically-define-known-types-in-wcf.aspx

    First Pass

    So with this in mind, and inside a console app, this was my first pass at consuming a service in process. First, here is the proxy which I made, making use of the Agatha IWcfRequestProcessor contract:

        public class InProcProxy : ClientBase<Agatha.Common.WCF.IWcfRequestProcessor>, Agatha.Common.WCF.IWcfRequestProcessor
        {
            public InProcProxy() { }

            public InProcProxy(string configurationName) : base(configurationName) { }

            public Agatha.Common.Response[] Process(params Agatha.Common.Request[] requests)
            {
                return Channel.Process(requests);
            }

            public void ProcessOneWayRequests(params Agatha.Common.OneWayRequest[] requests)
            {
                Channel.ProcessOneWayRequests(requests);
            }
        }

    So with the proxy in place, I could then use it after opening the service. Here is the code which I use inside the console app to make the request:

        static void Main(string[] args)
        {
            ComponentRegistration.Register();
            ServiceHost serviceHost = new ServiceHost(typeof(Agatha.ServiceLayer.WCF.WcfRequestProcessor));
            serviceHost.Open();
            Console.WriteLine("Service is running....");

            using (var proxy = new InProcProxy())
            {
                // Register Agatha's known types with each contract operation.
                foreach (var operation in proxy.Endpoint.Contract.Operations)
                {
                    foreach (var t in KnownTypeProvider.GetKnownTypes(null))
                    {
                        operation.KnownTypes.Add(t);
                    }
                }

                var request = new GetProductsRequest();
                var responses = proxy.Process(new[] { request });
                var response = (GetProductsResponse)responses[0];
                Console.WriteLine("{0} Products have been retrieved", response.Products.Count);
            }

            serviceHost.Close();
            Console.WriteLine("Finished");
            Console.ReadLine();
        }

    So what I used here is the KnownTypeProvider of Agatha to easily get all the types I need for the service/proxy and add them to the proxy. My request handler for this was just a test one which always returns 2 products:

        public class GetProductsHandler : RequestHandler<GetProductsRequest, GetProductsResponse>
        {
            public override Agatha.Common.Response Handle(GetProductsRequest request)
            {
                return new GetProductsResponse
                {
                    Products = new List<ProductDto>
                    {
                        new ProductDto {},
                        new ProductDto {}
                    }
                };
            }
        }

    Second Pass

    Now, after I did this, I started reading up on some more resources, including more by Davy Brion and others on Agatha. It turns out that the work I did above to create a derived class of ClientBase implementing Agatha.Common.WCF.IWcfRequestProcessor was not necessary, thanks to a nice class which is present inside the Agatha code base, RequestProcessorProxy, which takes care of this for you! :-) So, disregarding the class I made for the proxy and changing my code to use it, I am now left with the following:

        static void Main(string[] args)
        {
            ComponentRegistration.Register();
            ServiceHost serviceHost = new ServiceHost(typeof(Agatha.ServiceLayer.WCF.WcfRequestProcessor));
            serviceHost.Open();
            Console.WriteLine("Service is running....");

            using (var proxy = new RequestProcessorProxy())
            {
                var request = new GetProductsRequest();
                var responses = proxy.Process(new[] { request });
                var response = (GetProductsResponse)responses[0];
                Console.WriteLine("{0} Products have been retrieved", response.Products.Count);
            }

            serviceHost.Close();
            Console.WriteLine("Finished");
            Console.ReadLine();
        }

    Cheers for now,
    Andy

    References:
    Agatha
    WCF InProcess Without WCF
    StructureMap.AutoMocking
    Cross Process Mocking
    Agatha
    Programming WCF Services by Juval Lowy

    Read the article

  • Compiling examples for consuming the REST Endpoints for WCF Service using Agatha

    - by REA_ANDREW
    I recently made two contributions to the Agatha project by Davy Brion over on Google Code, and one of the things I wanted to follow up with was a post showing examples and some seemingly required tidbits. The contributions which I made were:

    1. To support StructureMap
    2. To include REST (JSON and XML) support for the service contract

    I want to format the examples which I have made so that they fit in with the current format of examples over on Agatha, and hopefully create and submit a third patch which will include these examples to help others who wish to use these additions. Whilst building these examples for both XML and JSON, I have learnt a couple of things which I feel are not really well documented, but are extremely good practice and, once known, make perfect sense. I have chosen a real basic e-commerce context for my example Requests and Responses, and have also made use of the excellent tool AutoMapper, again on Google Code.

    Setting the scene

    I have followed the Pipes and Filters pattern with the IQueryable interface on my repository and exposed the following methods to query Products:

        IQueryable<Product> GetProducts();
        IQueryable<Product> ByCategoryName(this IQueryable<Product> products, string categoryName);
        Product ByProductCode(this IQueryable<Product> products, String productCode);

    I have an interface for the IProductRepository, but for the concrete implementation I have simply created a protected getter which populates a private List<Product> with 100 test products with random data. Another good reason for following an interface-based approach is that it will demonstrate usage of my first contribution, the StructureMap support. Finally, the two domain objects I have made are Product and Category, as shown below:

        public class Product
        {
            public String ProductCode { get; set; }
            public String Name { get; set; }
            public Decimal Price { get; set; }
            public Decimal Rrp { get; set; }
            public Category Category { get; set; }
        }

        public class Category
        {
            public String Name { get; set; }
        }

    Requirements for the REST support

    One of the things which you will notice with Agatha is that you do not have to decorate your Request and Response objects with the WCF service model attributes like DataContract, DataMember etc. Unfortunately, from what I have seen, these are required if you want the same types to work with your REST endpoint. I have not tried, but I assume the same result can be achieved by simply decorating the same classes with the Serializable attribute. Without this, the operation will fail. Another surprising thing I have found is that it did not work until I used the following attribute parameters: Name and Namespace, e.g.

        [DataContract(Name = "GetProductsRequest", Namespace = "AgathaRestExample.Service.Requests")]
        public class GetProductsRequest : Request
        {
        }

    Although I was surprised by this, things kind of explained themselves when I got round to figuring out the exact construct required for both the XML and the JSON. One of the things which you already know, and are then reminded of, is that each of your Requests and Responses ultimately inherits from an abstract base class. This information needs to be represented in a way native to the format being used. I have seen this in XML, but I had not seen the format which is required for the JSON.

    JSON Consumer Example

    I have used jQuery to create the example, and I simply want to make two requests to the server which, as you will know with Agatha, are transmitted inside an array to reduce the service calls. I have also used a tool called json2, which is again over at Google Code, simply to convert my JSON expression into its string format for transmission. You will notice that I specify the type of Request I am using and the relevant namespace it belongs to. Also notice that the second request has a parameter, so each of these two objects represents an abstract Request and the parameters of the object describe it.

        <script type="text/javascript">
            var bodyContent = $.ajax({
                url: "http://localhost:50348/service.svc/json/processjsonrequests",
                global: false,
                contentType: "application/json; charset=utf-8",
                type: "POST",
                processData: true,
                data: JSON.stringify([
                    { __type: "GetProductsRequest:AgathaRestExample.Service.Requests" },
                    { __type: "GetProductsByCategoryRequest:AgathaRestExample.Service.Requests", CategoryName: "Category1" }
                ]),
                dataType: "json",
                success: function(msg) {
                    alert(msg);
                }
            }).responseText;
        </script>

    XML Consumer Example

    For the XML consumer example I have chosen to use a simple console application and make a WebRequest to the service using the XML as a request. I have made a crude static method which simply reads from an XML file, replaces some values with a parameter and returns the formatted XML. I say crude, but it simply shows how XML templates for each type of Request could be made, and then have a wrapper utility in whatever language you use to combine the requests which are required. The following XML is the same Request array as shown above, but simply in the XML format.

        <?xml version="1.0" encoding="utf-8" ?>
        <ArrayOfRequest xmlns="http://schemas.datacontract.org/2004/07/Agatha.Common"
                        xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
          <Request i:type="a:GetProductsRequest" xmlns:a="AgathaRestExample.Service.Requests"/>
          <Request i:type="a:GetProductsByCategoryRequest" xmlns:a="AgathaRestExample.Service.Requests">
            <a:CategoryName>{CategoryName}</a:CategoryName>
          </Request>
        </ArrayOfRequest>

    It is funny, because I remember submitting a question to StackOverflow asking whether there was a REST client generation tool similar to what Microsoft used for their RestStarterKit, but which could be applied to existing services which have REST endpoints attached. I could not find any, but this is now definitely something which I am going to build, as I think it is extremely useful to have, and it should not be too difficult based on the information I now know about the above. Finally, I thought that the Strategy pattern would lend itself really well to this type of thing, so it can accommodate different languages.

    I think that is about it. I have included the code for the example console app which I made below, in case anyone wants to have a mooch at the code. As I said above, I want to reformat these to fit in with the current examples over on the Agatha project, but also, now thinking about it, make a Documentation web method... {brain ticking} :-)

    Cheers for now, and here is the final bit of code:

        static void Main(string[] args)
        {
            var request = WebRequest.Create("http://localhost:50348/service.svc/xml/processxmlrequests");
            request.Method = "POST";
            request.ContentType = "text/xml";

            using (var writer = new StreamWriter(request.GetRequestStream()))
            {
                writer.WriteLine(GetExampleRequestsString("Category1"));
            }

            var response = request.GetResponse();
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                Console.WriteLine(reader.ReadToEnd());
            }

            Console.ReadLine();
        }

        static string GetExampleRequestsString(string categoryName)
        {
            var data = File.ReadAllText(Path.Combine(
                Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location),
                "ExampleRequests.xml"));
            data = data.Replace("{CategoryName}", categoryName);
            return data;
        }

    Read the article

  • MSCC: Scripting - Administrator's toolbox of magic...

    Finally, we made it to have our April meetup - in May. The most obvious explanation is the increased amount of open source and IT activities that either the MSCC, the Linux User Group of Mauritius (LUGM), or the University of Mauritius Student's Computer Club is organising. It's absolutely incredible to see the recent hype of events here on the island. And I'm loving it!

    Unfortunately, we also had to deal with arranging for a location this time. It was kind of an odyssey, as my requests (and phone calls) weren't answered even though I tried several times - well, kind of disappointing, and I have to look into that for future gatherings. In my opinion, it is essential that two parameters of a community meeting are fixed as early as possible: the location, and the date and time. You can't just change one or both at the very last minute. Well, this time we had to do it due to unforeseen reasons, and I apologise to any MSCC member who couldn't make it to our April meetup. Okay, lesson learned, but now back to the actual meetup report...

    Shortly after the meeting I placed the following statement as my first impression: "Spontaneous and improvised :) No, seriously, Ish and Dan had well prepared presentations on shell scripting, mainly focused towards Bourne Again Shell (bash), and the pros and cons of scripting versus actually writing something in a decent programming language. I thought that I could cut myself out of the equation but the demand for information about PowerShell was higher than expected..."

    Well, it turned out that the interest in Windows PowerShell was high, as I even got a couple of questions on it via social media networks during the evening. I'd also like to mention that the number of attendees went back to what I would call a "standard" level of participation. This time there were 12 craftsmen, but again a good number of first-timers.

    Reactions of other attendees

    Here are some impressions and feedback from our participants:

    "Enjoyed the bash and powershell (linux / windows) presentations ..." -- Nadim on event comments

    "He [Daniel] also showed us some syntax loopholes in Bash that could leave someone with bad code." -- Ish on MSCC - Let's talk about Scripting

    Glad to see a couple of first time attendees, especially students from the university itself.

    Some details on the presentations

    MSCC: First time visit at the University of Mauritius - Phase II Engineering Tower, room 2.9

    Gimme some love ... bash and other shells

    Ish gave a great introduction to shell scripting as he spoke about existing shell environments and a little bit about their history. Furthermore, he talked about various built-in commands, the use of coreutils, the ability to daisy-chain multiple commands using pipes, and the importance of the standard I/O streams and their file descriptors in advanced scripting techniques. Combined with a couple of sample statements in the Linux terminal on an Ubuntu 14.04 machine, it was a solid presentation. Have a closer look at his slides, published on his blog under MSCC - Let's talk about Scripting.

    Oddities of scripting

    After the brief introduction to bash, it was Daniel's turn to highlight a good number of oddities when working with shell scripts. First of all, it should be clear that scripting is not meant for implementing software as such, but simply for automating administrative procedures and simplifying routine jobs on a system. One of the cool oddities that he mentioned is that everything (!) in a shell is represented by strings; there are no other types like integer, float, or date-time that you'd use in a full-fledged programming language. Let's have a look at his sample: more to come... What's the output?

    As a conclusion, Daniel suggests that shell scripting should be limited, but not restricted, to automating repetitive command stacks and batch jobs, startup wrappers for applications in order to set up the execution environment, and other not-too-sophisticated jobs. But as soon as it might involve a little bit more logic, or you might rely on performance, it's better to write an application in Ruby, Python, or Perl (among others, of course). This also enables the possibility to test your code properly.

    MSCC: Ish talking about Bourne Again Shell (bash) and shell scripting to automate regular tasks
    MSCC: Daniel gives an overview about the pros and cons of shell scripting versus programming
    MSCC: PowerShell as your scripting solution on Windows operating systems

    The path of the Enlightened is long ... and tough.

    Honestly, even though PowerShell was mentioned without any further details on the meetup's agenda, I didn't expect that there would be demand for a presentation on Microsoft PowerShell after all. I had already taken this topic out of the announcement, but the audience wanted to have some information. Okay, then let's see what I could do - improvised style. While my machine booted and got hooked up to the projector, I started to talk about the beginnings of PowerShell, from back in 2006, and its predecessors MS-DOS and the Command Prompt. A throwback in history... always good for young people.

    As usual, Microsoft didn't get it at that time. Instead of listening to their clients' needs and demands, they ignored the feasibility of administrating Windows server farms without any UI tools. PowerShell is actually a result of this: seeing that shell scripting has been a common, reliable and fast part of an administrator's toolbox for decades, Microsoft had to adapt from their Microsoft Management Console (MMC) to a broader approach. It's not like shell scripting was something new; it is in daily use by alternative operating systems like AIX, HP-UX, Solaris, and last but not least Linux.

    Most interestingly, Microsoft is very good at renovating existing architectures, and over the years PowerShell not only replaced their own combination of Command Prompt and scripting hosts (VBScript and CScript) but really turned into a challenging competitor on the market. The shell is easy to extend with cmdlets, and open to other Microsoft products like SQL Server and SharePoint, as well as third-party software applications. Similar to MMC, PowerShell also offers the ability to administer other machines remotely - only without a graphical user interface, and therefore it's easier to automate and schedule regular tasks. Following is a sample of a PowerShell script file (extension .ps1):

        $strComputer = "."
        $colItems = get-wmiobject -class Win32_BIOS -namespace root\CIMV2 -comp $strComputer

        foreach ($objItem in $colItems) {
            write-host "BIOS Characteristics: " $objItem.BiosCharacteristics
            write-host "BIOS Version: " $objItem.BIOSVersion
            write-host "Build Number: " $objItem.BuildNumber
            write-host "Caption: " $objItem.Caption
            write-host "Code Set: " $objItem.CodeSet
            write-host "Current Language: " $objItem.CurrentLanguage
            write-host "Description: " $objItem.Description
            write-host "Identification Code: " $objItem.IdentificationCode
            write-host "Installable Languages: " $objItem.InstallableLanguages
            write-host "Installation Date: " $objItem.InstallDate
            write-host "Language Edition: " $objItem.LanguageEdition
            write-host "List Of Languages: " $objItem.ListOfLanguages
            write-host "Manufacturer: " $objItem.Manufacturer
            write-host "Name: " $objItem.Name
            write-host "Other Target Operating System: " $objItem.OtherTargetOS
            write-host "Primary BIOS: " $objItem.PrimaryBIOS
            write-host "Release Date: " $objItem.ReleaseDate
            write-host "Serial Number: " $objItem.SerialNumber
            write-host "SMBIOS BIOS Version: " $objItem.SMBIOSBIOSVersion
            write-host "SMBIOS Major Version: " $objItem.SMBIOSMajorVersion
            write-host "SMBIOS Minor Version: " $objItem.SMBIOSMinorVersion
            write-host "SMBIOS Present: " $objItem.SMBIOSPresent
            write-host "Software Element ID: " $objItem.SoftwareElementID
            write-host "Software Element State: " $objItem.SoftwareElementState
            write-host "Status: " $objItem.Status
            write-host "Target Operating System: " $objItem.TargetOperatingSystem
            write-host "Version: " $objItem.Version
            write-host
        }

    This gives you information about your BIOS and Windows OS. Then change the computer name to another one on your network (NetBIOS-based) and run the script again. There are lots of samples and tutorials at the Microsoft Script Center, and I would advise you to pay a visit over there if you are more interested in PowerShell. The Script Center provides the download links, too.

    Upcoming Events

    What are the upcoming events here in Mauritius? So far, we have the following ones (incomplete list as usual) in chronological order:

    Hacking Defence (14. May 2014)
    WebCup Maurice (7. & 8. June 2014)
    Developers Conference (TBA ~ July 2014)
    Linuxfest 2014 (TBA ~ November 2014)

    Hopefully, there will be more announcements during the next couple of weeks and months. If you know about any other event, like a bootcamp, a code challenge or a hackathon here in Mauritius, please drop me a note in the comment section below this article. Thanks!

    My resume of the day

    Spontaneous and improvised :) The new location at the University of Mauritius turned out very well; there is plenty of space, and it could be a good choice for future meetings. Especially, having the ability to get more and more students into our IT community sounds like a great opportunity. Later during the day, I got some promising mails from Nadim regarding future sessions at the local branch of the Middlesex University. Well, we will see in the future... But for now this will be on hold until approximately October, when students resume their regular studies. Anyway, it was a good experience at the university, and thanks again to the UoM Student's Computer Club that made the necessary arrangements for the MSCC!

    Read the article

  • Struts 1 ActionForm - retrieving a collection from pure HTML

    - by Yaneeve
Hi all, I have (just like the rest) inherited some Struts 1 code. I have needed to add a few more pages to this project. What I cannot figure out is how to map several distinct but similarly natured input elements to my ActionForm. Let me elaborate. I create a new <input> element dynamically as the user enters more and more items (I use the YUI autocomplete form element, and for each entered input I add it as an input element to my form and draw a new YUI autocomplete - complex sounding, I know). So... my form looks a bit like this (after some prettifying and such):
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd"> <html> <head> <meta http-equiv="content-type" content="text/html; charset=utf-8"> <title>My Cool App - Test Case Builder</title> <link rel="stylesheet" type="text/css" href="../script/yui/fonts/fonts-min.css" /> <link rel="stylesheet" type="text/css" href="../skins/myCoolApp/button/button.css" /> <link rel="stylesheet" type="text/css" href="../script/yui/autocomplete/assets/skins/sam/autocomplete.css" /> <link rel="stylesheet" type="text/css" media="screen" href="../skins/myCoolApp/testcase.css" /> <!-- YUI JAVA SCRIPTS --> <script type="text/javascript" src="../script/yui/yahoo-dom-event/yahoo-dom-event.js"></script> <script type="text/javascript" src="../script/yui/element/element-min.js"></script> <script type="text/javascript" src="../script/yui/button/button-min.js"></script> <script type="text/javascript" src="../script/yui/datasource/datasource-min.js"></script> <script type="text/javascript" src="../script/yui/autocomplete/autocomplete-min.js"></script> <!-- APP JAVA SCRIPTS --> <script type="text/javascript" src="../script/myCoolApp/myCoolApp.js" ></script> <script type="text/javascript" src="../script/myCoolApp/stack.js" ></script> <script type="text/javascript" src="../script/myCoolApp/testcase/testcase.js"></script> <script type="text/javascript" src="../script/myCoolApp/testcase/default-data.js" ></script> <script type="text/javascript" src="../script/myCoolApp/testcase/data-structs.js" ></script> <script type="text/javascript" src="../script/myCoolApp/testcase/ui-elements.js" ></script> </head> <body class="cf010"> <div id="wrap"> <div id="header"> <div id="main-header"> COOL APP </div> </div> <div id="main-body"> <div id="content"> <div class="col main"> <div id="main"> <form method="post" id="testcaseForm" class="typea" action=""> <fieldset> <legend>Test Case Builder</legend> <div id="tk1" class="tabcontrol"> <ul class="tabs"> <li class="first active"> <a href="#"> <span>General</span> </a> </li> <li class="last"> <a href="#"> <span>Parameters</span> </a> </li> </ul> <div id="tab0" class="tc-panel"> <dl class="cls9"> <dt> <label for="scenario">Choose Scenario:</label> </dt> <dd> <input type="text" id="scenario" name="scenario" class="text" /> <span id="scenarioToggle"></span> <div class="auto-complete" id="scenarioContainer"></div> </dd> <dt> <label for="ruleID">Choose Rule ID:</label> </dt> <dd> <input type="text" id="ruleID" name="ruleID" class="text" /> <span id="ruleIDToggle"></span> <div class="auto-complete" id="ruleIDContainer"></div> </dd> <dt> <label for="Test Case Name" accesskey="t"><span class="accesskey">T</span>est Case Name:</label> </dt> <dd> <input type="text" id="testCaseName" name="testCaseName" class="text" /> </dd> </dl> </div> <div id="tab1" class="tc-panel hidden"> <div class="toolbar" id="action-bar"> <ul> <li class="first"> <a title="select all" href="#" id="btmSelectAll" class="button"> <span>select all</span> </a> </li> <li> <a title="remove row" href="#" id="btmRemove" class="button"> <span>remove row</span> </a> </li> <li> <a title="undo last" href="#" id="btmRollBack" class="button disabled"> <span>undo last</span> </a> </li> <li class="last"> <a title="accept row" href="#" id="btmAccept" class="button disabled"> <span>accept row</span> </a> </li> </ul> </div> <div id="param.list" class="gridclip"> <table id='param.list.tbl' class='grid modela' > <caption>Test Case Summary</caption> <col/><col/><col/> <thead> <tr> <th class='hl center first'> <input class='grid-select-all' type='checkbox' /> </th> <th scope='col'>Row</th> <th scope='col'>Parameter</th> <th scope='col' class='last'>Value</th> </tr> </thead> <tfoot> <tr> <th scope='row'>Total</th> <td colspan='3'>2 parameters as Test Case input</td> </tr> </tfoot> <tbody id='param.list.tbl.body'> <tr class='odd'> <td class='rowcheck center first'> <input value='param1###value1' id='cb1' name='SelectedRows' class='grid-select-row' type='checkbox'/> </td> <td class='id'>1</td> <td>param1</td> <td class='last'>value1</td> </tr> <tr class='even'> <td class='rowcheck center first'> <input value='param2###value2' id='cb2' name='SelectedRows' class='grid-select-row' type='checkbox'/> </td> <td class='id'>2</td> <td>param2</td> <td class='last'>value2</td> </tr> <tr class='odd'> <td class='rowcheck center first' /> <td class='id'><em>new</em></td> <td> <dl class='clsTable'> <dt> <input type='text' id='param' name='param' class='text paramInput' /> </dt> <dd> <span id='paramToggle' /> </dd> <div class='auto-complete' id='paramContainer' /> </dl> </td> <td class='last'> <dl class='clsTable'> <dt> <input type='text' id='value' name='value' class='text valueInput' /> </dt> </dl> </td> </tr> </tbody> </table> </div> </div> </div> <!-- tabcontrol --> </fieldset> <div class="submit-box"> <input type="submit" name="formRun" id="formRun" class="form-save" value="Execute" accesskey="x" title="Run: Press Alt + [Shift] + x" /> <input type="submit" name="formSave" id="formSave" value="Save" accesskey="s" title="Save: Press Alt + [Shift] + s" /> <input type="submit" name="formLoad" id="formLoad" value="Load" accesskey="l" title="Load: Press Alt + [Shift] + l" /> <input type="submit" name="formCancel" id="formCancel" class="form-cancel" value="Cancel" accesskey="c" title="Cancel: Press Alt + [Shift] + c" /> </div> </form> </div> </div> </div> </div> </div> </body> </html>
As you can see, the following is pretty much a duplicate: <tr class='odd'> <td class='rowcheck center first'> <input value='param1###value1' id='cb1' name='SelectedRows' class='grid-select-row' type='checkbox'/> </td> <td class='id'>1</td> <td>param1</td> <td class='last'>value1</td> </tr> <tr class='even'> <td class='rowcheck center first'> <input value='param2###value2' id='cb2' name='SelectedRows' class='grid-select-row' type='checkbox'/> </td> <td class='id'>2</td> <td>param2</td> <td class='last'>value2</td> </tr>
The relevant part of my struts-config.xml file is: <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE struts-config PUBLIC "-//Apache Software Foundation//DTD Struts Configuration 1.2//EN" "http://struts.apache.org/dtds/struts-config_1_2.dtd"> <struts-config> <data-sources /> <form-beans> <form-bean name="TestCaseForm" type="com.blahblah.mycoolapp.forms.TestCaseForm" /> </form-beans> <action-mappings> <action path="/pages/SaveTestCase" name="TestCaseForm" type="org.springframework.web.struts.DelegatingActionProxy" scope="request"> </action> </action-mappings> <message-resources parameter="MessageResources" /> </struts-config>
I also use Spring 2.5.6 (the relevant part being): <?xml version="1.0" encoding="UTF-8"?> <beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-2.5.xsd"> <bean name="/pages/SaveTestCase" class="com.blahblah.mycoolapp.actions.TestCaseBuilderSaveAction" /> </beans>
My Java ActionForm class (from what I had learned off the net) is: package com.blahblah.mycoolapp.forms; import java.util.ArrayList; import java.util.List; import org.apache.struts.action.ActionForm; public class TestCaseForm extends ActionForm { private static final long serialVersionUID = 2352146257739099766L; private String scenario; private String ruleID; private String testCaseName; private List<String> SelectedRows = new ArrayList<String>(); public String getScenario() { return scenario; } public void setScenario(String scenario) { this.scenario = scenario; } public String getRuleID() { return ruleID; } public void setRuleID(String ruleID) { this.ruleID = ruleID; } public String getTestCaseName() { return testCaseName; } public void setTestCaseName(String testCaseName) { this.testCaseName = testCaseName; } public List<String> getSelectedRows() { return SelectedRows; } public void setSelectedRows(int index, String value) { this.SelectedRows.add(value); } } The question is: why do I get an empty SelectedRows in my TestCaseBuilderSaveAction? Thanks to all who have the patience to read such a long question... and (hopefully) thanks to all you potential saviors :)
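A likely culprit, for anyone hitting the same wall: Struts 1 populates forms through Commons BeanUtils, which matches request parameter names to bean property names case-sensitively, so name='SelectedRows' never matches the property 'selectedRows' derived from getSelectedRows(); and for several checkboxes sharing one name it looks for a setter that accepts the whole String[], never calling an indexed setSelectedRows(int, String). A minimal sketch of a property shape Struts can populate - illustrative only, under those assumptions, not the asker's code:

package com.blahblah.mycoolapp.forms;

import org.apache.struts.action.ActionForm;

public class TestCaseForm extends ActionForm {

    // scenario, ruleID and testCaseName stay exactly as before...

    // All checkboxes named 'selectedRows' arrive together as one
    // String[] request parameter; BeanUtils calls this setter once.
    private String[] selectedRows = new String[0];

    public String[] getSelectedRows() {
        return selectedRows;
    }

    public void setSelectedRows(String[] selectedRows) {
        this.selectedRows = selectedRows;
    }
}

With that shape, renaming the inputs to name='selectedRows' (matching case) should deliver each checked value, such as param1###value1, as one array element after submit.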

    Read the article

  • The ASP.NET Daily Community Spotlight - How posts get there, and how to make it your Visual Studio Start Page

    - by Jon Galloway
One really cool part of my job is selecting the articles for the Daily Community Spotlight, on the home page of the ASP.NET website. The spotlight highlights a new post about ASP.NET development every day from a member of the ASP.NET community. You can find it on the home page of the ASP.NET site, at http://asp.net. These posts aren't automatically drawn from a pool of RSS feeds or anything - I pick a new post for each day of the year. How I pick the posts: I have a few important selection criteria. Interesting to well-rounded ASP.NET developers: The ASP.NET website has a lot of material for all skill and experience levels, from download / get started to advanced. I try to select community spotlight posts to round that out with fresh and timely information that working ASP.NET developers can really use. Posts highlight solutions to common problems, clever projects and code that helps you leverage ASP.NET, and important announcements about things you can use today. As part of that, I try to mix between ASP.NET MVC, Web Forms, and Web Pages (a.k.a. WebMatrix). As a professional developer, I want to keep on top of all of my options for ASP.NET development, and the common platform base they all share generally means that good ASP.NET code is good ASP.NET code. Exposing new and non-Microsoft community members as much as possible: The exercise of selecting good ASP.NET community posts every day of the year has made me think about what the community is. Given the choice, I'll always favor non-Microsoft employees, but since Microsoft often hires ASP.NET community members and MVPs (myself included), I really think that the ASP.NET community includes developers who are using and writing about ASP.NET, both inside and outside of Microsoft. I'm especially excited about the opportunity to highlight new and lesser known bloggers. Usually being featured on the ASP.NET Community Spotlight gives a pretty good traffic bump, and I love being able to both provide great content to the community and encourage lesser known community members by giving them some (much deserved) attention. Announcements only when they're useful to working developers - not marketing: Some of the posts are announcements about new releases, such as Scott Hanselman's post on ASP.NET Universal Providers for Session, Membership, and Roles. I include those when I think they're interesting and of immediate use to you on projects. I occasionally get asked to link to new content from a team at Microsoft; if it's useful and timely content I'll ask them to point me to a blog post by an actual person rather than a faceless team. How the posts are managed: This feed used to be managed by an internal spreadsheet on a SharePoint site, which was painful for a lot of reasons. I took a cue from Jon Udell, who uses a public Delicious feed for his Elm City project, and we moved the management of these posts over to a Delicious feed as well. You can hear more about Jon's use of Delicious in Elm City in our Herding Code interview - still one of my favorite interviews. We ended up with a simpler scenario. Note: I watched the Yahoo/Delicious news over the past year and was happy to see that Delicious was recently acquired by the founders of YouTube. I investigated several other Delicious competitors, but am happy with Delicious for now.
My Delicious feed is here: http://www.delicious.com/jon_galloway You can also browse through this past year's ASP.NET Community Spotlight posts using the (pretty cool) Delicious Browse Bar. Submitting articles: I'm always on the lookout for new articles to feature. The best way to get them to me is to share them via Delicious. It's pretty easy - sign up for an account, then you can add a post and share it with me. Alternatively, you can send them to me via Twitter (@jongalloway) or e-mail (). If you do e-mail me, it helps to include a short description and your full name so I can credit you. Way too many developer blogs don't include names and pictures; if I can't find them I can't feature the post. Subscribing to the Community Spotlight feed: The Community Spotlight is available as an RSS feed, so you might want to subscribe to it: http://www.asp.net/rss/spotlight Setting the ASP.NET Community Spotlight feed as your Visual Studio start page: If you're an ASP.NET developer, you might consider setting the ASP.NET Community Spotlight as the content for your Visual Studio Start Page. It's really easy - here's how to do it in Visual Studio 2010. Display the Visual Studio Start Page if it's not already showing (View / Start Page). Click on the Latest News tab and enter the following RSS URL: http://www.asp.net/rss/spotlight. If you didn't previously have RSS feeds enabled for your start page, click the Enable RSS Feed button. Now, every time you start up Visual Studio you'll see great content from members of the ASP.NET community. You can also configure - and disable, if you'd like - the Visual Studio start page in the Tools / Options / Environment / Startup dialog. Credits: I'll do a follow-up highlighting some places I commonly find great content for the feed, but I'd like to specifically point out two of them: Elijah Manor posts a lot of great content, which is available in his Twitter feed at @elijahmanor, on his Delicious feed, and on a dedicated website - Web Dev Tweets. Chris Alcock's The Morning Brew is a must-read blog which highlights each day's best blog posts across the .NET community. He's an absolute machine, and no matter how obscure the post I find, I can guarantee he'll find it as well if he hasn't already. Did I say must read?

    Read the article

  • WordPress SEO Plugins to make your Blog Search Engine Friendly

    - by Vaibhav
WordPress is the most common blogging system in use today, and its use as a CMS is also widespread. With hundreds of millions of sites using WordPress, getting SEO right for your WordPress-based blog or site is very important. We get regular queries from people who want search engine optimisation for their site or blog made with WordPress. Here is a list of 16 of the best WordPress plug-ins that can help you achieve better rankings: All in One SEO Pack: This is the most popular of all the SEO plugins for WordPress. It is easy to use and is compatible with most WordPress plugins. It works as a complete SEO package - automatically generating META tags, optimizing your titles for search engines and avoiding duplicate content. You can also enter META tags manually (meta title, meta description and meta keywords) for all pages and posts in your website. HeadSpace2: HeadSpace2 is available in different languages. You can manage a wide range of SEO tasks related to metadata, tag your posts, and set custom descriptions and titles, so your pages can rank for the relevant terms on search engines, and you can load different settings for different pages. Platinum SEO plugin: Automatic 301 redirects on permalink changes, META tag generation, duplicate content avoidance, SEO optimization of post and page titles, and lots of other features. TGFI.net SEO WordPress Plugin: A modified version of the All in One SEO Pack. It has some unique features over that plugin; for example, it generates titles, meta descriptions and meta keywords automatically when overrides are not present. Google XML Sitemaps: Sitemaps generated by this tool are supported by Google, Yahoo, Bing, and Ask. We all know sitemaps make indexing of web pages easier for web crawlers; crawlers can retrieve the complete structure of a site and more information from them. The plugin notifies all major search engines every time you create a new post. Sitemap Generator: You can generate a highly customizable sitemap for your WordPress site. You can choose what to show and what not to show, and you can list the items in your choice of order. It supports pages, permalinks and multi-level categories. SEO Slugs: This plugin generates more search engine friendly URLs for your site. Slugs are the filenames assigned to your posts; this plugin removes common words like 'a', 'the', 'in', 'what' and 'you' from the slug that is automatically assigned to your post. SEO Post Links: This is a plugin similar to SEO Slugs; it removes unnecessary keywords from the slug to make it short and SEO friendly, and you can fix the number of characters allowed in the slug. Automatic SEO links: With this tool you can create automatic linking in your posts, for internal linking or external linking. Just select your words, anchor text, target URL and nature of links (dofollow/nofollow), and the plugin will replace the matches found in your posts. WP Backlinks: A helpful plugin for link exchange; whenever a webmaster submits a link for exchange, the plugin will spider the webmaster's site for a reciprocal link, and if everything checks out, your link will be exchanged. SEO Title Tag: You can optimize the title tags of your WordPress blog through this plugin. You can also override a title tag with a custom title; mass editing and title tags for 404 pages are the main features of this plugin. 404 SEO plugin: With this plugin you can customize the 404 page of your site; you can give a customized error message and links to relevant pages of your site.
Redirection: A powerful plugin to manage 301 redirections and the logs related to them. With this plugin you can track 404 errors and keep a log of all redirected URLs, and the plugin can redirect a post automatically when the URL for that post changes. AddToAny: This plugin helps your readers share, save, email and bookmark your posts and pages. It supports more than a hundred social bookmarking, networking and sharing sites. SEO Friendly Images: You can make SEO friendly images available on your site with the help of this tool. It updates images with proper titles and ALT tags. Robots Meta: A plugin which prevents search engines from indexing comments on your posts and your login and admin pages. It also allows you to add tags for individual pages.

    Read the article

  • doubleTwist is an iTunes Alternative that Supports Several Devices

    - by Mysticgeek
There are a lot of iTunes users out there, but unfortunately you can't use it with all of your portable devices. Today we take a look at doubleTwist, which allows you to sync your media with a multitude of portable devices and easily share it as well. Note: You can run doubleTwist on Windows or Mac, and here we take a look at the Windows version. Install & Setup doubleTwist: Download and install doubleTwist using the defaults in the wizard… Installation takes several moments and you'll see the progress while it finishes up. After installation is complete, sign up for an account if you don't already have one. If you do have an account you can log in right away. Enter your username, email address, and password, then click Sign Up. You'll get a confirmation email and need to activate the account before you can sign in. Once you're all signed up, launch doubleTwist and you'll be ready to start using it. doubleTwist Music: The default music store is the Amazon MP3 store, which might appeal to those of you who are tired of the iTunes music store. A lot of times the music is cheaper and available at higher bit rates. You can start searching for music in the Amazon Music Store and previewing songs. To purchase anything though, you will need to sign into your Amazon account. Under Playlists it allows you to import your playlists from iTunes and Windows Media Player, which is a handy feature if you don't want to set them up again. Of course you can play your songs through the music player on your desktop. Devices: One of the coolest things about doubleTwist is that it supports a lot of different portable media devices including iPod, BlackBerry, Windows Mobile, Android, PSP, smartphones, and much more. Unfortunately for Zune users… there isn't any support for the Zune or Zune HD yet. Here we have a Creative Zen attached and can sync songs, pictures, and podcasts. An HTC-S620 smartphone running Windows Mobile… Even a simple USB drive will be recognized and you can transfer your media to it as well. Podcasts: Finding your favorite audio and video podcasts is easy with the search feature. You can easily manage and subscribe to podcasts in the subscriptions section. You can watch the video podcasts directly in doubleTwist. Sharing Media: You can also share digital media with your friends or add it to Flickr and YouTube. You can send any pictures, videos, or music in your library to other people by dragging it over. You can email users individually… or access contacts from your Gmail and Yahoo accounts. There is a limit on how much of a video podcast you can send… only the first 10 minutes. The person you send it to will get a link in their email that points to your My Feed page on the doubleTwist site. There they can access the media you sent… in this example it's a video podcast, but you can share any media. Other Features: Under My Profile you can change your avatar and personal information. In Preferences you can choose where media is stored, set its startup actions and podcast subscriptions, and manage device syncing. Conclusion: It's still in beta stage, so expect some bugs, but overall doubleTwist is a solid media player that is easy to use with a clean interface. It's simple and doesn't try to do too much, so it is fairly easy on system resources. The main annoyance is that it tries to catalog all of your media out of the box, which may be alright for users with smaller media collections, but very irritating to advanced users with large collections.
Also, there is currently no support for the Zune, but according to their forums, it's on the way. At the time of this writing it's in public beta and can be downloaded for XP, Vista, Windows 7 (32 & 64 bit), and Mac OS X. If you're looking for an iTunes alternative that works with several different portable devices, you might want to give doubleTwist a try. Download doubleTwist Public Beta. See If Your Media Device is Supported by doubleTwist.

    Read the article

  • Lync Server 2010

    - by ManojDhobale
Microsoft Lync Server 2010 communications software and its client software, such as Microsoft Lync 2010, enable your users to connect in new ways and to stay connected, regardless of their physical location. Lync 2010 and Lync Server 2010 bring together the different ways that people communicate in a single client interface, are deployed as a unified platform, and are administered through a single management infrastructure. The main workloads are described below.
IM and presence: Instant messaging (IM) and presence help your users find and communicate with one another efficiently and effectively. IM provides an instant messaging platform with conversation history, and supports public IM connectivity with users of public IM networks such as MSN/Windows Live, Yahoo!, and AOL. Presence establishes and displays a user's personal availability and willingness to communicate through the use of common states such as Available or Busy. This rich presence information enables other users to immediately make effective communication choices.
Conferencing: Lync Server includes support for IM conferencing, audio conferencing, web conferencing, video conferencing, and application sharing, for both scheduled and impromptu meetings. All these meeting types are supported with a single client. Lync Server also supports dial-in conferencing so that users of public switched telephone network (PSTN) phones can participate in the audio portion of conferences. Conferences can seamlessly change and grow in real time. For example, a single conference can start as just instant messages between a few users, and escalate to an audio conference with desktop sharing and a larger audience instantly, easily, and without interrupting the conversation flow.
Enterprise Voice: Enterprise Voice is the Voice over Internet Protocol (VoIP) offering in Lync Server 2010. It delivers a voice option to enhance or replace traditional private branch exchange (PBX) systems. In addition to the complete telephony capabilities of an IP PBX, Enterprise Voice is integrated with rich presence, IM, collaboration, and meetings. Features such as call answer, hold, resume, transfer, forward and divert are supported directly, while personalized speed dialing keys are replaced by Contacts lists, and automatic intercom is replaced with IM. Enterprise Voice supports high availability through call admission control (CAC), branch office survivability, and extended options for data resiliency.
Support for remote users: You can provide full Lync Server functionality for users who are currently outside your organization's firewalls by deploying servers called Edge Servers to provide a connection for these remote users. These remote users can connect to conferences by using a personal computer with Lync 2010 installed, the phone, or a web interface. Deploying Edge Servers also enables you to federate with partner or vendor organizations. A federated relationship enables your users to put federated users on their Contacts lists, exchange presence information and instant messages with these users, and invite them to audio calls, video calls, and conferences.
Integration with other products: Lync Server integrates with several other products to provide additional benefits to your users and administrators. Meeting tools are integrated into Outlook 2010 to enable organizers to schedule a meeting or start an impromptu conference with a single click and make it just as easy for attendees to join. Presence information is integrated into Outlook 2010 and SharePoint 2010. Exchange Unified Messaging (UM) provides several integration features. Users can see if they have new voice mail within Lync 2010. They can click a play button in the Outlook message to hear the audio voice mail, or view a transcription of the voice mail in the notification message.
Simple deployment: To help you plan and deploy your servers and clients, Lync Server provides the Microsoft Lync Server 2010, Planning Tool and the Topology Builder. Lync Server 2010, Planning Tool is a wizard that interactively asks you a series of questions about your organization, the Lync Server features you want to enable, and your capacity planning needs. Then, it creates a recommended deployment topology based on your answers, and produces several forms of output to aid your planning and installation. Topology Builder is an installation component of Lync Server 2010. You use Topology Builder to create, adjust and publish your planned topology. It also validates your topology before you begin server installations. When you install Lync Server on individual servers, the installation program deploys the server as directed in the topology.
Simple management: After you deploy Lync Server, it offers the following powerful and streamlined management tools: Active Directory for its user information, which eliminates the need for separate user and policy databases; the Microsoft Lync Server 2010 Control Panel, a new web-based graphical user interface for administrators - with this web-based UI, Lync Server administrators can manage their systems from anywhere on the corporate network, without needing specialized management software installed on their computers; and the Lync Server Management Shell command-line management tool, which is based on the Windows PowerShell command-line interface. It provides a rich command set for administration of all aspects of the product, and enables Lync Server administrators to automate repetitive tasks using a familiar tool. While the IM and presence features are automatically installed in every Lync Server deployment, you can choose whether to deploy conferencing, Enterprise Voice, and remote user access, to tailor your deployment to your organization's needs.

    Read the article

  • Recover that Photo, Picture or File You Deleted Accidentally

    - by The Geek
Have you ever accidentally deleted a photo on your camera, computer, USB drive, or anywhere else? What you might not know is that you can usually restore those pictures—even from your camera's memory stick. Windows tries to prevent you from making a big mistake by providing the Recycle Bin, where deleted files hang around for a while—but unfortunately it doesn't work for external USB drives, USB flash drives, memory sticks, or mapped drives. The great news is that this technique also works if you accidentally deleted the photo… from the camera itself. That's what happened to me, and prompted writing this article. Restore that File or Photo using Recuva: The first piece of software that you'll want to try is called Recuva, and it's extremely easy to use—just make sure when you are installing it that you don't accidentally install that stupid Yahoo! toolbar that nobody wants. Now that you've installed the software, and avoided an awful toolbar installation, launch the Recuva wizard and let's walk through the process of recovering those pictures you shouldn't have deleted. The first step on the wizard page will let you tell Recuva to only search for a specific type of file, which can save a lot of time while searching, and make it easier to find what you are looking for. Next you'll need to specify where the file was, which will obviously depend on where you deleted it from. Since I deleted mine from my camera's SD card, that's where I'm looking for it. The next page will ask you whether you want to do a Deep Scan. My recommendation is to not select this for the first scan, because usually the quick scan can find it. You can always go back and run a deep scan a second time. And now, you'll see all of the pictures deleted from your drive, memory stick, SD card, or wherever you searched. Looks like what happened in Vegas didn't stay in Vegas after all… If there is a really large number of results, and you know exactly when the file was created or modified, you can switch to the advanced view, where you can sort by the last modified time. This can help speed up the process quite a bit, so you don't have to look through quite as many files. At this point, you can right-click on any filename, and choose to Recover it, and then save the files elsewhere on your drive. Awesome! Restore that File or Photo using DiskDigger: If you don't have any luck with Recuva, you can always try out DiskDigger, another excellent piece of software. I've tested both of these applications very thoroughly, and found that they don't always find the same files, so it's best to have both of them in your toolkit. Note that DiskDigger doesn't require installation, making it a really great tool to throw on your PC repair flash drive. Start off by choosing the drive you want to recover from… Now you can choose whether to do a deep scan, or a really deep scan. Just like with Recuva, you'll probably want to select the first one first. I've also had much better luck with the regular scan, rather than the "dig deeper" one. If you do choose the "dig deeper" one, you'll be able to select exactly which types of files you are looking for, though again, you should use the regular scan first. Once you've come up with the results, you can click on the items on the left-hand side, and see a preview on the right. You can select one or more files, and choose to restore them. It's pretty simple! Download DiskDigger from dmitrybrant.com. Download Recuva from piriform.com. Good luck recovering your deleted files!
And keep in mind, DiskDigger is totally free donationware from a single, helpful guy… so if his software helps you recover a photo you never thought you'd see again, you might want to think about throwing him a dollar or two.

    Read the article

  • dpkg unsatisfied dependencies, now apt-get wants to remove whole system

    - by Bruno Finger
Firstly, I'm sorry that my terminal output is in Portuguese, but I guess it is still understandable. I am using Ubuntu GNOME 14.04 and I tried to update the GNOME Online Accounts packages by downloading the following .deb files from packages.ubuntu.com for the Ubuntu 14.10 version: libgoa-backend-1.0-dev_3.12.4-1_amd64.deb libgoa-backend-1.0-1_3.12.4-1_amd64.deb libgoa-1.0-dev_3.12.4-1_amd64.deb libgoa-1.0-0b_3.12.4-1_amd64.deb gnome-online-accounts_3.12.4-1_amd64.deb gir1.2-goa-1.0_3.12.4-1_amd64.deb After downloading them into the same folder, I ran the command sudo dpkg -i *.deb, but it didn't install the packages; instead it showed errors because the packages they depend on don't meet the required versions (and Ubuntu has no way to install those, since they are not in this version's repositories). So now every time I want to install anything through apt-get, Ubuntu tells me to run apt-get -f install to fix the errors. This is the list of packages it needs to install/uninstall/update: $ sudo apt-get -f install Lendo listas de pacotes... Pronto Construindo árvore de dependências Lendo informação de estado... Pronto Corrigindo dependências... Pronto Os seguintes pacotes foram instalados automaticamente e já não são necessários: # THESE PACKAGES HAVE BEEN PREVIOUSLY INSTALLED AND ARE NO LONGER NECESSARY account-plugin-windows-live gir1.2-gweather-3.0 libatk-bridge2.0-dev libatk1.0-dev libcairo-script-interpreter2 libcairo2-dev libexpat1-dev libfontconfig1-dev libfreetype6-dev libgdk-pixbuf2.0-dev libglib2.0-dev libgtk-3-dev libharfbuzz-dev libharfbuzz-gobject0 libice-dev libpango1.0-dev libpcre3-dev libpcrecpp0 libpixman-1-dev libpng12-dev libpthread-stubs0-dev librest-dev libsm-dev libsoup2.4-dev libwayland-dev libx11-dev libx11-doc libxau-dev libxcb-render0-dev libxcb-shm0-dev libxcb1-dev libxcomposite-dev libxcursor-dev libxdamage-dev libxdmcp-dev libxext-dev libxfixes-dev libxft-dev libxi-dev libxinerama-dev libxkbcommon-dev libxml2-dev libxrandr-dev libxrender-dev pkg-config signon-plugin-password x11proto-composite-dev x11proto-core-dev x11proto-damage-dev x11proto-fixes-dev x11proto-input-dev x11proto-kb-dev x11proto-randr-dev x11proto-render-dev x11proto-xext-dev x11proto-xinerama-dev xorg-sgml-doctools xtrans-dev zlib1g-dev Utilize 'apt-get autoremove' para os remover.
Os pacotes extra a seguir serão instalados: # THE FOLLOWING PACKAGES WILL BE INSTALLED debhelper dh-apparmor libatk-bridge2.0-dev libatk1.0-dev libcairo-script-interpreter2 libcairo2-dev libept1.4.12 libexpat1-dev libfontconfig1-dev libfreetype6-dev libgdk-pixbuf2.0-dev libglib2.0-dev libgtk-3-dev libharfbuzz-dev libharfbuzz-gobject0 libice-dev libmail-sendmail-perl libpango1.0-dev libpcre3-dev libpcrecpp0 libpixman-1-dev libpng12-dev libpthread-stubs0-dev librest-dev libsm-dev libsoup2.4-dev libwayland-dev libx11-dev libx11-doc libxau-dev libxcb-render0-dev libxcb-shm0-dev libxcb1-dev libxcomposite-dev libxcursor-dev libxdamage-dev libxdmcp-dev libxext-dev libxfixes-dev libxft-dev libxi-dev libxinerama-dev libxkbcommon-dev libxml2-dev libxrandr-dev libxrender-dev pkg-config po-debconf x11proto-composite-dev x11proto-core-dev x11proto-damage-dev x11proto-fixes-dev x11proto-input-dev x11proto-kb-dev x11proto-randr-dev x11proto-render-dev x11proto-xext-dev x11proto-xinerama-dev xorg-sgml-doctools xtrans-dev zlib1g-dev Pacotes sugeridos: dh-make apparmor-easyprof libcairo2-doc libglib2.0-doc libgtk-3-doc libice-doc libpango1.0-doc imagemagick libsm-doc libsoup2.4-doc libxcb-doc libxext-doc libmail-box-perl Os pacotes a seguir serão REMOVIDOS: # THE FOLLOWING PACKAGES WILL BE REMOVED account-plugin-aim account-plugin-jabber account-plugin-salut account-plugin-yahoo empathy evolution evolution-data-server evolution-data-server-online-accounts evolution-indicator evolution-plugins gdm gir1.2-gdata-0.0 gir1.2-goa-1.0 gir1.2-zpj-0.0 gnome-contacts gnome-control-center gnome-documents gnome-online-accounts gnome-online-miners gnome-shell gnome-shell-extension-weather gnome-shell-extensions grilo-plugins-0.2 gvfs-backends-goa libevolution libfolks-eds25 libgdata13 libgoa-1.0-0b libgoa-1.0-dev libgoa-backend-1.0-1 libgoa-backend-1.0-dev libzapojit-0.0-0 mcp-account-manager-uoa nautilus-sendto-empathy ubuntu-gnome-desktop Os NOVOS pacotes a seguir serão instalados: # THE FOLLOWING NEW PACKAGES WILL BE INSTALLED debhelper dh-apparmor libatk-bridge2.0-dev libatk1.0-dev libcairo-script-interpreter2 libcairo2-dev libept1.4.12 libexpat1-dev libfontconfig1-dev libfreetype6-dev libgdk-pixbuf2.0-dev libglib2.0-dev libgtk-3-dev libharfbuzz-dev libharfbuzz-gobject0 libice-dev libmail-sendmail-perl libpango1.0-dev libpcre3-dev libpcrecpp0 libpixman-1-dev libpng12-dev libpthread-stubs0-dev librest-dev libsm-dev libsoup2.4-dev libwayland-dev libx11-dev libx11-doc libxau-dev libxcb-render0-dev libxcb-shm0-dev libxcb1-dev libxcomposite-dev libxcursor-dev libxdamage-dev libxdmcp-dev libxext-dev libxfixes-dev libxft-dev libxi-dev libxinerama-dev libxkbcommon-dev libxml2-dev libxrandr-dev libxrender-dev pkg-config po-debconf x11proto-composite-dev x11proto-core-dev x11proto-damage-dev x11proto-fixes-dev x11proto-input-dev x11proto-kb-dev x11proto-randr-dev x11proto-render-dev x11proto-xext-dev x11proto-xinerama-dev xorg-sgml-doctools xtrans-dev zlib1g-dev 0 pacotes atualizados, 61 pacotes novos instalados, 35 a serem removidos e 22 não atualizados. 7 pacotes não totalmente instalados ou removidos. É preciso baixar 12,0 MB de arquivos. Depois desta operação, 25,0 MB adicionais de espaço em disco serão usados. Você quer continuar? [S/n] # 0 UPGRADED, 61 NEWLY INSTALLED, 35 TO REMOVE AND 22 NOT UPGRADED; 7 NOT FULLY INSTALLED OR REMOVED; NEED TO GET 12.0 MB OF ARCHIVES; AFTER THIS OPERATION, 25.0 MB OF ADDITIONAL DISK SPACE WILL BE USED; DO YOU WANT TO CONTINUE? [Y/n] Among the packages to be removed is even gdm. This is 100% sure to make the system useless. What can I do to fix this issue? I don't care if I can't install the new version of goa anymore.

    Read the article

  • Big Data – Buzz Words: What is Hadoop – Day 6 of 21

    - by Pinal Dave
In yesterday's blog post we learned what NoSQL is. In this article we will take a quick look at one of the four most important buzz words which go around Big Data – Hadoop. What is Hadoop? Apache Hadoop is an open-source, free, Java-based software framework that offers a powerful distributed platform to store and manage Big Data. It is licensed under an Apache V2 license. It runs applications on large clusters of commodity hardware and processes thousands of terabytes of data on thousands of nodes. Hadoop is inspired by Google's MapReduce and Google File System (GFS) papers. The major advantage of the Hadoop framework is that it provides reliability and high availability. What are the core components of Hadoop? There are two major components of the Hadoop framework, and each of them performs one of its most important tasks. Hadoop MapReduce is the method to split a larger data problem into smaller chunks and distribute them to many different commodity servers. Each server has its own set of resources and processes its chunk locally. Once a commodity server has processed its data, it sends the result back to the main server, where the results are collected. This is, in effect, how large data sets are processed effectively and efficiently. (We will understand this in tomorrow's blog post.) Hadoop Distributed File System (HDFS) is a virtual file system, and there is a big difference between it and any other file system. When we move a file onto HDFS, it is automatically split into many small pieces. These small chunks of the file are replicated and stored on other servers (usually three) for fault tolerance and high availability. (We will understand this in the day after tomorrow's blog post.) Besides the above two core components, the Hadoop project also contains the following modules: Hadoop Common, common utilities for the other Hadoop modules; and Hadoop YARN, a framework for job scheduling and cluster resource management. There are a few other projects (like Pig and Hive) related to Hadoop as well, which we will gradually explore in later blog posts. A Multi-node Hadoop Cluster Architecture: Now let us quickly see the architecture of a multi-node Hadoop cluster. A small Hadoop cluster includes a single master node and multiple worker or slave nodes. As discussed earlier, the entire cluster contains two layers: one is the MapReduce layer and the other is the HDFS layer. Each of these layers has its own relevant components. The master node consists of a JobTracker, TaskTracker, NameNode and DataNode. A slave or worker node consists of a DataNode and TaskTracker. It is also possible for a slave or worker node to be only a data node or only a compute node; as a matter of fact, that is a key feature of Hadoop. In this introductory blog post we will stop here with the architecture of Hadoop. In a future blog post of this series we will explore the various components of the Hadoop architecture in detail. Why Use Hadoop? There are many advantages of using Hadoop. Let me quickly list them over here: Robust and Scalable – we can add new nodes as needed, as well as modify them. Affordable and Cost Effective – we do not need any special hardware to run Hadoop; we can just use commodity servers. Adaptive and Flexible – Hadoop is built keeping in mind that it will handle structured and unstructured data. Highly Available and Fault Tolerant – when a node fails, the Hadoop framework automatically fails over to another node. Why is Hadoop named Hadoop?
In 2005, Hadoop was created by Doug Cutting and Mike Cafarella while working at Yahoo. Doug Cutting named Hadoop after his son's toy elephant. Tomorrow: In tomorrow's blog post we will discuss the buzz word MapReduce. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
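To make the split-process-collect flow described above concrete, here is the classic MapReduce word count in Java - a minimal sketch, assuming a Hadoop 2.x installation, with the (hypothetical) input and output paths supplied as command line arguments:

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Each mapper works on its own chunk of the input and emits (word, 1).
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // The framework groups the emitted pairs by word; the reducer sums the counts.
    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on each node
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // e.g. an HDFS input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // must not already exist
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

The mappers run where the HDFS chunks live, so the computation moves to the data rather than the other way around - which is exactly why the MapReduce-over-HDFS combination scales so well on commodity hardware.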

    Read the article

  • Interview with Ronald Bradford about MySQL Connect

    - by Keith Larson
Ronald Bradford, an Oracle ACE Director, has been busy with database consulting and book writing (Effective MySQL) while traveling and speaking around the world in support of MySQL. I was able to take some of his time to get an interview on his thoughts about the MySQL Connect conference. Keith Larson: What were your thoughts when you heard that Oracle was going to provide the community the MySQL Connect conference? Ronald Bradford: Oracle has already been providing various local community events including OTN Tech Days and MySQL community days. These are great for local regions both in the US and abroad. In previous years there has been an increase of content at Oracle Open World, however that benefits the Oracle community far more than the MySQL community. It is good to see that Oracle is realizing the benefit in providing a large scale dedicated event for the MySQL community that includes speakers from the MySQL development teams, invested companies in the ecosystem and other community evangelists. I fully expect a successful event and look forward to hopefully seeing MySQL Connect at the upcoming Brazil and Japan OOW conferences and perhaps an event on the East Coast. Keith Larson: Since you are part of the content committee, what did you think of the submissions that were received during the call for papers? Ronald Bradford: There was a large number of quality submissions for the number of available presentation sessions. As in previous years as a committee member for the annual MySQL conference, there was a large variety of common cornerstone MySQL features as well as new products and upcoming companies sharing their MySQL experiences. All of the usual major players in the ecosystem will be presenting at MySQL Connect, including Facebook, Twitter, Yahoo, Continuent, Percona, Tokutek, Sphinx and Amazon, to name a few. This ensures the event will have a large number of quality speakers, and a difficult time in choosing what to attend. Keith Larson: What sessions do you look forward to attending? Ronald Bradford: As with most quality conferences you can only be in one place at one time, so with multiple tracks per session it is always difficult to decide. The continued work and success with MySQL Cluster is covered in a number of sessions which I am sure will be popular. The features that interest me the most are around the optimizer, where there are several sessions on new features, and the importance of backups, where there are three presentations to choose from. Keith Larson: Are you going to cover any of the content in your books at your MySQL Connect sessions? Ronald Bradford: I will be giving two presentations at MySQL Connect. The first will include the techniques available for creating better indexes, where I will be touching on some aspects of the first Effective MySQL book, on optimizing SQL statements. In my second presentation, drawn from experiences of managing 500+ AWS MySQL instances, I will be touching on areas including SQL tuning, backup and recovery, and scale-out with replication. These are the key topics of the initial books in the Effective MySQL series, which focus on performance, scalability and business continuity. The books however cover a far greater amount of detail than can be presented in a 1 hour session. Keith Larson: What features of MySQL 5.6 do you look forward to the most? Ronald Bradford: I am very impressed with the optimizer trace feature.
The ability to see the exposed information is invaluable, not just for MySQL 5.6, but also for applying what is learned to optimizing SQL statements in earlier versions of MySQL. Not everybody understands that it is easy to deploy a MySQL 5.6 slave into an existing topology running an older version of MySQL for evaluation of many new features. You can use the new mysqlbinlog streaming feature for duplicating master binary logs on an older version with a MySQL 5.6 slave. The improvements in instrumentation in the Performance Schema are exciting. However, as with my upcoming Replication Techniques in Depth title, which will be available for sale at MySQL Connect, there are numerous replication features, some long overdue, which provide significant management benefits: crash-safe slaves, global transaction identifiers (GTIDs) and checksums, just to mention a few. Keith Larson: You have been to numerous conferences; what would you recommend for people at the conference? Ronald Bradford: Make the time to meet and introduce yourself to the speakers that cover the topics that most interest you. The MySQL ecosystem has a very strong community. The relationships you build with presenters, developers and architects in MySQL can be invaluable, however they are created over time. Get to know these people and interact with them over time. This is the opportunity to learn more than just the content of a 1 hour session. Keith Larson: Any additional tips for handling the long hours? Ronald Bradford: Conferences can be hard, especially with all the post-event drinking. This is a two day event and I am sure it will include additional events on Friday and Saturday night, so come well prepared, and leave work behind. Take the time to learn something new. You can always catch up on sleep later. Keith Larson: Thank you so much for taking some time to do this. I look forward to seeing you at the MySQL Connect conference. Please stay tuned here for more updates on MySQL.
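For readers who want to try the optimizer trace feature Ronald highlights, here is a minimal sketch using Java and JDBC against a MySQL 5.6 server. The connection details and the world.City sample table are hypothetical; the SET statement and the INFORMATION_SCHEMA.OPTIMIZER_TRACE table are the documented 5.6 interface:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class OptimizerTraceDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details - point these at a MySQL 5.6 instance.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/test", "user", "password");
             Statement stmt = conn.createStatement()) {

            // Turn tracing on for this session, then run the statement of interest.
            stmt.execute("SET optimizer_trace='enabled=on'");
            stmt.execute("SELECT * FROM world.City WHERE CountryCode = 'USA'");

            // Read back the optimizer's decision process as a JSON document.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT TRACE FROM INFORMATION_SCHEMA.OPTIMIZER_TRACE")) {
                while (rs.next()) {
                    System.out.println(rs.getString("TRACE"));
                }
            }
        }
    }
}

The TRACE column shows each choice the optimizer considered, which - as Ronald notes - can guide index and query design even for servers still running earlier versions.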

    Read the article

  • Windows Azure Recipe: Software as a Service (SaaS)

    - by Clint Edmonson
The cloud was tailor-made for aspiring companies to create innovative internet-based applications and solutions. Whether you're a garage startup with very little capital or a Fortune 1000 company, the ability to quickly set up, deliver, and iterate on new products is key to capturing market and mind share. And if you can capture that share and go viral, having resiliency and infinite scale at your fingertips is great peace of mind. Drivers: cost avoidance, time to market, and scalability. Solution: Here's a sketch of how a basic Software as a Service solution might be built out. Ingredients: Web Role – this hosts the core web application. Each web role will host an instance of the software and as the user base grows, additional roles can be spun up to meet demand. Access Control – this service is essential to managing user identity. It's backed by a full-blown implementation of Active Directory and allows the definition and management of users, groups, and roles. A pre-built ASP.NET membership provider is included in the training kit to leverage this capability, but it's also flexible enough to be combined with external identity providers including Windows LiveID, Google, Yahoo!, and Facebook. The provider model provides extensibility to hook into other industry specific identity providers as well. Databases – nearly every modern SaaS application is backed by a relational database for its core operational data. If the solution is sold to organizations, there's a good chance multi-tenancy will be needed. An emerging best practice for SaaS applications is to stand up separate SQL Azure database instances for each tenant's proprietary data to ensure isolation from other tenants. Worker Role – this is the best place to handle autonomous background processing such as data aggregation, billing through external services, and other specialized tasks that can be performed asynchronously. Placing these tasks in a worker role frees the web roles to focus completely on user interaction and data input, and provides finer grained control over the system's scalability and throughput. Caching (optional) – as a web site's traffic grows, caching can be leveraged to keep frequently used read-only, user specific, and application resource data in a high-speed distributed in-memory cache for faster response times and ultimately higher scalability without spinning up more web and worker roles. It includes a token based security model that works alongside the Access Control service. Blobs (optional) – depending on the nature of the software, users may be creating or uploading large volumes of heterogeneous data such as documents or rich media. Blob storage provides a scalable, resilient way to store terabytes of user data. The storage facilities can also integrate with the Access Control service to ensure users' data is delivered securely. Training & Examples: These links point to online Windows Azure training labs and examples where you can learn more about the individual ingredients described above. (Note: The entire Windows Azure Training Kit can also be downloaded for offline use.) Windows Azure (16 labs): Windows Azure is an internet-scale cloud computing and services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services which can be used individually or together. It gives developers the choice to build web applications; applications running on connected devices, PCs, or servers; or hybrid solutions offering the best of both worlds.
New or enhanced applications can be built using existing skills with the Visual Studio development environment and the .NET Framework. With its standards-based and interoperable approach, the services platform supports multiple internet protocols, including HTTP, REST, SOAP, and plain XML. SQL Azure (7 labs): Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending the SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data. Windows Azure Services (9 labs): As applications collaborate across organizational boundaries, ensuring secure transactions across disparate security domains is crucial but difficult to implement. Windows Azure Services provides hosted authentication and access control using powerful, secure, standards-based infrastructure. Developing Applications for the Cloud, 2nd Edition (eBook): This book demonstrates how you can create from scratch a multi-tenant, Software as a Service (SaaS) application to run in the cloud using the latest versions of the Windows Azure Platform and tools. The book is intended for any architect, developer, or information technology (IT) professional who designs, builds, or operates applications and services that run on or interact with the cloud. Fabrikam Shipping (SaaS reference application): This is a full end-to-end sample scenario which demonstrates how to use the Windows Azure platform for exposing an application as a service. We developed this demo just as you would: we had an existing on-premises sample, Fabrikam Shipping, and we wanted to see what it would take to transform it into a full subscription-based solution. The demo you find here is the result of that investigation. See my Windows Azure Resource Guide for more guidance on how to get started, including more links to web portals, training kits, samples, and blogs related to Windows Azure.
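One pattern worth calling out from the Databases ingredient above is how a request finds its tenant's isolated database. Here's a minimal sketch of that lookup - written in Java purely for illustration (the platform's own samples are .NET), with hypothetical tenant names, hosts, and catalog storage:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of the 'one isolated database per tenant' idea: resolve a
// tenant's own connection at request time from a shared catalog.
public class TenantConnectionFactory {

    // Hypothetical catalog; in practice this would live in configuration
    // or a small control database shared by all web roles.
    private final Map<String, String> tenantDbUrls = new ConcurrentHashMap<>();

    public TenantConnectionFactory() {
        tenantDbUrls.put("fabrikam", "jdbc:sqlserver://fabrikam.example.net:1433;databaseName=fabrikam");
        tenantDbUrls.put("contoso", "jdbc:sqlserver://contoso.example.net:1433;databaseName=contoso");
    }

    // Every data access for a request goes through the tenant's own database,
    // which is what keeps one tenant's data isolated from the others.
    public Connection connectionFor(String tenantId) throws SQLException {
        String url = tenantDbUrls.get(tenantId);
        if (url == null) {
            throw new IllegalArgumentException("Unknown tenant: " + tenantId);
        }
        return DriverManager.getConnection(url);
    }
}

Keeping the tenant catalog outside the application code means new tenants can be provisioned by adding a row of configuration rather than redeploying web roles.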

    Read the article

  • Rules for Naming

    - by PointsToShare
© 2011 By: Dov Trietsch. All rights reserved. Naming Documents (or is it "Document, Naming"?) 'Tis but thy name that is my enemy; Thou art thyself, though not a Montague. What's Montague? It is nor hand, nor foot, Nor arm, nor face, nor any other part Belonging to a man. O, be some other name! What's in a name? That which we call a rose By any other name would smell as sweet; So Romeo would, were he not Romeo call'd, Retain that dear perfection which he owes Without that title. Romeo, doff thy name And for that name which is no part of thee Take all myself. Shakespeare – Romeo and Juliet, Act II, Scene 2. We normally only use the bold portion of the famous Shakespearean quote above, but it is really out of context. As the play unfolds, we learn that a name is all too powerful. Indeed it is because of their names that the doomed lovers die. There might be life and death in a name (BTW, when I wrote this monograph, I was in Hatfield, PA. Remember the Hatfields and the McCoys?). This is a bit extreme, but in the field of Knowledge Management (KM) names are of the utmost importance as well. When I write an article about managing SharePoint sites, how should I name it? "Managing a site" or "Site, managing"? Nine times out of ten I'd opt for the latter. Almost everything we do is "managing", so to make life easier for a person looking for meaningful content, we title our articles starting with the differentiator rather than the common factor. As a rule of thumb, we start the name with the noun rather than the verb. It is not what we do that is the primary key; it is what we do it to. So, answer this – is it a "rule of thumb" or a "thumb rule"? This is tough. A lot of what we do when naming is a judgment call. Both thumb and rule are nouns, albeit concrete and abstract (more about this later), but to most people "thumb rule" is meaningless while "rule of thumb" is an idiom. The difference between knowledge and information is that knowledge is meaningful information placed in context. Thus I elect the "rule of thumb". It is the more meaningful title. Abstract and concrete are relative terms. Many nouns (and verbs) that are abstract to a commoner are concrete to a practitioner of one profession or another, and may even have different concrete meanings in different professional jargons. Think about "running". To an executive it means running a business; to a marathoner its meaning is much more literal. Generally speaking, we store and disseminate knowledge within a practice more than we do in general. Even dictionaries and encyclopedias define terms as they apply to different audiences. The rule of thumb is to put the more concrete first, but within the audience's jargon. Even the title of this monograph is a question. Do I name it "Naming Documents" or "Documents, Naming"? Well, my own rule of thumb ("Here he goes again!?") states that the latter is better because it starts with a noun, but this is a document about naming more than it is about documents. The rules of naming also apply to graphs and charts, Excel spreadsheets, and so on. Thus, I vote for the former. A better title could have been "Naming Objects", only the word "Object" is a bit too abstract. How about just "Naming" or "Naming, rules of"? You get the drift. One of the ways to resolve all of this is to store the documents in knowledge bases, which may become the subject of a future punditry. Knowledge bases use keywords to describe their content. Use a metadata store for the keywords to at least attempt some common ground.
Here is another general rule (rule of thumb?!!) – put at least one keyword in the title, and use subtitles. Here is an example: Migrating documents – Screening, cleaning, and organizing our knowledge. The main keyword is "documents"; next is "migrating"; other keywords also appear in the subtitle: "screening", "cleaning", and "organizing". Any questions? Send me an amply named document by email: [email protected]

    Read the article

  • Spam Activity From my computer

    - by Bnymn
    I'm running Ubuntu 12.04 64-bit and using an HTTP proxy over SSH, as mentioned here. As long as TinyProxy is not running, everything is fine, but as soon as I start it, I get the log output below. I think there is an application running on my machine watching for the proxy to start, but I cannot work out which one it could be.

    ps ax

    PID TTY STAT TIME COMMAND
    1 ? Ss 0:01 /sbin/init
    2 ? S 0:00 [kthreadd]
    3 ? S 0:00 [ksoftirqd/0]
    6 ? S 0:00 [migration/0]
    7 ? S 0:00 [watchdog/0]
    21 ? S< 0:00 [cpuset]
    22 ? S< 0:00 [khelper]
    23 ? S 0:00 [kdevtmpfs]
    24 ? S< 0:00 [netns]
    26 ? S 0:00 [sync_supers]
    27 ? S 0:00 [bdi-default]
    28 ? S< 0:00 [kintegrityd]
    29 ? S< 0:00 [kblockd]
    30 ? S< 0:00 [ata_sff]
    31 ? S 0:00 [khubd]
    32 ? S< 0:00 [md]
    34 ? S 0:00 [khungtaskd]
    35 ? S 0:00 [kswapd0]
    36 ? SN 0:00 [ksmd]
    37 ? SN 0:00 [khugepaged]
    38 ? S 0:00 [fsnotify_mark]
    39 ? S 0:00 [ecryptfs-kthrea]
    40 ? S< 0:00 [crypto]
    48 ? S< 0:00 [kthrotld]
    49 ? S 0:00 [scsi_eh_0]
    50 ? S 0:00 [scsi_eh_1]
    51 ? S 0:00 [scsi_eh_2]
    52 ? S 0:00 [scsi_eh_3]
    75 ? S< 0:00 [devfreq_wq]
    240 ? S< 0:00 [xfs_mru_cache]
    241 ? S< 0:00 [xfslogd]
    242 ? S< 0:00 [xfsdatad]
    243 ? S< 0:00 [xfsconvertd]
    245 ? S 0:00 [xfsbufd/sda3]
    246 ? S 0:01 [xfsaild/sda3]
    330 ? S 0:00 upstart-udev-bridge --daemon
    333 ? Ss 0:00 /sbin/udevd --daemon
    472 ? S< 0:00 [cfg80211]
    479 ? S< 0:00 [kpsmoused]
    671 ? S 0:00 upstart-socket-bridge --daemon
    779 ? S 0:00 [xfsbufd/sda4]
    781 ? S 0:01 [xfsaild/sda4]
    785 ? S< 0:00 [ttm_swap]
    800 ? S< 0:00 [hd-audio0]
    803 ? S< 0:00 [hd-audio1]
    857 ? Sl 0:00 rsyslogd -c5
    869 ? Ss 0:04 dbus-daemon --system --fork --activation=upstart
    881 ? Ss 0:00 /usr/sbin/modem-manager
    883 ? Ss 0:00 /usr/sbin/bluetoothd
    905 ? Ssl 0:02 NetworkManager
    906 ? Ss 0:00 /usr/sbin/cupsd -F
    910 ? Sl 0:02 /usr/lib/policykit-1/polkitd --no-debug
    918 ? S 0:00 avahi-daemon: running [bunyamin-hp.local]
    919 ? S 0:00 avahi-daemon: chroot helper
    920 ? S< 0:00 [krfcommd]
    956 ? Ss 0:00 /sbin/wpa_supplicant -B -P /run/sendsigs.omit.d/wpasupplicant.pid -u -s -O /var/run/wpa_supplicant
    980 tty4 Ss+ 0:00 /sbin/getty -8 38400 tty4
    985 tty5 Ss+ 0:00 /sbin/getty -8 38400 tty5
    1000 tty2 Ss+ 0:00 /sbin/getty -8 38400 tty2
    1006 tty3 Ss+ 0:00 /sbin/getty -8 38400 tty3
    1009 tty6 Ss+ 0:00 /sbin/getty -8 38400 tty6
    1024 ? Ss 0:00 acpid -c /etc/acpi/events -s /var/run/acpid.socket
    1025 ? Ss 0:00 atd
    1026 ? Ss 0:00 cron
    1029 ? Ss 0:01 /usr/sbin/irqbalance
    1034 ? Ssl 0:00 whoopsie
    1091 ? Ssl 0:00 lightdm
    1216 tty1 Ss+ 0:00 /sbin/getty -8 38400 tty1
    1224 ? Sl 0:00 /usr/lib/accountsservice/accounts-daemon
    1241 ? Sl 0:00 /usr/sbin/console-kit-daemon --no-daemon
    1356 ? Sl 0:00 /usr/lib/upower/upowerd
    1447 ? Sl 0:00 /usr/lib/x86_64-linux-gnu/colord/colord
    1539 ? SNl 0:00 /usr/lib/rtkit/rtkit-daemon
    1723 ? Sl 0:00 /usr/lib/udisks/udisks-daemon
    1724 ? S 0:00 udisks-daemon: not polling any devices
    2077 ? Z 0:00 [lightdm] <defunct>
    2433 ? Z 0:00 [lightdm] <defunct>
    3491 ? S 0:00 [flush-8:0]
    4023 ? S 0:00 [kworker/u:14]
    4034 ? S 0:00 [migration/1]
    4035 ? S 0:00 [kworker/1:3]
    4036 ? S 0:00 [ksoftirqd/1]
    4037 ? S 0:00 [watchdog/1]
    4038 ? S 0:00 [migration/2]
    4040 ? S 0:00 [ksoftirqd/2]
    4041 ? S 0:00 [watchdog/2]
    4042 ? S 0:00 [migration/3]
    4043 ? S 0:00 [kworker/3:1]
    4044 ? S 0:00 [ksoftirqd/3]
    4045 ? S 0:00 [watchdog/3]
    4047 ? S 0:00 [irq/43-mei]
    4070 ? S 0:00 [kworker/3:0]
    4072 ? S 0:00 [kworker/1:0]
    4164 ? Ss 0:00 anacron -s
    4549 tty7 Ss+ 1:13 /usr/bin/X :0 -auth /var/run/lightdm/root/:0 -nolisten tcp vt7 -novtswitch
    4683 ? Sl 0:00 lightdm --session-child 12 47
    4718 ? Sl 0:00 /usr/bin/gnome-keyring-daemon --daemonize --login
    4729 ? Ssl 0:00 gnome-session --session=gnome-fallback
    4765 ? Ss 0:00 /usr/bin/ssh-agent /usr/bin/dbus-launch --exit-with-session gnome-session --session=gnome-fallback
    4768 ? S 0:00 /usr/bin/dbus-launch --exit-with-session gnome-session --session=gnome-fallback
    4769 ? Ss 0:00 //bin/dbus-daemon --fork --print-pid 5 --print-address 7 --session
    4779 ? Sl 0:01 /usr/lib/gnome-settings-daemon/gnome-settings-daemon
    4786 ? S 0:00 /usr/lib/gvfs/gvfsd
    4788 ? Sl 0:00 /usr/lib/gvfs//gvfs-fuse-daemon -f /home/bunyamin/.gvfs
    4797 ? Sl 0:00 /usr/lib/gnome-settings-daemon/gsd-printer
    4799 ? Sl 0:03 metacity
    4805 ? S 0:00 /usr/lib/x86_64-linux-gnu/gconf/gconfd-2
    4811 ? Sl 0:10 gnome-panel
    4814 ? S 0:00 syndaemon -i 2.0 -K -R -t
    4819 ? S<l 0:00 /usr/bin/pulseaudio --start --log-target=syslog
    4821 ? Sl 0:00 /usr/lib/dconf/dconf-service
    4826 ? Sl 0:00 /usr/lib/gnome-settings-daemon/gnome-fallback-mount-helper
    4828 ? Sl 0:06 nautilus -n
    4830 ? Sl 0:02 nm-applet
    4832 ? Sl 0:00 /usr/lib/policykit-1-gnome/polkit-gnome-authentication-agent-1
    4835 ? Sl 0:00 bluetooth-applet
    4851 ? S 0:00 /usr/lib/pulseaudio/pulse/gconf-helper
    4854 ? Sl 0:04 /usr/lib/indicator-applet/indicator-applet-complete
    4859 ? S 0:00 /usr/lib/gvfs/gvfs-gdu-volume-monitor
    4863 ? S 0:00 /usr/lib/gvfs/gvfs-gphoto2-volume-monitor
    4865 ? Sl 0:00 /usr/lib/gvfs/gvfs-afc-volume-monitor
    4871 ? S 0:00 /usr/lib/gvfs/gvfsd-trash --spawner :1.6 /org/gtk/gvfs/exec_spaw/0
    4874 ? Sl 0:00 /usr/lib/indicator-application/indicator-application-service
    4876 ? Sl 0:00 /usr/lib/indicator-datetime/indicator-datetime-service
    4878 ? Sl 0:00 /usr/lib/indicator-messages/indicator-messages-service
    4887 ? Sl 0:00 /usr/lib/indicator-printers/indicator-printers-service
    4888 ? Sl 0:00 /usr/lib/indicator-session/indicator-session-service
    4889 ? Sl 0:00 /usr/lib/indicator-sound/indicator-sound-service
    4906 ? S 0:00 /usr/lib/geoclue/geoclue-master
    4929 ? S 0:00 /usr/lib/ubuntu-geoip/ubuntu-geoip-provider
    4938 ? Sl 0:11 /usr/lib/gnome-applets/multiload-applet-2
    4939 ? Sl 0:01 /usr/lib/gnome-applets/cpufreq-applet
    4953 ? S 0:00 /usr/lib/gvfs/gvfsd-metadata
    4955 ? S 0:00 /usr/lib/gvfs/gvfsd-burn --spawner :1.6 /org/gtk/gvfs/exec_spaw/1
    4957 ? Sl 3:22 /usr/lib/firefox/firefox
    4973 ? Sl 0:00 /usr/lib/x86_64-linux-gnu/at-spi2-core/at-spi-bus-launcher
    4997 ? Sl 0:00 /usr/lib/gnome-disk-utility/gdu-notification-daemon
    5000 ? Sl 0:00 telepathy-indicator
    5007 ? Sl 0:00 /usr/lib/telepathy/mission-control-5
    5012 ? Sl 0:00 /usr/lib/gnome-online-accounts/goa-daemon
    5018 ? Sl 0:00 gnome-screensaver
    5019 ? Sl 0:01 zeitgeist-datahub
    5025 ? Sl 0:00 /usr/bin/zeitgeist-daemon
    5033 ? Sl 0:00 /usr/lib/zeitgeist/zeitgeist-fts
    5041 ? S 0:00 /bin/cat
    5052 ? Sl 0:08 /usr/bin/gnome-terminal -x /bin/sh -c '/home/bunyamin/Desktop/SSH Tunnel'
    5058 ? S 0:00 gnome-pty-helper
    5067 ? Sl 0:00 update-notifier
    5090 ? S 0:00 /usr/bin/python /usr/lib/system-service/system-service-d
    5130 ? Sl 0:00 /usr/lib/deja-dup/deja-dup/deja-dup-monitor
    5135 ? S 0:00 /bin/sh -c nice run-parts --report /etc/cron.daily
    5136 ? SN 0:00 run-parts --report /etc/cron.daily
    5358 pts/4 Ss 0:00 bash
    5482 ? S 0:00 [kworker/0:1]
    5487 ? S 0:01 [kworker/2:0]
    5550 ? Sl 1:15 /usr/lib/firefox/plugin-container /usr/lib/flashplugin-installer/libflashplayer.so -greomni /usr/lib/firefox/omni.ja 4957 true plugin
    5717 ? S 0:00 /usr/lib/cups/notifier/dbus dbus://
    5824 ? SN 0:00 /bin/sh /etc/cron.daily/update-notifier-common
    5825 ? SN 0:00 /usr/bin/python /usr/lib/update-notifier/package-data-downloader
    5872 ? Sl 0:00 /usr/lib/notify-osd/notify-osd
    5888 ? S 0:00 /sbin/udevd --daemon
    5889 ? S 0:00 /sbin/udevd --daemon
    5909 ? S 0:00 /sbin/dhclient -d -4 -sf /usr/lib/NetworkManager/nm-dhcp-client.action -pf /var/run/sendsigs.omit.d/network-manager.dhclient-eth1.pid -lf /var/lib/dhcp/dhclient-f5f0
    5912 ? S 0:00 /usr/sbin/dnsmasq --no-resolv --keep-in-foreground --no-hosts --bind-interfaces --pid-file=/var/run/sendsigs.omit.d/network-manager.dnsmasq.pid --listen-address=127.
    5975 pts/1 Ss+ 0:00 /bin/sh -c '/home/bunyamin/Desktop/SSH Tunnel'
    5976 pts/1 S+ 0:00 /bin/sh /home/bunyamin/Desktop/SSH Tunnel
    5977 pts/1 S+ 0:00 ssh -p443 [email protected] -L 8000:127.0.0.1:8000
    5980 ? Sl 0:00 /usr/lib/gvfs/gvfsd-http --spawner :1.6 /org/gtk/gvfs/exec_spaw/2
    6034 ? S 0:00 [kworker/u:0]
    6054 ? S 0:00 [kworker/2:2]
    6070 ? S 0:00 [kworker/0:3]
    6094 ? Sl 0:02 gedit /home/bunyamin/Desktop/a.html
    6101 ? S 0:00 [kworker/0:2]
    6130 pts/4 R+ 0:00 ps ax

    TinyProxy LOG

    connect to ad.adserverplus.com:80
    mx1.u4gf.com - - [17/Oct/2012 07:38:53] "GET http://ad.tagjunction.com/imp?Z=160x600&s=2959021&T=3&_salt=1516586745&B=12&m=2&u=http%3A%2F%2Fsunshinefelling.com%2Findex.php%3Fview%3Darticle%26catid%3D45%253Aplus-size-dresses%26id%3D7512%253A2012-01-25-22-42-00%26format%3Dpdf%26option%3Dcom_content%26Itemid%3D101&r=1 HTTP/1.0" - -
    bye
    bye
    bye
    connect to ad.adserverplus.com:80
    connect to ad.bharatstudent.com:80
    connect to ad.yieldmanager.com:80
    142.91.199.250.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=0x0&y=29&s=2913320&_salt=2228719469&B=12&m=2&r=1 HTTP/1.0" - -
    173.208.94.117 - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=0x0&y=29&s=3187816&_salt=462045326&B=12&m=2&r=1 HTTP/1.0" - -
    mx1.a54m.com - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=300x250&s=2887338&T=3&_salt=2925281520&B=12&m=2&u=http%3A%2F%2Fsecretskirt.com%2Findex.php%3Foption%3Dcom_contact%26view%3Dcontact%26id%3D1%26Itemid%3D95&r=1 HTTP/1.0" - -
    108.62.75.54.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=300x250&s=3218437&T=3&_salt=2939054384&B=12&m=2&u=http%3A%2F%2Fwww.vifinances.com%2Ffinance-investing%2Finsurance-investment%2Fis-life-insurance-investment-necessarily-the-way-to-go.html&r=1 HTTP/1.0" - -
    connect to ad.yieldmanager.com:80
    connect to ad.globe7.com:80
    bye
    connect to ad.globe7.com:80
    connect to ad.globe7.com:80
    bye
    173.208.94.22 - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=728x90&s=2922824&T=3&_salt=705371051&B=12&m=2&u=%3A%2F%2Fsunshinefelling.com%2Findex.php%3Fview%3Darticle%26catid%3D44%3Amature-womens-fashion%26id%3D6917%3A2012-01-25-22-37-27%26tmpl%3Dcomponent%26print%3D1%26layout%3Ddefault%26page%3D&r=1 HTTP/1.0" - -
    bye
    23.19.10.44.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globe7.com/st?ad_type=iframe&ad_size=160x600&section=3512129&pub_url=${PUB_URL} HTTP/1.0" - -
    connect to ad.yieldmanager.com:80
    bye
    142.91.189.27.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globe7.com/imp?Z=0x0&y=29&s=3660215&_salt=2921537966&B=12&m=2&r=1 HTTP/1.0" - -
    connect to ad.scanmedios.com:80
    bye
    142.91.217.158.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globaltakeoff.net/st?ad_type=iframe&ad_size=160x600&section=2077929&pub_url=${PUB_URL} HTTP/1.0" - -
    23.19.76.194.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=728x90&s=3127996&T=3&_salt=1952612979&B=12&m=2&u=http%3A%2F%2Fwww.oseey.com%2Fpure-core-watch%2Fcarbon-fiber-watch%2Fcarbon-monoxide-poisoning-awareness.html&r=1 HTTP/1.0" - -
    mx1.e6sb.com - - [17/Oct/2012 07:38:53] "GET http://ad.scanmedios.com/imp?Z=728x90&s=3522638&T=3&_salt=3444993091&B=12&m=2&u=http%3A%2F%2Fsunshinefelling.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D6013%3A2012-01-25-22-25-54%26catid%3D40%3Abig-beautiful-women-fashion%26Itemid%3D96&r=1 HTTP/1.0" - -
    connect to ad.tagjunction.com:80
    connect to ad.yieldmanager.com:80
    bye
    connect to ad.yieldmanager.com:80
    23.19.76.154.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/st?ad_type=iframe&ad_size=300x250&section=2569393 HTTP/1.0" - -
    connect to ads.creafi-online-media.com:80
    bye
    108.62.109.115.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=0x0&y=29&s=3315330&_salt=2385926515&B=12&m=2&r=1 HTTP/1.0" - -
    142.91.217.214.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=3634166&T=3&_salt=1590442300&B=12&m=2&u=http%3A%2F%2Fwealthterritory.com%2Findex.php%3Foption%3Dcom_mailto%26tmpl%3Dcomponent%26link%3DaHR0cDovL3dlYWx0aHRlcnJpdG9yeS5jb20vaW5kZXgucGhwP29wdGlvbj1jb21fY29udGVudCZ2aWV3PWFydGljbGUmaWQ9NDY2NDoyMDExLTA3LTA2LTEzLTI2LTUwJmNhdGlkPTQxOnNlcnZpY2VzJkl0ZW1pZ&r=1 HTTP/1.0" - -
    108.62.185.184.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ads.creafi-online-media.com/imp?Z=728x90&s=2885766&T=3&_salt=107120374&B=12&m=2&u=http%3A%2F%2Feconomicccore.com%2Findex.php%3Foption%3Dcom_content%26view%3Dcategory%26layout%3Dblog%26id%3D48%26Itemid%3D98%26limitstart%3D45&r=1 HTTP/1.0" - -
    bye
    bye
    bye
    connect to ad.adserverplus.com:80
    connect to ad.yieldmanager.com:80
    connect to ad.tagjunction.com:80
    bye
    108.62.75.252.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/st?ad_type=iframe&ad_size=728x90&section=3213387&pub_url=${PUB_URL} HTTP/1.0" - -
    bye
    connect to ad.tagjunction.com:80
    bye
    connect to ad.yieldmanager.com:80
    173.208.94.29 - - [17/Oct/2012 07:38:53] "GET http://ad.tagjunction.com/st?ad_type=iframe&ad_size=728x90&section=3006024&pub_url=${PUB_URL} HTTP/1.0" - -
    23.19.31.84.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=0x0&y=29&s=2586703&_salt=2905995697&B=12&m=2&r=1 HTTP/1.0" - -
    oxx-ef-Words.ipwagon.net - - [17/Oct/2012 07:38:53] "GET http://ad.tagjunction.com/imp?Z=0x0&y=29&s=3630499&_salt=4037530564&B=12&m=2&r=1 HTTP/1.0" - -
    142.91.185.53.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.tagjunction.com/imp?Z=0x0&y=29&s=3512541&_salt=1134875077&B=12&m=2&r=1 HTTP/1.0" - -
    connect to ad.globe7.com:80
    108.177.187.37.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=300x250&s=3168350&T=3&_salt=548860046&B=12&m=2&u=http%3A%2F%2Flifehealthyliving.com%2Findex.php%3Fview%3Darticle%26catid%3D34%253Ahealthy-food%26id%3D4681%253A2012-05-16-20-40-19%26tmpl%3Dcomponent%26print%3D1%26layout%3Ddefault%26page%3D%26option%3Dcom_content%26Itemid%3D53&r=1 HTTP/1.0" - -
    connect to ad.adserverplus.com:80
    bye
    connect to ads.creafi-online-media.com:80
    108.177.223.180.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=300x250&s=3331290&T=3&_salt=1270334669&B=12&m=2&u=http%3A%2F%2Fwww.vegls.com%2Faccident-attorneys-firms%2Fauto-accident-attorney%2Ffind-the-correct-auto-accident-attorney.html&r=1 HTTP/1.0" - -
    bye
    142.91.185.38.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globe7.com/st?ad_type=iframe&ad_size=160x600&section=818253 HTTP/1.0" - -
    connect to ad.yieldmanager.com:80
    bye
    bye
    bye
    108.62.75.230.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ads.creafi-online-media.com/st?ad_type=pop&ad_size=0x0&section=3323456&banned_pop_types=29&pop_times=1&pop_frequency=86400&pub_url=${PUB_URL} HTTP/1.0" - -
    connect to ad.adserverplus.com:80
    bye
    connect to ad.adserverplus.com:80
    bye
    142.91.217.194.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=300x250&s=3068801&T=3&_salt=1246107431&B=12&m=2&u=http%3A%2F%2Fmoodoffashionandbeauty.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D756%3A2011-07-13-13-13-43%26catid%3D36%3Afashion-clothes%26Itemid%3D55&r=1 HTTP/1.0" - -
    connect to ad.smxchange.com:80
    108.62.185.235.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/st?ad_type=iframe&ad_size=300x250&section=3307618&pub_url=${PUB_URL} HTTP/1.0" - -
    connect to ad.globe7.com:80
    bye
    connect to ad.yieldmanager.com:80
    bye
    bye
    connect to ad.adserverplus.com:80
    connect to ad.yieldmanager.com:80
    connect to ad.adserverplus.com:80
    connect to ad.yieldmanager.com:80
    108.177.168.183.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globe7.com/imp?Z=300x250&s=3582877&T=3&_salt=3271923155&B=12&m=2&u=http%3A%2F%2Fwomenhealthroad.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D5780%3A2011-12-12-16-56-53%26catid%3D40%3Ahealth-issues%26Itemid%3D96&r=1 HTTP/1.0" - -
    23.19.3.100.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=2895969&T=3&_salt=207805714&B=12&m=2&u=http%3A%2F%2Feconomicccore.com%2Findex.php%3Fview%3Darticle%26catid%3D46%253Aeconomic-news%26id%3D6079%253A2011-09-29-07-39-13%26format%3Dpdf%26option%3Dcom_content%26Itemid%3D96&r=1 HTTP/1.0" - -
    bye
    142.91.199.212.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/st?ad_type=iframe&ad_size=300x250&section=2956039&pub_url=${PUB_URL} HTTP/1.0" - -
    bye
    142.91.189.169.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=728x90&s=3004691&T=3&_salt=2747591679&B=12&m=2&u=http%3A%2F%2Fwww.qtsfinancial.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D5406%3Afinancial-statement-english-page%26catid%3D43%3Afinancial-analysis%26Itemid%3D99&r=1 HTTP/1.0" - -
    connect to ad.adserverplus.com:80
    23.19.31.58.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=0x0&y=29&s=3323560&_salt=3172064457&B=12&m=2&r=1 HTTP/1.0" - -
    connect to ad.adserverplus.com:80
    iei-ix-Words.ipwagon.net - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=728x90&s=3187813&T=3&_salt=1110944041&B=12&m=2&u=http%3A%2F%2Fwww.workinhouses.com%2Fhtml%2Fwallingford-ct-connecticuts-best-places-for-your-home.html&r=1 HTTP/1.0" - -
    connect to cookex.amp.yahoo.com:80
    173.208.94.116 - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/st?ad_type=iframe&ad_size=300x250&section=3213592&pub_url=${PUB_URL} HTTP/1.0" - -
    bye
    bye
    connect to ad.yieldmanager.com:80
    connect to ads.creafi-online-media.com:80
    bye
    108.62.75.99.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=160x600&s=2913321&T=3&_salt=333033369&B=12&m=2&u=http%3A%2F%2Ffashionstreetlight.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D28850%3A2011-12-20-12-59-39%26catid%3D45%3Afashion-accessories%26Itemid%3D101&r=1 HTTP/1.0" - -
    bye
    142.91.217.208.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://cookex.amp.yahoo.com/v2/cexposer/SIG=18kthu27g/*http%3A//ad.yieldmanager.com/imp?Z=300x250&s=2682517&T=3&_salt=1378331643&B=12&m=2&u=http%3A%2F%2Fwww.economicwindows.com%2Findex.php%3Fview%3Darticle%26catid%3D40%253Afinancial-info%26id%3D3854%253A2011-07-06-13-25-37%26format%3Dpdf%26option%3Dcom_content%26Itemid%3D96&r=1 HTTP/1.0" - -
    bye
    bye
    bye
    108.62.185.228.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=0x0&y=29&s=3315448&_salt=4241487555&B=12&m=2&r=1 HTTP/1.0" - -
    108.62.185.220.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ads.creafi-online-media.com/st?ad_type=iframe&ad_size=728x90&section=3269968 HTTP/1.0" - -
    connect to ad.tagjunction.com:80
    bye
    connect to ad.globe7.com:80
    bye
    142.91.185.47.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.tagjunction.com/st?ad_type=pop&ad_size=0x0&section=2958317&banned_pop_types=29&pop_times=1&pop_frequency=0&pub_url=${PUB_URL} HTTP/1.0" - -
    bye
    108.177.168.183.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globe7.com/imp?Z=160x600&s=3582877&T=3&_salt=1313872999&B=12&m=2&u=http%3A%2F%2Fwomenhealthroad.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D5753%3A2011-12-12-16-56-46%26catid%3D40%3Ahealth-issues%26Itemid%3D96&r=1 HTTP/1.0" - -
    connect to ad.tagjunction.com:80
    bye
    connect to ad.globe7.com:80
    bye
    connect to ad.adserverplus.com:80
    108.62.75.53.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.tagjunction.com/imp?Z=300x250&s=3127172&T=3&_salt=2152278771&B=12&m=2&u=http%3A%2F%2Fwww.oslims.com%2Ffashion-coffee%2Ffashion-slimming-coffee%2Fso-whats-your-poison-coffee-or-tea.html&r=1 HTTP/1.0" - -
    connect to ad.yieldmanager.com:80
    bye
    bye
    108.62.75.170.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/imp?Z=0x0&y=29&s=2909210&_salt=1773835502&B=12&m=2&r=1 HTTP/1.0" - -
    23.19.79.3.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.globe7.com/st?ad_type=iframe&ad_size=728x90&section=3571505&pub_url=${PUB_URL} HTTP/1.0" - -
    142.91.217.216.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=3630472&T=3&_salt=462936220&B=12&m=2&u=http%3A%2F%2Fwww.economicwindows.com%2Findex.php%3Fview%3Darticle%26catid%3D41%253Afinancial-services%26id%3D4854%253A2011-07-06-13-26-56%26tmpl%3Dcomponent%26print%3D1%26layout%3Ddefault%26page%3D%26option%3Dcom_content%26Itemid%3D97&r=1 HTTP/1.0" - -
    connect to ad.yieldmanager.com:80
    connect to ad.adserverplus.com:80
    connect to ad.yieldmanager.com:80
    bye
    connect to ad.yieldmanager.com:80
    bye
    connect to ad.yieldmanager.com:80
    142.91.189.176.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=3187822&T=3&_salt=325267799&B=12&m=2&u=http%3A%2F%2Feconomysea.com%2Findex.php%3Foption%3Dcom_mailto%26tmpl%3Dcomponent%26link%3DaHR0cDovL2Vjb25vbXlzZWEuY29tL2luZGV4LnBocD9vcHRpb249Y29tX2NvbnRlbnQmdmlldz1hcnRpY2xlJmlkPTYzNDk6MjAxMS0wOS0yOC0yMC0wNC0xOSZjYXRpZD00NzplY29ub21pYy1uZXdzJkl0ZW1pZD05Nw&r=1 HTTP/1.0" - -
    connect to ad.adserverplus.com:80
    142.91.190.240.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=2956040&T=3&_salt=3354730349&B=12&m=2&u=http%3A%2F%2Fdomarketings.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D279%3AWhy-Contractor-Leads-Are-Best-For-Getting-Ideal-Construction-Prospects%26catid%3D2%3Abusiness&r=1 HTTP/1.0" - -
    bye
    108.62.75.6.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=3323456&T=3&_salt=1244915826&B=12&m=2&u=http%3A%2F%2Fdomarketings.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D989%3AThe-Basics-of-Failure-Mode-and-Effective-Analysis%26catid%3D2%3Abusiness&r=1 HTTP/1.0" - -
    bye
    142.91.217.220.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=728x90&s=2921135&T=3&_salt=1337464905&B=12&m=2&u=http%3A%2F%2Financezone.com%2Findex.php%3Foption%3Dcom_content%26view%3Darticle%26id%3D7236%3A2011-09-05-19-56-54%26catid%3D49%3Acareer-banking%26Itemid%3D99&r=1 HTTP/1.0" - -
    bye
    connect to ad.yieldmanager.com:80
    108.62.178.229.rdns.ubiquityservers.com - - [17/Oct/2012 07:38:53] "GET http://ad.adserverplus.com/st?ad_type=iframe&ad_size=160x600&section=3168350&pub_url=${PUB_URL} HTTP/1.0" - -
    connect to ad.yieldmanager.com:80
    108.177.168.187.rdns.ubiquity.io - - [17/Oct/2012 07:38:53] "GET http://ad.smxchange.com/st?ad_type=iframe&ad_size=300x250&section=3285387&pop_nofreqcap=1&pub_url=${PUB_URL} HTTP/1.0" - -
    skg-wr-Words.ipwagon.net - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=0x0&y=29&s=3153972&_salt=3512711469&B=12&m=2&r=1 HTTP/1.0" - -
    bye
    connect to ad.yieldmanager.com:80
    bye
    connect to ad.yieldmanager.com:80
    mx1.u4gf.com - - [17/Oct/2012 07:38:53] "GET http://ad.yieldmanager.com/imp?Z=160x600&s=2959021&T=3&_salt=1516586745&B=12&m=2&u=http%3A%2F%2Fsunshinefelling.com%2Findex.php%3Fview%3Darticle%26catid%3D45%253Aplus-size-dresses%26id%3D7512%253A2012-01-25-22-42-00%26format%3Dpdf%26option%3Dcom_content%26Itemid%3D101&r=1 HTTP/1.0" - -
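    For what it's worth, the reverse-DNS names in that log (ubiquity.io, ubiquityservers.com, ipwagon.net) point to outside hosts rather than anything on this machine: the moment TinyProxy starts listening on a publicly reachable port, it is apparently being found and used as an open proxy by ad-traffic bots. If that reading is right, the fix is to make the proxy reachable only through the SSH tunnel. A minimal tinyproxy.conf sketch, assuming TinyProxy's stock Port/Listen/Allow directives and the port-8000 tunnel visible in the process list:

    # /etc/tinyproxy.conf -- a sketch, not a full config; 8000 matches the tunnel above.
    # Accept connections on the loopback interface only, and refuse any non-local client.
    Port 8000
    Listen 127.0.0.1
    Allow 127.0.0.1

    Bound to loopback like this, the proxy can be reached only via the ssh -L forwarding; if requests still show up in the log afterwards, they genuinely come from a local application.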

    Read the article

  • Portable scripting language for a multi-server admin?

    - by Aaron
    Please note: portable as in portableapps.com, not the traditional definition. Originally posted on stackoverflow.com; asking here at another user's suggestion. I'm a DBA and sysadmin, mostly for Windows machines running SQL Server. I'm looking for a programming/scripting language for Windows that doesn't require admin access or an installer, needing no install process other than expanding it into a folder. My intent is to have a language for automation on which I can standardize. Up to this point I've been using a combination of batch files and Unix shell, using sh.exe from UnxUtils, but it's far from a perfect solution. I've evaluated a handful of options, and all of them have at least one serious shortcoming. I have a strong preference for something open source or dual-licensed, but I'm more interested in finding the right tool than anything else. I'm not interested in anything that relies on Cygwin or Java, but at this point I'd be fine with something that needs .NET.

    Requirements:
    - Manageable footprint (1-100 files, under 30 MB installed)
    - Runs on Windows XP and Server (2003+)
    - No installer (exe, msi)
    - Works with external pipes, processes, and files
    - Support for MS SQL Server or ODBC connections

    Bonus points:
    - Open source
    - FFI for calling functions in native DLLs
    - GUI support (native or gtk, wx, fltk, etc.)
    - Linux, AIX, and/or OS X support
    - Dynamic, object-oriented and/or functional, interpreted or bytecode-compiled; interactive development
    - Able to package or compile scripts into executables

    So far I've tried:
    - Ruby: 148 MB on disk, 23000 files
    - Portable Python: 54 MB on disk, 2800 files
    - Strawberry Perl: 123 MB on disk, 3600 files
    - REBOL: great, except closed source and no MSSQL or ODBC in the free version
    - Squeak Smalltalk: great, except poor support for scripting

    ---- points of clarification ----

    Why all the limitations? I realize some of my criteria seem arbitrarily confining. It's primarily a product of my environment. I work as a SQL Server DBA and backup Unix admin at a division of a large company. In addition to nearly a hundred boxes running some version or another of SQL Server on Windows, I also support the SQL Server Express Edition installs on over a thousand machines in the field. Because of our security policies, I don't have login rights on every machine. Often enough an issue comes up, I'm given local Admin for some period of time, and it's on some box I've never touched and where I don't have my own environment set up yet. I may have temporary admin rights on the box, but I'm not the admin for the machine; I'm just the DBA. I've no interest in stepping on the toes of the Windows admins, nor do I want to take over any of their duties. If I bring up "installing" something, it suddenly becomes a matter of interest for Production Control and the Windows admins; if I'm copying up a script, no one minds. The distinction may not mean much to the readers, but if someone gets the wrong idea, I suddenly face a long wait and significant overhead before I can get the tool in place and the problem solved. That's why I want something that can be copied and run in the manner of a portable app (see the deployment sketch below).

    What about the small footprint? My company has three divisions, each in a different geographical location, and one of them is a new acquisition, so we have different production control and security policies in each division. I support our MSSQL databases in all three. The field machines are spread around the US, sometimes connecting to the VPN over very slow links. Installing Ruby using psexec has taken a long time over these connections, and in these instances the bigger time waster seems to be archives with thousands and thousands of files rather than their sheer size. You could say I'm spoiled by Unix, where the admins usually have at least some modern scripting language installed. I'd use PowerShell, but I don't know it well and, more importantly, it isn't everywhere I need to work. It's a regular occurrence that I need to write, deploy, and execute some script on short notice on some machine I've never logged in on. Since having Ruby or something similar installed on every machine I'll ever need to touch is effectively impossible given the approvals, time, and Windows admin labor needed, it makes more sense to find a solution that allows me to work on my own terms.
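    As a rough illustration, the copy-and-run deployment described above might look like the sketch below. The share, host, interpreter, and script names are all hypothetical, and it assumes Sysinternals PsExec and robocopy are available from the admin workstation:

    :: deploy-and-run.cmd -- copy a portable interpreter plus a script to a remote box, then run it.
    :: /MIR mirrors the folder (only changed files are sent); /Z uses restartable mode for slow links.
    robocopy \\adminbox\tools\plang \\dbserver01\c$\temp\plang /MIR /Z
    :: PsExec runs the interpreter remotely; -w sets the remote working directory.
    psexec \\dbserver01 -w c:\temp\plang c:\temp\plang\plang.exe check-backups.script

    The pattern rewards exactly what the requirements list asks for: a single folder with a handful of files transfers over a slow VPN link far faster than a 23000-file interpreter tree, almost regardless of total size.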

    Read the article

  • I'm trying to run some PHP scripts as CLI instead of over HTTP. How do I make them play nice?

    - by gnfti
    Hi everyone. I'm using some PHP scripts from FeedForAll to join together RSS feeds (RSSmesh) and display them as HTML (RSS2HTML). Because I intend to run these scripts fairly intensively and don't want the resulting HTTP requests and bandwidth to count towards my hosting quota, I am in the process of moving to running them on the web host's server in an umbrella PHP "batch" script, calling this script via cron (this is a Linux server, by the way).

    Here's a (working) sample request over HTTP:

    http://www.mydomain.com/a/rss2htmlcore/rss2html2.php?XMLFILE=http://www.mydomain.com/a/myapp/xmlcache/feed.xml&TEMPLATE=template.html

    This will produce the desired HTML output. An example of how I want this to work on the command line:

    /srv/customers/mycustomer#/mydomain.com/www/a/rss2htmlcore/rss2html2-cli.php /srv/customers/mycustomer#/mydomain.com/www/a/myapp/xmlcache/feed.xml /srv/customers/mycustomer#/mydomain.com/www/a/template.html

    This is with the correct shebang line added to "rss2html2-cli.php". I could just as well specify the executable ("/usr/local/bin/php") in the request; I doubt it makes a difference, because I am able to run another script (that I wrote myself) either way without problems. Now, RSS2HTML and RSSmesh are different in that, for starters, they include secondary files -- for example, both include an XML parser script -- and I suspect that this is where I am getting a bit in over my head. Right now I'm calling exec() from the "umbrella" batch script, like so:

    exec("/srv/customers/mycustomer#/mydomain.com/www/a/rss2htmlcore/rss2html2-cli.php /srv/customers/mycustomer#/mydomain.com/www/a/myapp/xmlcache/feed.xml /srv/customers/mycustomer#/mydomain.com/www/a/template.html", $output)

    But no output is being produced. What's the best way to go about this, and what "gotchas" should I keep in mind?

    - Is exec() the right way to approach this? It works fine for the other (simple) script, but that one writes its own output. For this I want to get the output and write it to a file from within the umbrella script if possible. I've also tried output buffering, but to no avail.
    - Do I need to pay attention to anything specific with regard to the includes? Right now they're specified in the scripts as include_once("FeedForAll_XMLParser.inc.php"); and the specified files are indeed in the same folder.

    Further info:
    - This is a Linux server.
    - I have no direct access to the shell, so I can't test things directly on a command line; everything goes via crontab.
    - I will admit that support for the FeedForAll scripts leaves a lot to be desired, but I'd like to keep using them if at all possible, if only because I know them and have been using them for a while. I have looked into SimplePie, but the FFA scripts do some things that I've seen no obvious SimplePie solutions for, like limiting the number of items per individual feed (RSSmesh) or limiting the description length (RSS2HTML).
    - Yahoo! Pipes is out; they cache their data for too long for my application.

    Should you want to take a look at the code, here are the scripts as txt files. RSS2HTML2 and RSSmesh are the FeedForAll scripts, FeedForAll_XMLParser... is the included parser. Note that I have not yet amended these to handle $argv etc.; I have, however, done so in "scraper-universal-rss-cli", which works fine with CLI. If anyone has any thoughts to share on this, it would be very much appreciated. Thank you in advance.
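    Two things commonly bite in exactly this setup: exec() shows nothing when the child process writes its errors to stderr (append 2>&1 to capture them), and under cron the working directory is not the scripts' folder, so any relative paths the scripts touch at runtime resolve somewhere unexpected. Below is a sketch of the umbrella script with both handled; the /srv/... paths are the ones from the question, output.html is a made-up destination, and it assumes rss2html2-cli.php prints its result to stdout:

    <?php
    // umbrella.php -- a sketch, not FeedForAll-endorsed code; adjust paths as needed.
    $php      = '/usr/local/bin/php';
    $script   = '/srv/customers/mycustomer#/mydomain.com/www/a/rss2htmlcore/rss2html2-cli.php';
    $xml      = '/srv/customers/mycustomer#/mydomain.com/www/a/myapp/xmlcache/feed.xml';
    $template = '/srv/customers/mycustomer#/mydomain.com/www/a/template.html';

    // Run from the scripts' own folder so any relative paths they use resolve there.
    chdir(dirname($script));

    // Invoke the interpreter explicitly and fold stderr into stdout (2>&1) so error
    // messages land in $output instead of disappearing under cron.
    $cmd = escapeshellarg($php) . ' ' . escapeshellarg($script) . ' '
         . escapeshellarg($xml) . ' ' . escapeshellarg($template) . ' 2>&1';
    exec($cmd, $output, $status);

    if ($status !== 0) {
        // Record failures; a silently failing exec() is otherwise invisible from crontab.
        error_log("rss2html2-cli exited with status $status: " . implode("\n", $output));
    } else {
        // Capture the child's printed HTML from within the umbrella script.
        file_put_contents('output.html', implode("\n", $output) . "\n");
    }

    exec() itself is a reasonable choice here: its second argument collects everything the child prints, and the third gives the exit status, so the umbrella script does not have to rely on the child writing files of its own.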

    Read the article
