Search Results

Search found 35354 results on 1415 pages for 'joe even'.

Page 683/1415 | < Previous Page | 679 680 681 682 683 684 685 686 687 688 689 690  | Next Page >

  • Tunlr Gives Non-US Residents Access to Hulu, Netflix, and More

    - by Jason Fitzpatrick
    If you’re outside the US market and looking to enjoy US streaming services like Hulu, Netflix, and more, Tunlr is a free and simple service that will get you connected. Unlike other tools that are more expensive (both in price and in hardware/bandwidth overhead) like VPN services, Tunlr doesn’t set up a full tunnel but instead serves as an alternative DNS server that allows you to access previously blocked content. From the Tunlr FAQ: Tunlr does not provide a virtual private network (VPN). Tunlr is a DNS (domain name system) unblocking service. We’re using sophisticated technologies (a.k.a. the Tunlr Secret Sauce ©) to re-address certain data envelopes, tricking the receiver into thinking the envelope originated from within the U.S. For these data envelopes, Tunlr is transparently creating a network tunnel from your location to our U.S.-based servers. Any data that’s not directly related to the video or music content providers which Tunlr supports is not only left untouched, it’s also not even routed through Tunlr. Hit up the link below for more information about the service, including how to set it up on various operating systems, portable devices, and gaming consoles. Tunlr [via gHacks]

    Read the article

  • The Open Data Protocol

    - by Bobby Diaz
    Well, day 2 of the MIX10 conference did not disappoint.  The keynote speakers introduced the preview release of IE9, which looks really cool and quick, and Visual Studio 2010 RC that is scheduled to RTM on April 12th.  It seemed to have a lot of improvements aimed at making developers more productive.  Here are the current links to these two offerings: Internet Explorer 9 – Platform Preview Visual Studio 2010 and .NET 4 – Release Candidate While both of these were interesting, the demos that really blew me away today centered around the work being done with The Open Data Protocol, or OData for short!  OData is a recommended standard being pushed by Microsoft that uses a REST-based interface to interact with various types of data in a uniform manner.  Data producers then provide the data to consumers in either ATOM or JSON format, as requested by the client application. The OData SDK contains client and server libraries for many of the popular languages in use today, including .NET, Java, PHP, Objective-C and JavaScript, so you can consume or even produce your own OData services.  More information can be found using the following links: OData.org How to navigate an OData compliant service Query Functions (WCF Data Services) Netflix has made available one of the first live OData services by exposing their entire movie catalog.  You can browse and query using URLs similar to the following: http://odata.netflix.com/ http://odata.netflix.com/Catalog/Genres('Horror')/CatalogTitles http://odata.netflix.com/Catalog/CatalogTitles?$filter=startswith(Title/Regular,%20'Star%20Wars')&$orderby=Title/Regular So now I just need to find a reason to start using OData in a real project! Enjoy!
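    Since an OData feed is, by default, just an Atom document over HTTP, you don't strictly need the SDK to try it out. Here is a minimal, illustrative C# sketch (not from the post) that pulls the Netflix catalog query above with WebClient and prints the entry titles; the element handling is an assumption based on the Atom format OData uses:

        using System;
        using System.Net;
        using System.Xml.Linq;

        class ODataSketch
        {
            static void Main()
            {
                // Same query as the last URL above.
                var feedUrl = "http://odata.netflix.com/Catalog/CatalogTitles?" +
                              "$filter=startswith(Title/Regular,%20'Star%20Wars')&$orderby=Title/Regular";

                var xml = new WebClient().DownloadString(feedUrl);

                XNamespace atom = "http://www.w3.org/2005/Atom";
                foreach (var entry in XDocument.Parse(xml).Descendants(atom + "entry"))
                    Console.WriteLine(entry.Element(atom + "title").Value);
            }
        }

    The OData client libraries add strongly typed proxies and LINQ support on top of this, but the raw feed is a handy way to poke at a service before committing to the SDK.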

    Read the article

  • How to Make Your Verizon FIOS Router 1000% More Secure

    - by The Geek
    If you’ve just switched to Verizon FIOS and they’ve installed the new router in your house, there’s just one problem: it’s set to use lousy WEP encryption by default, instead of the much more secure WPA2. Here’s how to fix it. The problem with WEP encryption is that it can be cracked really easily—a skilled hacker can do it in a few minutes, and even an unskilled geek can do it in just a little more time with the right tools. Once they’ve done that, they can leech off your internet connection and do anything they want—including illegal stuff coming from your network. Note: if you are using an old Nintendo DS connected to the internet, it usually only supports WEP encryption, so you may not want to do this.

    Read the article

  • Geek it Up

    - by BuckWoody
    I’ve run into a couple of kinds of folks in IT. Some really like technology a lot – a whole lot – and others treat it more as a job. For those of you in the second camp, you can go back to your drab, meaningless jobs – this post is for the first group. I’m a geek. Not a little bit of a geek, a really big one. I love technology, I get excited about science and electronics in general, and I read math books when I don’t have to. Yes, I have a Star Trek item or two around the house. My daughter is fluent in both Monty Python AND Serenity. I totally admit it. So if you’re like me (OK, maybe a little less geeky than that), then go for it. Put those toys in your cubicle, wear your fan shirt, but most of all, geek up your tools. No, this isn’t an April Fool’s post – I really mean it. I’ve noticed that when I get the larger monitor, better mouse, cooler keyboard, I LIKE coming to work. It’s a way to reward yourself – I’ve even found that it makes work easier if I have the kind of things I enjoy around to work with. So buy that old “clicky” IBM keyboard, get three monitors, and buy a nice headset so that you can set all of your sounds to Monty Python WAVs. And get to work.

    Read the article

  • Object model design: collections on classes

    - by Luke Puplett
    Hi all, Consider Train.Passengers, what type would you use for Passengers where passengers are not supposed to be added or removed by the consuming code? I'm using .NET Framework, so this discussion would suit .NET, but it could apply to a number of modern languages/frameworks. In the .NET Framework, List<T> is not supposed to be publicly exposed. There's Collection<T> and ICollection<T>, and guidance, which I tend to agree with, is to return the closest concrete type down the inheritance tree, so that'd be Collection<T> since it is already an ICollection<T>. But Collection<T> has read/write semantics and so possibly it should be a ReadOnlyCollection<T>, but it's arguably common sense not to alter the contents of a collection that you don't have intimate knowledge about, so is it necessary? And it requires extra work internally and can be a pain with (de)serialization. At the extreme ends I could just return Person[] (since LINQ now provides much of the benefits that previously would have been afforded by a more specified collection) or even build a strongly-typed PersonCollection or ReadOnlyPersonCollection! What do you do? Thanks for your time. Luke
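    For reference, one common shape for this in C# (a sketch of one of the options discussed above, not a recommendation from the question) is to keep a private List<Person> and hand out a ReadOnlyCollection<Person> wrapper, so callers can enumerate but not add or remove; the Train/Person names come from the question, the rest is illustrative:

        using System.Collections.Generic;
        using System.Collections.ObjectModel;

        public class Person
        {
            public string Name { get; set; }
        }

        public class Train
        {
            private readonly List<Person> passengers = new List<Person>();

            // Read-only view for consumers; it reflects later changes to the underlying list.
            public ReadOnlyCollection<Person> Passengers
            {
                get { return passengers.AsReadOnly(); }
            }

            // Mutation stays on the owning type.
            public void Board(Person person)
            {
                passengers.Add(person);
            }
        }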

    Read the article

  • If some standards apply when "it depends" then should I stick with custom approaches?

    - by Travis J
    If I have an unconventional approach which works better than the industry standard, should I just stick with it even though in principle it violates those standards? What I am talking about is referential integrity for relational database management systems. The standard for enforcing referential integrity is to CASCADE delete. In practice, this is just not going to work all the time. In my current case, it does not. The alternative suggested is to either change the reference to NULL, DEFAULT, or just to take NO ACTION - usually in the form of a "soft delete". I am all about enforcing referential integrity. Love it. However, sometimes it is just not practical to apply all of the standards fully. My approach has been to slightly abandon a small part of one of those practices, which is the part about leaving "hanging references" around. Oops. The trade-off is plentiful in this situation, I believe. Instead of having deprecated data in the production database, a splattering of "soft delete" logic all across my controllers (and views sometimes, depending on how far down the chain the soft delete occurred), and the prospect of queries taking longer and longer - instead of all that - I now have a recycle bin and centralized logic. The only trade-off is that I must explicitly manage the possibility of "hanging references", which can be done through generics with one class. Any thoughts?
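    The post does not show the generic class it mentions, but a purely hypothetical C# sketch of the idea might look like the following: a reusable helper that, given the set of ids still "live" (i.e. not in the recycle bin), nulls out any reference that now dangles. Every name here is invented for illustration:

        using System;
        using System.Collections.Generic;

        public class HangingReferenceScrubber<T>
        {
            private readonly Func<T, int?> getReferenceId;
            private readonly Action<T> clearReference;

            public HangingReferenceScrubber(Func<T, int?> getReferenceId, Action<T> clearReference)
            {
                this.getReferenceId = getReferenceId;
                this.clearReference = clearReference;
            }

            // Null out references whose target id has been moved to the recycle bin.
            public void Scrub(IEnumerable<T> items, HashSet<int> liveIds)
            {
                foreach (var item in items)
                {
                    var id = getReferenceId(item);
                    if (id.HasValue && !liveIds.Contains(id.Value))
                        clearReference(item);
                }
            }
        }

    Usage would be one scrubber per reference type, e.g. new HangingReferenceScrubber<Order>(o => o.CustomerId, o => o.CustomerId = null), keeping the "hanging reference" handling in one place instead of scattered through controllers.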

    Read the article

  • Word 2010 Navigation Pane and more

    - by Daniel Moth
    I have been using Office 2010 since Beta1 and have not looked back since. I am currently on an internal RC, but will upgrade tomorrow to the RTM version. There are a plethora of new productivity features, and for Word 2010 the one that overshadows everything else, IMO, is the Navigation Pane. I could spend time describing it here, but I'll never be able to cover it more thoroughly than what the product team has on their blog post. You enable it via the "Navigation Pane" checkbox in the "Show" group of the "View" tab on the Word ribbon. Even if you have come across this new Word 2010 feature, trust me, you will learn something more about it; you will thank me later. Go learn how to make the most of the new Navigation Pane. As an aside, there are many new benefits in PowerPoint 2010 too, my favorite being support for sections. Not to leave Excel 2010 out, you should check Excel's integration with HPC Server. Comments about this post welcome at the original blog.

    Read the article

  • Big Success for the 2012 edition of the Oracle EMEA Cloud CRM Partner Community Forum!

    - by Richard Lefebvre
    The 2012 edition of the Oracle EMEA Cloud Partner Community Forum took place on March 28-29 in Madrid. 100 participants from all over Europe had a chance to interact with the Oracle Cloud CRM Product Management team about multiple subjects such as Oracle Cloud CRM and Social Network solutions strategy, the RightNow acquisition, Fusion CRM business opportunities for partners, etc. During his opening keynote, Anthony Lye (Oracle Senior Vice-President and head of Oracle CRM) presented the current Fusion CRM business status, disclosed the overall Oracle CRM product strategy and responded to many questions from the audience. Later on that day, 8 Oracle ISVs presented their Oracle Cloud CRM add-ons and highlighted the value that System Integrators can benefit from as part of a Cloud CRM project. After a very friendly networking dinner in a Spanish restaurant, the second day was dedicated to Fusion CRM, with a deeper dive into its major components (Sales, Sales Planning, Marketing) including the Fusion Composers. Briefings on Oracle Consulting Services' dedicated Fusion CRM offerings and Fusion CRM Partner Programs concluded the day and the event. All participants rated the event as "good" to "excellent" and mentioned that it was meaningful for them to plan their Oracle Cloud CRM based business in the near future. We look forward to organising a similar event next year and to welcoming even more partners! Richard Lefebvre

    Read the article

  • It’s About You: Tell Microsoft How They’re Doing!

    - by juanlarios
    Every fall and spring, a survey goes out to a few hundred thousand IT folk in Canada asking what they think of Microsoft as a company. The information they get from this survey helps them understand what problems and issues you’re facing and how they can do better. The team at Microsoft Canada takes the input they get from this survey very seriously. Now I don’t know who of you will get the survey and who won’t, but if you do find an email in your inbox from “Microsoft Feedback” with an email address of “ [email protected] ” and a subject line “Help Microsoft Focus on Customers and Partners” from now until April 13th — it’s not a hoax or phishing email. Please open it and take a few minutes to tell them what you think. This is your chance to get your voice heard: If they’re doing well, feel free to pile on the kudos (they love positive feedback!) and if you see areas they can improve, please point them out so they can make adjustments (they also love constructive criticism!). The Microsoft team would like to thank you for all your feedback in the past — to those of you who have filled out the survey and sent them emails. Thank you to all who engage with them in so many different ways through events, the blogs, online and in person. You are why they do what they do and they feel lucky to work with such a great community! One last thing - even if you don’t get the survey you can always give the team feedback by emailing us directly through the Microsoft Canada IT Pro Feedback email address. They want to make sure they are serving you in the best possible way. Tell them what you want more of. What should they do less of or stop altogether? How can they help? Do you want more cowbell? Let them know through the survey or the email alias. They love hearing from you!

    Read the article

  • Tab Sweep: FacesMessage enhancements, Look up thread pool resources, JQuery/JSF integration, Galleria, ...

    - by arungupta
    Recent Tips and News on Java, Java EE 6, GlassFish & more : • Fixing remote GlassFish server errors on NetBeans (Igor Cardoso) • FacesMessage Enhancements (PrimeFaces) • How to create and look up thread pool resource in GlassFish (javahowto) • Jersey 1.12 is released (Jakub Podlesak) • VisualVM problem connecting to monitor Glassfish (Raymond Reid) • JSF 2.0 JQuery-JSF Integration (John Yeary) • JDBC-ODBC Bridge Example (John Yeary) • The Java EE 6 Example - Gracefully dealing with Errors in Galleria - Part 6 (Markus Eisele) • Logout functionality in Java web applications (JavaOnly) • LDAP PASSWORD POLICIES AND JAVAEE (Ricky's Hodgepodge) • Java User Groups Promote Java Education (java.net Editor's Daily Blog) • JavaEE Revisits Design Patterns: Aspects (Interceptor) (Developer Chronicles) • Java EE 6 Hand-on Workshop @ IIUI (Shahzad Badar) • javaee6-crud-example (Arjan Tims) • Sample CRUD application with JSF and RichFaces (Mark van der Tol) • 5 useful methods JSF developers should know (Java Code Geeks) Here are some tweets from this week ... Almost 9000 Parleys views at the #JavaEE6 #Devoxx talk I did with @BertErtman. Not even made available for free yet! #JavaEE6 is hot :-) Sent three proposals for Øredev, about #JavaEE6, #OSGi and a case study about Leren-op-Maat (OSGi in the cloud) together with @m4rr5 [blog] The Java EE 6 #Example - Gracefully dealing with #Errors in #Galleria - Part 6 http://t.co/Drg1EQvf #javaee6 Tomorrow, there is a session about Java EE6 #javaee6 at islamia university #bahawalpur under #pakijug.about 150 students going to attend it.

    Read the article

  • Ur/Web new purely functional language for web programming?

    - by Phuc Nguyen
    I came across the Ur/Web project during my search for web frameworks for Haskell-like languages. It looks like a very interesting project done by one person. Basically, it is a domain-specific purely functional language for web programming, taking the best of ML and Haskell. The syntax is ML, but there are type classes and monads from Haskell, and it's strictly evaluated. Server-side code is compiled to native code, client-side to JavaScript. See the slides and FAQ page for other advertised advantages. Looking at the demos and their source code, I think the project is very promising. The latest version is dated something like 20110123, so it seems to be under active development at this time. My question: Has anybody here had any further experience with it? Are there problems/annoyances compared to Haskell, apart from ML's slightly more verbose syntax? Even if it's not well known yet, I hope more people will know of it. OMG this looks very cool to me. I don't want this project to die!!

    Read the article

  • Go Big or Go Home

    - by Justin Kestelyn
    The Oracle Develop conference (#oracledevelop10), being co-located for the first time ever with JavaOne in San Francisco, is guaranteed to be the ultimate rush for developers this year. Where else can you go to learn about, interact with, and meet fellow devotees of the entire Oracle Development stack (welcome, Oracle Solaris)? This will also be the first time that the community space traditionally located at Oracle OpenWorld - and hosted by Oracle Technology Network, as always - will be present at the "developer" conference during this busy week. So, Oracle OpenWorld's loss is Oracle Develop's gain. And what a community space it will be: nearly 4,000 square feet for meeting space, contests and give-aways, consumption of various beverages, special speakers (Oracle ACEs among them, no doubt), and video-casting. The entire Oracle Technology Network crew will be on hand to "facilitate" your experience, of course. Even better, you can rub shoulders and share war stories with attendees from that "other" conference, JavaOne. (You have access to both conferences as a single package, so you may be having a conversation with yourself.) We call the whole enchilada "The Zone". As time goes on, we'll bring you more news about the activities described above, as well as OTN Night (which proves to be more raucous than ever), technical sessions and keynotes not to be missed, the unconference/open sessions, things to do at night, and more. In the meantime, stay in touch with us via Twitter or Oracle Mix.

    Read the article

  • Ranking hit after site migration

    - by Ben
    I migrated my site from its old domain over a month ago. I followed Google Webmaster Tools completely, including 301 redirects from every existing URL to the new domain, and then submitting a change of address. Traffic continued as normal, but then a few days after submitting the change of address traffic plummeted to about 20-30% of what it was previously. Most of my traffic comes from organic search, and I can see that for the keywords I had targeted before and performed well with, I am now ranking much, much lower. In some cases for low competition keywords I've only lost a few places; for higher competition terms I have really suffered. This has started to pick up a bit (for one of my keywords I have risen from 195 to 100 in the last week), but it seems to be a very slow process. How seamless is this process normally? I was under the impression that this would not affect my rankings too severely, but it has now been a month since the move and recovery seems to be very slow, if at all. Is it likely that I've missed something? The only change is that I have moved what was the home page to be more of a sub-page, and now in its place is a magazine-style home page. I understand that links to the old site will now be pointing to the latter, which means that rankings for some keywords attributed to the old home page will take a hit, but even on other pages that seem to fit in exactly the same page structure as the previous site I have seen a drop in rankings.

    Read the article

  • Killer content for my Kindle - The Economist with no need for an iPad - yipeee!

    - by Liam Westley
    I admit it, I was jealous of someone's iPad. They were reading The Economist, for free, as they were a print subscriber. I'm a print subscriber too. However, I don't have an iPad or an iPhone, just an Android phone and a Kindle. As soon as I got the Kindle, I looked up how to get The Economist on it. £9.99 per month. Hmmm, twice as much again as my print subscription, and I wanted to maintain the print subscription. No way, Amazon. Fortunately some nice person wrote similar comments on The Economist subscription for Kindle, but added a very important additional nugget of information: there is no need, as a print subscriber you can just use the free Calibre e-book creation tool anyway. So I downloaded it, searched for The Economist online 'recipe', entered my login name and password (part of my print subscription) and off went Calibre to screen-scrape every single article from the Christmas 2010 issue into a .mobi file, complete with front cover image and full indexing. It's wonderful. Truly wonderful. Every section individually indexed, with each article separated and all inline images preserved. It even feels wonderfully retro, back to the days when The Economist only used black and white images. So many thanks to the guys behind Calibre and The Economist recipe creators. Finally, I have my essential Kindle content that I've been waiting for.

    Read the article

  • Is Tax Localization a good use for Workflow Foundation?

    - by JustinDoesWork
    Scenario: We have both WinForms and MVC code that is being used to work on a nationwide multi-user platform that does lots of logistics for lots of users. Tax rules change per state and even per city or county. These tax rules make a huge difference for our industry. The other issue is that rules can change based on legislation. The system will have to handle cases where before a date it works one way and then differently after that date. This changeover will need to be entered into the system and tested before that date comes. Proposed Solution: Use Workflow Foundation to create a time-based system where our users can change and add rules that change the way taxes are calculated. Question: I have not used Workflow Foundation, and searching has returned books to look at but not a lot of examples of people using this technology successfully. Is my scenario a good use of Workflow Foundation? (I think so.) If you have any experience with Workflow Foundation, any tips on making this work well?
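    Independent of whether Workflow Foundation is the right host for it, the date-based changeover described above boils down to effective-dated rules: each rule carries the date it takes effect, and the applicable rate is the newest rule on or before the transaction date. A minimal, illustrative C# sketch (all names and members invented, not from the question):

        using System;
        using System.Collections.Generic;
        using System.Linq;

        public class TaxRule
        {
            public string Jurisdiction { get; set; }   // e.g. a state/county/city key
            public DateTime EffectiveFrom { get; set; }
            public decimal Rate { get; set; }
        }

        public static class TaxRules
        {
            // Pick the latest rule that was already in effect on the sale date.
            public static decimal RateFor(IEnumerable<TaxRule> rules, string jurisdiction, DateTime saleDate)
            {
                return rules
                    .Where(r => r.Jurisdiction == jurisdiction && r.EffectiveFrom <= saleDate)
                    .OrderByDescending(r => r.EffectiveFrom)
                    .First()
                    .Rate;
            }
        }

    Future-dated rules can be entered and tested ahead of time because they simply don't match until the sale date reaches their EffectiveFrom.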

    Read the article

  • Procedural Generation of tile-based 2d World

    - by Matthias
    I am writing a 2d game that uses tile-based top-down graphics to build the world (i.e. the ground plane). Made manually, this works fine. Now I want to generate the ground plane procedurally at run time. In other words: I want to place the tiles (their textures) randomised on the fly. Of course I cannot create an endless ground plane, so I need to restrict how far from the player character (on which the camera focuses) I procedurally generate the ground floor. My approach would be like this: I have a 2d grid that stores all tiles of the floor at their correct x/y coordinates within the game world. When the player moves the character, and therefore also the camera, I constantly check whether there are empty locations in my x/y map within a max. distance from the character, i.e. cells in my virtual grid that have no tile set. In such a case I place a new tile there. Therefore the player would always see the ground plane without gaps or empty spots. I guess that would work, but I am not sure whether that would be the best approach. Is there a better alternative, maybe even a best-practice for my case?
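    For what it's worth, a minimal sketch of that fill-in approach in C# might look like the following; the type and member names are invented for illustration. Seeding the random number generator from the cell coordinates makes the result deterministic, so a cell gets the same tile even if it is generated again later:

        using System;
        using System.Collections.Generic;

        public class TileMap
        {
            private readonly Dictionary<Tuple<int, int>, int> tiles =
                new Dictionary<Tuple<int, int>, int>();
            private readonly int tileTextureCount;

            public TileMap(int tileTextureCount)
            {
                this.tileTextureCount = tileTextureCount;
            }

            // Fill any empty cell within `radius` cells of the player's cell.
            public void FillAround(int playerX, int playerY, int radius)
            {
                for (int x = playerX - radius; x <= playerX + radius; x++)
                {
                    for (int y = playerY - radius; y <= playerY + radius; y++)
                    {
                        var cell = Tuple.Create(x, y);
                        if (!tiles.ContainsKey(cell))
                        {
                            // Position-seeded random: the same cell always gets the same tile.
                            var rng = new Random(x * 73856093 ^ y * 19349663);
                            tiles[cell] = rng.Next(tileTextureCount);
                        }
                    }
                }
            }
        }

    Calling FillAround from the camera/character update keeps the visible area free of gaps without ever materialising an endless plane.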

    Read the article

  • Access the Options for Your Favorite Extensions Easier in Firefox

    - by Asian Angel
    Would you prefer a much quicker way to access the options for your favorite extensions in Firefox? Now you can skip opening the Add-ons Manager Tab and access them directly by menu using the Extension Options Menu add-on for Firefox. There is a toolbar button available if you prefer an even quicker method for accessing the options for extensions. Left clicking on the toolbar button displays a menu as shown here and right clicking automatically opens the Add-ons Manager Tab. The options are simple to work with…select or deselect display methods to best suit your needs. Note: Works with Firefox 3.7a5pre – 4.0.* Install Extension Options Menu Add-on (Mozilla Add-ons) [via Ghacks]

    Read the article

  • vsftpd: chroot_local_user causes GNU/TLS-error

    - by akrosikam
    Distro: Ubuntu 12.04.2 Server 32-bit
    FTP server: vsftpd 2.3.5 (from the default "main" repository)
    Problem: Since upgrading from Ubuntu 10.04 to Ubuntu 12.04 (nothing changed on the client side), vsftpd has refused to make chroot jails with the "chroot_local_user" directive on FTP(e/i)S connections. Here's my vsftpd.conf:
        anonymous_enable=NO
        local_enable=YES
        write_enable=YES
        local_umask=022
        dirmessage_enable=YES
        xferlog_enable=YES
        xferlog_std_format=YES
        ftpd_banner=How are you gentlemen.
        listen=YES
        pam_service_name=vsftpd
        userlist_enable=YES
        userlist_deny=NO
        tcp_wrappers=YES
        connect_from_port_20=YES
        ftp_data_port=20
        listen_port=21
        pasv_enable=YES
        pasv_promiscuous=NO
        pasv_min_port=4242
        pasv_max_port=4252
        pasv_addr_resolve=YES
        pasv_address=your.domain.com
        ssl_enable=YES
        allow_anon_ssl=NO
        force_local_logins_ssl=YES
        force_local_data_ssl=YES
        ssl_tlsv1=YES
        ssl_sslv2=NO
        ssl_sslv3=NO
        rsa_cert_file=/home/maw/ssl_ftp_test/vsftpd.pem
        rsa_private_key_file=/home/maw/ssl_ftp_test/vsftpd.pem
        debug_ssl=YES
        log_ftp_protocol=YES
        ssl_ciphers=HIGH
        chroot_local_user=NO
    How to reproduce:
    1. Have a working SSL/TLS-secured vsftpd configuration (I suggest one similar to the above) ready.
    2. Try to connect with an FTP user client and upload some files. With my setup, the above config works well at this point.
    3. Edit /etc/vsftpd.conf and set chroot_local_user= to YES. Make sure that chroot_list_enable= and/or chroot_list_file= are not set; comment them out if they are. Save and exit.
    4. Run sudo restart vsftpd (or sudo service vsftpd restart if you like) in a terminal.
    5. Try to connect with an FTP user client. You should see a message more or less like this: GnuTLS error -15: An unexpected TLS packet was received.
    This is an issue for me, as I do not want FTP sessions to be able to list files outside the user's home folder. I have checked with several client-side apps, and I get the same results with every one of them. FileZilla is not so good regarding cipher methods nowadays, but as I am able to make an FTP(e)S connection over TLS (as long as chroot'ing is disabled and ssl_ciphers is set to HIGH), I have a feeling ciphers are not the issue this time, and that I won't find the answer by tweaking configs on the client side. My vsftpd.log stays empty, even though debug_ssl and log_ftp_protocol are enabled, so there is no info there either.

    Read the article

  • Calculating Delta time , what is wrong?

    - by SteveL
    For two days now I have been trying to calculate the correct delta time for my game, and I am starting to go crazy since I have tried all the solutions I found on the first five Google pages... What is wrong? I can't get the correct delta time; whatever I tried is just not working. The delta goes from 1 to 4, then back to 1, then to 3, even if I take the average delta over many frames. Plus, the game runs much faster (I mean the movement) on slow devices than on fast ones. The game runs on Android, so spikes between frames are expected. My code is this:

        void Game::render()
        {
            timesincestart = getTimeMil();
            _director->Render();
            _director->Update();

            // Time spent inside Render/Update this frame; usually it's about 5 milliseconds.
            float dif = (getTimeMil() - timesincestart);

            // Average the last 20 samples into `delta`.
            lastcheck++;
            sumdelta += dif;
            if (lastcheck > 20)
            {
                sumdelta = sumdelta / 20;
                delta = sumdelta;
                sumdelta = 0;
                lastcheck = 0;
            }

            LOGI("delta:%f", delta);
        }

    Read the article

  • Ribbon Search: Locate MS Office Ribbon Menu Features/Functions Quickly

    - by Kavitha
    In the new versions of Microsoft Office everything has changed with the introduction of Ribbon menus. Even though the Ribbon menu has many advantages that simplify accessing features, at times it's a daunting task to navigate the Ribbon menus and find a specific command. Ribbon Search is one of the interesting freeware tools that addresses these complaints from users: with it, one can easily search the Office ribbon for any feature or function. It supports both Office 2007 and Office 2010 (the versions which have the ribbon). Once installation has completed, you can find a text box on top of the ribbon in all the Office applications (Outlook, Word, PowerPoint, Excel etc.). As you type a few letters of the feature you are looking for, Ribbon Search instantly displays the path through which you can access the feature. Here is a screen grab of Ribbon Search in action. As you start typing, it shows results instantly and also gives the path through which you can access the feature you are searching for. If there are multiple ways to access the feature, they are also shown in the list. Download Ribbon Search

    Read the article

  • Is “Application Programming Interface” a bad name?

    - by Taylor Hawkes
    Application programming interface seems like a bad name for what it is. Is there a reason it was named such? I understand that people used to call them Advanced Programming Interfaces and then renamed them to Application Programming Interface. Is that why it is poorly named? Why is it not named Application (to) Programmer Interface? I guess I'm just confused about the meaning behind that name. I write more about my confusion around the name here: BREAKING DOWN THE WORD “APPLICATION PROGRAMMING INTERFACE” This is a very confusing term. We mostly understand what the word Interface means, but “Application Programming” - what even is that? Honestly, I'm confused. Is that supposed to be two words like “Application”, “Programming” and then the “Interface” is supposed to mean between the two? Like, would a “Computer Human Interface” be an interface between a “Computer” and a “Human” (monitor, keyboard, mouse) or is a “Computer Human” a real thing - perhaps the Terminator? So a CHI is our boy Kyle Reese, who is the only way we are able to work with the computer human. I think more likely “Application Programming Interface” was simply poorly named and doesn't really make sense. It was originally called an “Advanced Programming Interface”, but perhaps being a bit too ostentatious, it merged into the now widely accepted “Application Programming Interface”. So now, not wanting to change an acronym has confused the living heck out of everyone... Any thoughts or clarification would be great. I'm giving a lecture on this topic in a month, so I would prefer not to BS my way through it.

    Read the article

  • Making a Case For The Command Line

    - by Jesse Taber
    Originally posted on: http://geekswithblogs.net/GruffCode/archive/2013/06/30/making-a-case-for-the-command-line.aspx
    I have had an idea percolating in the back of my mind for over a year now that I’ve just recently started to implement. This idea relates to building out “internal tools” to ease the maintenance and on-going support of a software system. The system that I currently work on is (mostly) web-based, so traditionally we have built these internal tools in the form of pages within the app that are only accessible by our developers and support personnel. These pages allow us to perform tasks within the system that, for one reason or another, we don’t want to let our end users perform (e.g. mass create/update/delete operations on data, flipping switches that turn paid modules of the system on or off, etc). When we try to build new tools like this we often struggle with the level of effort required to build them.
    Effort Required
    Creating a whole new page in an existing web application can be a fairly large undertaking. You need to create the page and ensure it will have a layout that is consistent with the other pages in the app. You need to decide what types of input controls need to go onto the page. You need to ensure that everything uses the same style as the rest of the site. You need to figure out what the text on the page should say. Then, when you figure out that you forgot about an input that should really be present you might have to go back and re-work the entire thing. Oh, and in addition to all of that, you still have to, you know, write the code that actually performs the task. Everything other than the code that performs the task at hand is just overhead. We don’t need a fancy date picker control in a nicely styled page for the vast majority of our internal tools. We don’t even really need a page, for that matter. We just need a way to issue a command to the application and have it, in turn, execute the code that we’ve written to accomplish a given task. All we really need is a simple console application!
    Plumbing Problems
    A former co-worker of mine, John Sonmez, always advocated the Unix philosophy for building internal tools: start with something that runs at the command line, and then build a UI on top of that if you need to. John’s idea has a lot of merit, and we tried building out some internal tools as simple Console applications. Unfortunately, this was often easier said than done. Doing a “File –> New Project” to build out a tool for a mature system can be pretty daunting because that new project is totally empty. In our case, the web application code had a lot of “plumbing” built in: it managed authentication and authorization, it handled database connection management for our multi-tenanted architecture, it managed all of the context that needs to follow a user around the application such as their timezone and regional/language settings. In addition, the configuration file for the web application (a web.config in our case because this is an ASP .NET application) is large and would need to be reproduced into a similar configuration file for a Console application. While most of these problems could be solved pretty easily with some refactoring of the codebase, building Console applications for internal tools still potentially suffers from one pretty big drawback: you’d have to execute them on a machine with network access to all of the needed resources.
    Obviously, our web servers can easily communicate with the database servers and can publish messages to our service bus, but the same is not true for all of our developer and support personnel workstations. We could have everyone run these tools remotely via RDP or SSH, but that’s a bit cumbersome and certainly a lot less convenient than having the tools built into the web application that is so easily accessible.
    Mix and Match
    So we need a way to build tools that are easily accessible via the web application but also don’t require the overhead of creating a user interface. This is where my idea comes into play: why not just build a command line interface into the web application? If it’s part of the web application we get all of the plumbing that comes along with that code, and we’re executing everything on the web servers which means we’ll have access to any external resources that we might need. Rather than having to incur the overhead of creating a brand new page for each tool that we want to build, we can create one new page that simply accepts a command in text form and executes it as a request on the web server. In this way, we can focus on writing the code to accomplish the task. If the tool ends up being heavily used, then (and only then) should we consider spending the time to build a better user experience around it. To be clear, I’m not trying to downplay the importance of building great user experiences into your system; we should all strive to provide the best UX possible to our end users. I’m only advocating this sort of bare-bones interface for internal consumption by the technical staff that builds and supports the software. This command line interface should be the “back end” to a highly polished and eye-pleasing public face.
    Implementation
    As I mentioned at the beginning of this post, this is an idea that I’ve had for awhile but have only recently started building out. I’ve outlined some general guidelines and design goals for this effort as follows:
    - Text in, text out: In the interest of keeping things as simple as possible, I want this interface to be purely text-based. Users will submit commands as plain text, and the application will provide responses in plain text. Obviously this text will be “wrapped” within the context of HTTP requests and responses, but I don’t want to have to think about HTML or CSS when taking input from the user or displaying responses back to the user.
    - Task-oriented code only: After building the initial “harness” for this interface, the only code that should need to be written to create a new internal tool should be code that is expressly needed to accomplish the task that the tool is intended to support. If we want to encourage and enable ourselves to build good tooling, we need to lower the barriers to entry as much as possible.
    - Built-in documentation: One of the great things about most command line utilities is the ‘help’ switch that provides usage guidelines and details about the arguments that the utility accepts. Our web-based command line utility should allow us to build the documentation for these tools directly into the code of the tools themselves.
    I finally started trying to implement this idea when I heard about a fantastic open-source library called CLAP (Command Line Auto Parser) that lets me meet the guidelines outlined above. CLAP lets you define classes with public methods that can be easily invoked from the command line.
    Here’s a quick example of the code that would be needed to create a new tool to do something within your system:

        public class CustomerTools
        {
            [Verb]
            public void UpdateName(int customerId, string firstName, string lastName)
            {
                // invoke internal services/domain objects/whatever to perform the update
            }
        }

    This is just a regular class with a single public method (though you could have as many methods as you want). The method is decorated with the ‘Verb’ attribute that tells the CLAP library that it is a method that can be invoked from the command line. Here is how you would invoke that code: Parser.Run(args, new CustomerTools()); Note that ‘args’ is just a string[] that would normally be passed in from the static Main method of a Console application. Also, CLAP allows you to pass in multiple classes that define [Verb] methods so you can opt to organize the code that CLAP will invoke in any way that you like. You can invoke this code from a command line application like this: SomeExe UpdateName -customerId:123 -firstName:Jesse -lastName:Taber ‘SomeExe’ in this example just represents the name of the .exe that would be created from our Console application. CLAP then interprets the arguments passed in order to find the method that should be invoked and automatically parses out the parameters that need to be passed in. After a quick spike, I’ve found that invoking the ‘Parser’ class can be done from within the context of a web application just as easily as it can from within the ‘Main’ method entry point of a Console application. There are, however, a few sticking points that I’m working around:
    - Splitting arguments into the ‘args’ array like the command line: When you invoke a standard .NET console application you get the arguments that were passed in by the user split into a handy array (this is the ‘args’ parameter referenced above). Generally speaking they get split by whitespace, but it’s also clever enough to handle things like ignoring whitespace in a phrase that is surrounded by quotes. We’ll need to re-create this logic within our web application so that we can give the ‘args’ value to CLAP just like a console application would.
    - Providing a response to the user: If you were writing a console application, you might just use Console.WriteLine to provide responses to the user as to the progress and eventual outcome of the command. We can’t use Console.WriteLine within a web application, so I’ll need to find another way to provide feedback to the user. Preferably this approach would allow me to use the same handler classes from both a Console application and a web application, so some kind of strategy pattern will likely emerge from this effort.
    - Submitting files: Often an internal tool needs to support doing some kind of operation in bulk, and the easiest way to submit the data needed to support the bulk operation is in a file. Getting the file uploaded and available to the CLAP handler classes will take a little bit of effort.
    - Mimicking the console experience: This isn’t really a requirement so much as a “nice to have”. To start out, the command-line interface in the web application will probably be a single ‘textarea’ control with a button to submit the contents to a handler that will pass it along to CLAP to be parsed and run. I think it would be interesting to use some javascript and CSS trickery to change that page into something with more of a “shell” interface look and feel.
I’ll be blogging more about this effort in the future and will include some code snippets (or maybe even a full blown example app) as I progress. I also think that I’ll probably end up either submitting some pull requests to the CLAP project or possibly forking/wrapping it into a more web-friendly package and open sourcing that.
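    As a rough sketch of the first sticking point above (splitting a submitted command string into an ‘args’ array the way the console host would, honoring double-quoted phrases), something like the following C# could serve as a starting point; this is not CLAP code, just an illustration:

        using System.Collections.Generic;
        using System.Text;

        public static class CommandLineSplitter
        {
            public static string[] Split(string commandText)
            {
                var args = new List<string>();
                var current = new StringBuilder();
                bool inQuotes = false;

                foreach (char c in commandText)
                {
                    if (c == '"')
                    {
                        inQuotes = !inQuotes;   // toggle quoted mode, drop the quote itself
                    }
                    else if (char.IsWhiteSpace(c) && !inQuotes)
                    {
                        if (current.Length > 0) { args.Add(current.ToString()); current.Clear(); }
                    }
                    else
                    {
                        current.Append(c);
                    }
                }
                if (current.Length > 0) args.Add(current.ToString());
                return args.ToArray();
            }
        }

    Calling Split on UpdateName -customerId:123 -firstName:"Mary Anne" would yield three elements, ready to hand to Parser.Run.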

    Read the article

  • Silverlight Cream for May 01, 2010 -- #853

    - by Dave Campbell
    In this Issue: Damian Schenkelman, Rob Eisenberg, Sergey Barskiy, Victor Gaudioso, CorrinaB, Mike Snow, and Adam Kinney. From SilverlightCream.com: Prism’s future: Trying to summarize things Damian Schenkelman collected links to the latest Prism information to provide a reference post, including discussing WP7. MVVM Study - Interlude Rob Eisenberg discusses MVVM - its beginnings, and links out to all the major players old and new. Windows Phone 7 Database Here we go... Sergey Barskiy converted his Silverlight database project to WP7, and it's available on CodePlex... cool! New Silverlight Video Tutorial: How to Save an Image in Your Silverlight Applications Victor Gaudioso has a new video tutorial up... demonstrating saving an image from Silverlight to your hard disk. He also has the source files for download. Enforce Design Guidelines With Styles And Behaviors CorrinaB has a post up discussing attaching behaviors in styles. She has a couple good examples and a sample project to download. Silverlight Tip of the Day #9 – Obtaining Your clients IP Address Mike Snow has Tip number 9 up and he's explaining how to find the client IP address even though it's not natively available from Silverlight or jscript. Expression Blend 4 for Windows Phone in 90 seconds Adam Kinney talks about the release of a new version of the Expression Blend add-in for WP7. He's got links and instructions for removing and upgrading.

    Read the article

  • Friday Tips #33

    - by Chris Kawalek
    Happy Friday, everyone! Our tip this week is from an excellent white paper written by our own Greg King titled Oracle VM 3: Building a Demo Environment using Oracle VM VirtualBox. In it, Greg gives you everything you need to know to set up Oracle VM Server inside of Oracle VM VirtualBox for testing and demoing. The section we're highlighting below is on how to configure the network interfaces of your virtual machines: VirtualBox comes with a few different types of network interfaces that can be used to allow communication between the VM guests and the host operating system, including network interfaces that will allow the VM guests to communicate with local and wide area networks accessed from your laptop or personal computer. However, for the purpose of the demonstration environment we will limit the network communication to include access just between your desktop and the virtual machines being managed by VirtualBox. The install process for Oracle VM VirtualBox creates a single host-only network device on your laptop or personal computer. Using the host-only network device will allow you to open a browser on your desktop to access the Oracle VM Manager running within the VirtualBox VM guest. The device will only allow network traffic between the VM guests and your host operating system, but nothing outside the confines of your laptop or personal computer. We will need to add a second host-only network since the Oracle VM Server appliance has both eth0 and eth1 configured. You can choose to use eth1 on the Oracle VM Servers or not use them – the choice is yours. But, at least the host side network device will exist if you decide to use it. Greg goes on to describe in detail how to set up the network interfaces, so you can head on over to the paper and get even more info. See you next week! -Chris

    Read the article

  • Third-party open-source projects in .NET and Ruby and NIH syndrome

    - by Anton Gogolev
    The title might seem to be inflammatory, but it's here to catch your eye after all. I'm a professional .NET developer, but I try to follow other platforms as well. With Ruby being all hyped up (mostly due to Rails, I guess) I cannot help but compare the situation in open-source projects in Ruby and .NET. What I personally find interesting is that .NET developers are for the most part severely suffering from the NIH syndrome and are very hesitant to use someone else's code in pretty much any shape or form. Comparing it with Ruby, I see a striking difference. Folks out there have gems literally for every little piece of functionality imaginable. New projects are popping out left and right and generally are heartily welcomed. On the .NET side we have CodePlex, which I personally find to be a place where abandoned projects grow old and eventually get abandoned. Now, there certainly are several well-known and maintained projects, but the number of those pales in comparison with that of Ruby. Granted, NIH on the .NET devs' part comes mostly from the fact that there are very few quality .NET projects out there, let alone projects that solve their specific needs, but even if there is such a project, it's often frowned upon and is reinvented in-house. So my question is multi-fold: Do you find my observations anywhere near being correct? If so, what are your thoughts on the quality and quantity of OSS projects in .NET? Again, if you do agree with my thoughts on "NIH in .NET", what do you think is causing it? And finally, is it Ruby's feature set & community standpoint (dynamic language, strong focus on testing) that allows for such easy integration of third-party code?

    Read the article

< Previous Page | 679 680 681 682 683 684 685 686 687 688 689 690  | Next Page >