Search Results

Search found 14265 results on 571 pages for 'technical favorite features'.

Page 403/571

  • How to Add Any Application to the Windows Desktop Right-Click Menu

    - by The Geek
    If you want really quick access to launch a frequently used application without putting extra icons on your desktop, you can add that application to the context menu for the desktop with a simple registry hack. Here’s how to do it. Naturally, we’ve also covered the opposite scenario—how to clean up your messy Windows context menu, which is an equally useful read if you’ve got a bunch of items you want to remove from the menu. Note: this article was originally published a few years ago, but we’ve updated and polished it for Windows 7 and are republishing it for you today.

    Read the article

  • Data binding in web UI frameworks, what's the deal?

    - by c-smile
    I believe that most modern Web frameworks that claim to be MVC also have a notion of data binding in one form or another. Examples: AngularJS, EmberJS, KnockoutJS, etc. I am assuming that "data binding" is a declarative definition (oxymoron, no?) of a live link between data (a.k.a. model) and its representation (a.k.a. view), with some transformers in between (a.k.a. controllers). I understand why declarativeness is appealing, but I also understand that, as usual, it comes at a price. In particular: 1. Live binding is quite heavy, either with dirty watching (high CPU consumption) or with Object.observe() (high memory consumption, with high CPU load in some scenarios). 2. There is a "frame" part in the word "framework", which means there are boundaries/limits that can be hard to overcome if you need slightly more than it was designed for. The usual time split applies: 90% of features are built in 10% of project time, but the remaining 10% takes 90% of project time. I suspect (a.k.a. educated guess) that those MVC things are not helping to implement more functionality in less time... If so, their usage motivation is not quite clear. As an example: last week I wanted to find a virtual list idea/solution. I found one in vanilla JavaScript that is 120 LOC. The implementation of the same thing in AngularJS is about 420 LOC, and most of the code there looks like a fight with the framework itself... So my question is: what benefits does that MVC stuff, or data binding, give us? Is it just a buzzword popular among project managers, or does it give us something useful? If the latter, what exactly?
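
    To make the "live link" concrete, here is a minimal, framework-free sketch in Python, purely for illustration: none of this is AngularJS/EmberJS/KnockoutJS API, just the shape of observer-style binding, where a view callback is re-run whenever a model field changes (a dirty-checking framework would instead re-compare snapshots of the model on every digest cycle).

      # Minimal observer-style data binding sketch (illustrative only; not any framework's API).
      # A "view" subscribes to a model field and is re-rendered whenever the field changes.
      class Observable:
          def __init__(self, **fields):
              self._data = dict(fields)
              self._subscribers = {}                     # field name -> list of callbacks

          def bind(self, field, callback):
              self._subscribers.setdefault(field, []).append(callback)
              callback(self._data[field])                # push the initial value to the view

          def set(self, field, value):
              if self._data.get(field) != value:         # only notify on real changes
                  self._data[field] = value
                  for render in self._subscribers.get(field, []):
                      render(value)

      model = Observable(title="Hello")
      model.bind("title", lambda v: print("render view:", v))   # declarative-ish binding
      model.set("title", "Hello, world")                        # view re-renders automatically

    A dirty-checking implementation would drop the set() hook entirely and instead diff the whole model against a saved copy after every event, which is the CPU cost the question refers to.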

    Read the article

  • Would Using a PHP Framework Be Beneficial in My Context?

    - by Fractal
    I've just started work at a small start-up company who mainly uses PHP to develop their front-end apps. I had no prior PHP experience before joining, and this has led to my apps becoming large pieces of spaghetti code. I essentially started by adding code to implement an initial feature, and then continued to hack in more code to implement further features – without much thought for the overall design. The apps themselves output XML to render on small mobile devices. I recently started looking into frameworks that I could use. I reckon an advantage would be that they seem to force developers to modularise their programs using good-practice design patterns. This seems great for someone in my position. The extra functions they provide, for example: interfacing with databases in such a way as to make SQL injection impossible, would be very useful too. The downside I can see is that there will be a lot of overhead for me in terms of the time taken to learn the framework itself (while still getting to grips with PHP itself). I'm also worried that it will be overkill for the scale of the apps we develop. They tend to be programs that interface with a fairly simple back-end DB, and will generate about 5 different XML screens. Probably around 1 or 2 thousand lines of code. The time it takes just to configure the frameworks may not be worth it. The final problem I can see is that developers in the company – who have to go over my code, and who do not know the PHP framework I may use – will have a much harder time understanding it. Given those pros and cons, I'm still not sure on what the best course of action will be; so any advice will be greatly appreciated.

    Read the article

  • Exadata X3 In-Memory Database Machine: To be or not to be

    - by Luis Moreno Campos
    Since Larry Ellison announced Oracle Exadata X3 as the new generation of the Database Machine, he established the product in the In-Memory Database arena. And that annoyed some people. We all know that In-Memory Databases are the ones that *only* execute in memory and use the other layers of storage for persistence (mainly disk). Oracle Database has always been a technology that uses memory as a caching mechanism, and that hasn't changed, nor will it change, with Oracle Database 12c. So this is the central point of fuss when it comes to announcing an Engineered System as an In-Memory Database, when in fact it still runs Oracle Database: not vanilla, but still the same product. Let me tell you, purist people out there: when you find no new groundbreaking point to get all excited about, you decide to bash it and go against its claims. It's not like a car manufacturer that launches a minivan in the market and calls it a sports car; we are talking about a fundamental change in the ILM stack: level 2 of caching is now self-sufficient. It's not DRAM? Who cares? It still lets you put amounts of data in flash that weren't possible until now, so I guess Oracle can name it whatever Larry wants, because in the end it's something never done before. Now let's imagine that you hop on the pure In-Memory Database bandwagon. You would be stuck with a database technology that lags hundreds of light years behind the Oracle Database in man-hours of innovation and features. Do you really want to travel back in time? Remember, the first rule about time travelling is that "Security is not Guaranteed". Your choice. LMC

    Read the article

  • Solving Big Problems with Oracle R Enterprise, Part II

    - by dbayard
    Part II – Solving Big Problems with Oracle R Enterprise In the first post in this series (see https://blogs.oracle.com/R/entry/solving_big_problems_with_oracle), we showed how you can use R to perform historical rate of return calculations against investment data sourced from a spreadsheet. We demonstrated the calculations against sample data for a small set of accounts. While this worked fine, in the real world the problem is much bigger because the amount of data is much bigger. So much bigger that our approach in the previous post won’t scale to meet the real-world needs. From our previous post, here are the challenges we need to conquer: the actual data that needs to be used lives in a database, not in a spreadsheet; the actual data is much, much bigger - too big to fit into the normal R memory space and too big to want to move across the network; the overall process needs to run fast - much faster than a single processor; the actual data needs to be kept secured - another reason not to want to move it from the database and across the network; and the process of calculating the IRR needs to be integrated together with other database ETL activities, so that IRRs can be calculated as part of the data warehouse refresh processes. In this post, we will show how we moved from the sample data environment to working with full-scale data. This post is based on actual work we did for a financial services customer during a recent proof-of-concept. Getting started with the Database At this point, we have some sample data and our IRR function. We were at a similar point in our customer proof-of-concept exercise - we had sample data but we did not have the full customer data yet. So our database was empty. But this was easily rectified by leveraging the transparency features of Oracle R Enterprise (see https://blogs.oracle.com/R/entry/analyzing_big_data_using_the). The following code shows how we took our sample data SimpleMWRRData and easily turned it into a new Oracle database table called IRR_DATA via ore.create(). The code also shows how we can access the database table IRR_DATA as if it was a normal R data.frame named IRR_DATA. If we go to sql*plus, we can also check out our new IRR_DATA table: At this point, we now have our sample data loaded in the database as a normal Oracle table called IRR_DATA. So we then proceeded to test our R function working with database data. As our first test, we retrieved the data for a single account from the IRR_DATA table, pulled it into local R memory, and then called our IRR function. This worked. No SQL coding required! Going from Crawling to Walking Now that we have shown our R code working with database-resident data for a single account, we wanted to experiment with doing this for multiple accounts. In other words, we wanted to implement the split-apply-combine technique we discussed in our first post in this series. Fortunately, Oracle R Enterprise provides a very scalable way to do this with a function called ore.groupApply(). You can read more about ore.groupApply() here: https://blogs.oracle.com/R/entry/analyzing_big_data_using_the1 Here is an example of how we ask ORE to take our IRR_DATA table in the database, split it by the ACCOUNT column, apply a function that calls our SimpleMWRR() calculation, and then combine the results. (If you are following along at home, be sure to have installed our myIRR package on your database server via “R CMD INSTALL myIRR”.)
    The interesting thing about ore.groupApply is that the calculation is not actually performed in the desktop R environment from which I am running. What actually happens is that ore.groupApply uses the Oracle database to perform the work. The Oracle database is what actually splits the IRR_DATA table by ACCOUNT. Then the Oracle database takes the data for each account and sends it to an embedded R engine running on the database server to apply our R function. Then the Oracle database combines all the individual results from the calls to the R function. This is significant because now the embedded R engine only needs to deal with the data for a single account at a time. Regardless of whether we have 20 accounts or 1 million accounts or more, the R engine that performs the calculation does not care. Given that normal R has a finite amount of memory to hold data, the ore.groupApply approach overcomes the R memory scalability problem since we only need to fit the data from a single account in R memory (not all of the data for all of the accounts). Additionally, the IRR_DATA does not need to be sent from the database to my desktop R program. Even though I am invoking ore.groupApply from my desktop R program, because the actual SimpleMWRR calculation is run by the embedded R engine on the database server, the IRR_DATA does not need to leave the database server. This is both a performance benefit, because network transmission of large amounts of data takes time, and a security benefit, because it is harder to protect private data once you start shipping it around your intranet. Another benefit, which we will discuss in a few paragraphs, is the ability to leverage Oracle database parallelism to run these calculations for dozens of accounts at once. From Walking to Running ore.groupApply is rather nice, but it still has the drawback that I run this from a desktop R instance. This is not ideal for integrating into typical operational processes like nightly data warehouse refreshes or monthly statement generation. But this is not an issue for ORE. Oracle R Enterprise lets us run this from the database using regular SQL, which is easily integrated into standard operations. That is extremely exciting and the way we actually did these calculations in the customer proof. Oracle R Enterprise provides a SQL equivalent to ore.groupApply, which it refers to as “rqGroupEval”. To use rqGroupEval via SQL, there is a bit of simple setup needed. Basically, the Oracle Database needs to know the structure of the input table and the grouping column, which we are able to define using the database’s pipeline table function mechanisms. Here is the setup script: At this point, our initial setup of rqGroupEval is done for the IRR_DATA table. The next step is to define our R function to the database. We do that via a call to ORE’s rqScriptCreate. Now we can test it. The SQL you use to run rqGroupEval uses the Oracle database pipeline table function syntax. The first argument to irr_dataGroupEval is a cursor defining our input. You can add additional where clauses and subqueries to this cursor as appropriate. The second argument is any additional inputs to the R function. The third argument is the text of a dummy select statement. The dummy select statement is used by the database to identify the columns and datatypes to expect the R function to return. The fourth argument is the column of the input table to split/group by.
    The final argument is the name of the R function as you defined it when you called rqScriptCreate(). The Real-World Results In our real customer proof-of-concept, we had more sophisticated calculation requirements than shown in this simplified blog example. For instance, we had to perform the rate of return calculations for 5 separate time periods, so the R code was enhanced to do so. In addition, some accounts needed a time-weighted rate of return to be calculated, so we extended our approach and added an R function to do that. And finally, there were also a few more real-world data irregularities that we needed to account for, so we added logic to our R functions to deal with those exceptions. For the full-scale customer test, we loaded the customer data onto a Half-Rack Exadata X2-2 Database Machine. As our half-rack had 48 physical cores (and 96 threads if you consider hyperthreading), we wanted to take advantage of that CPU horsepower to speed up our calculations. To do so with ORE, it is as simple as leveraging the Oracle Database Parallel Query features. Let’s look at the SQL used in the customer proof: Notice that we use a parallel hint on the cursor that is the input to our rqGroupEval function. That is all we need to do to enable Oracle to use parallel R engines. Here are a few screenshots of what this SQL looked like in the Real-Time SQL Monitor when we ran this during the proof of concept (hint: you might need to right-click on these images to view them full-screen and see the entire image). From the above, you can notice a few things (numbers 1 through 5 below correspond with the highlighted numbers on the images above): the SQL completed in 110 seconds (1.8 minutes); we calculated rates of return for 5 time periods for each of 911k accounts (the number of actual rows returned by the IRRSTAGEGROUPEVAL operation); we accessed 103m rows of detailed cash flow/market value data (the number of actual rows returned by the IRR_STAGE2 operation); we ran with 72 degrees of parallelism spread across 4 database servers; most of our 110 seconds was spent in the “External Procedure call” event; on average, we performed 8,200 executions of our R function per second (911k accounts / 110s); on average, each execution was passed 110 rows of data (103m detail rows / 911k accounts); on average, we did 41,000 single time period rate of return calculations per second (each of the 8,200 executions of our R function did rate of return calculations for 5 time periods); and on average, we processed over 900,000 rows of database data in R per second (103m detail rows / 110s). R + Oracle R Enterprise: Best of R + Best of Oracle Database This blog post series started by describing a real customer problem: how to perform a lot of calculations on a lot of data in a short period of time. While standard R proved to be a very good fit for writing the necessary calculations, the challenge of working with a lot of data in a short period of time remained. This blog post series showed how Oracle R Enterprise enables R to be used in conjunction with the Oracle Database to overcome the data volume and performance issues (as well as simplifying the operations and security issues). It also showed that we could calculate 5 time periods of rates of return for almost a million individual accounts in less than 2 minutes.
    In a future post, we will take the same R function and show how Oracle R Connector for Hadoop can be used in the Hadoop world. In that next post, instead of having our data in an Oracle database, our data will live in Hadoop, and we will show how to use the Oracle R Connector for Hadoop and other Oracle Big Data Connectors to move data between Hadoop, R, and the Oracle Database easily.
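
    For readers who have not seen ore.groupApply() before, the split-apply-combine pattern it parallelizes can be sketched in a few lines of plain Python. This is only the shape of the computation, not the ORE API, and irr() here is a hypothetical stand-in for the post's SimpleMWRR() function; in ORE the splitting and the per-group calls happen inside the database, in parallel.

      # Conceptual split-apply-combine sketch (illustrative only; not Oracle R Enterprise code).
      # rows: cash-flow records; irr(): hypothetical stand-in for the rate-of-return function.
      from collections import defaultdict

      def split_apply_combine(rows, key, apply_fn):
          groups = defaultdict(list)
          for row in rows:                                # SPLIT: partition the rows by account
              groups[row[key]].append(row)
          results = {}
          for account, account_rows in groups.items():
              results[account] = apply_fn(account_rows)   # APPLY: one small calculation per group
          return results                                  # COMBINE: gather the per-account results

      def irr(account_rows):                              # placeholder calculation, not MWRR math
          return sum(r["cash_flow"] for r in account_rows)

      rows = [
          {"account": "A1", "cash_flow": 100.0},
          {"account": "A1", "cash_flow": -40.0},
          {"account": "A2", "cash_flow": 250.0},
      ]
      print(split_apply_combine(rows, "account", irr))    # {'A1': 60.0, 'A2': 250.0}

    The point of ore.groupApply()/rqGroupEval, as described above, is that the "split" loop runs in the database and each "apply" call lands in its own embedded R engine, so only one account's rows ever need to fit in R memory.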

    Read the article

  • Configuring Transmission for faster download

    - by Luis Alvarado
    I have tested the following torrent clients on the same PC with the same torrent/magnet links: Transmission, KTorrent, Deluge, qBittorrent, and Vuze. After 7 days of testing I noticed that the only one that took longer to start downloading and to keep an optimum/max download speed was Transmission. It was the slowest of them all to download the same torrents and magnet links (I tested 8 torrents and 4 magnet links from different sites), and the one that took the longest to start downloading, or to start again after a pause/resume event. The other 4 took less than 2 seconds, for example, to start downloading, and downloaded the same content in between 50% and 80% less time. I think that Transmission has the same downloading/resuming capabilities as the other torrent clients, so it may be that some configuration is needed to get the same speed and effect as the others. In my tests all torrent clients were tested with their default configurations. No changes were made. They were tested on the same PC, with the same network connection, in the same time periods. So I am thinking that Transmission just needs a little bit of configuration tuning. I also set the ports to the same one for each, and checked the router for any blocking and anything related to the network. What options can I change to make it so Transmission resumes a download faster (grabs the seeds faster) and keeps a fast download all the time (stays with the seeds that offer the best connection, for example)? Both of these, by the look of it, are features that the rest of the torrent clients already have.

    Read the article

  • Is there a visual web application builder or rapid webapp prototyping framework?

    - by Jesper Mortensen
    Question: Is there such a thing as a self-hosted framework or CMS especially tailored towards the creation of interactive web applications without -- or with an absolute minimum of -- programming? (Substantially less programming than, say, a simple Rails app or a plugin for Wordpress, Joomla, etc. would require.) As for desired features I'd settle for whatever is available, but some ideas could be: A user authentication and permissions system. A GUI-driven input form builder. A GUI-driven template / visual site design builder. A simple scripting language (think AppleScript-like simplicity). A highly modular architecture, with high-level business objects (users, forms data, etc.) exposed for easy re-use. If something like the above doesn't exist, then what comes near this? Need: This is for self-hosted rapid prototyping of web applications, and limited user testing of webapp user interface designs in a closed user test. Notes: I know about Ruby on Rails (Rails), Django, Pyramid, etc. I'm looking for something much faster to work in, for making prototypes. I know about CMSs in general but find that most of them are tailored towards displaying information to the end users. If there is an exceptionally easy-to-master CMS with easy scripting (let's say much more so than, for example, Wordpress) then I'd be interested.

    Read the article

  • Oracle Brings Analytics to Project Management

    - by Sylvie MacKenzie, PMP
    Excerpt from PROFIT - ORACLE - by Alison Weiss  Nonprofit and for-profit organizations have many differences, but there is one way they are alike—managers struggle with huge amounts of data generated every day. Project data by itself has limited use—but any organization that can gain insight to make accurate predictions or to use resources more effectively can gain an operational advantage. Oracle’s Primavera P6 Analytics 2.0 business intelligence solution enables organizations using Oracle’s Primavera P6 Professional Project Management to do just that: identify critical issues and uncover trends in stores of project data. Primavera P6 Analytics provides management with the ability to look at not only how a single effort is progressing, but also how the entire organization is doing from a project perspective. The latest release includes new features that make it even easier to gather and analyze critical information. For example, the addition of geocoding gives Primavera P6 Analytics users the ability to track resources geographically on longitude and latitude and use a map to get an overall view of how projects, programs, and activities are deployed. “A nonprofit with relief projects in Vietnam, for example, can drill down to the project and get a world view and a regional view,” says Yasser Mahmud, vice president of product strategy and industry marketing in Oracle’s Primavera Global Business Unit. “Then they can drill down further to show statistics; key performance indicators; and how that program, portfolio, or project work is actually getting done.” The addition of new mobile capabilities to Primavera P6 Analytics puts deep-dive analysis into project managers’ hands with compatibility with major tablet operating systems. Now, nonprofits or for-profits working in remote locations can provide real-time visibility into projects to alert management if issues are occurring that need to be addressed immediately. “Primavera P6 Analytics generates information that can help organizations improve their utilization and trim down overall operating costs,” says Mahmud. “But more importantly, it gives organizations improved visibility.”

    Read the article

  • What web technology could I use which would support a decision tree?

    - by Rami Alhamad
    I am a big game development fan, but I haven't done any commercial work in the past. I have been asked by a non-profit to look at developing a game similar to the award-winning www.playspent.org. They want the following features: support for 5 scenarios; mobile isn't important, but compatibility with older browsers would be a big bonus; they want it to be visual and audible; a bonus would be to have it easily modifiable; support for 4 languages. I don't have much knowledge of Flash and would rather avoid using it as a solution. I started breaking down the problem into segments that I will need to examine; they are as follows: the ability to read the game flow from a file that they can produce (XML, etc.); DB design to store the decision tree; the language challenge; browser compatibility. I am leaning towards a Google App Engine/GWT solution but I am not sure what technology is best for this. I am really hoping to get your opinion/recommendation on my approach and on what technology is best. A special thanks (and beer if you live in Toronto) will be awarded to anyone who can help give me a ballpark estimate on how much such a game should go for. I know it's tough to estimate but any rough figure will help (how much would you charge for building something like playspent.org?) Thanks in advance
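
    On the first segment (reading the game flow from a file the client can produce), here is a hedged sketch of what that could look like in Python. The XML schema, element names, and attributes below are invented for illustration and are not taken from playspent.org; the idea is simply that each decision node and its choices live in data, not in code.

      # Hypothetical scenario file format and loader (Python standard library only).
      import xml.etree.ElementTree as ET

      SCENARIO_XML = """
      <scenario id="month-on-minimum-wage" lang="en">
        <node id="start" text="Rent is due. You have $300.">
          <choice text="Pay rent"  next="week2"            money="-250"/>
          <choice text="Skip rent" next="eviction-warning" money="0"/>
        </node>
        <node id="week2" text="Your car breaks down.">
          <choice text="Repair it"    next="end" money="-200"/>
          <choice text="Take the bus" next="end" money="-20"/>
        </node>
      </scenario>
      """

      def load_scenario(xml_text):
          root = ET.fromstring(xml_text)
          nodes = {}
          for node in root.findall("node"):          # each node is one decision point
              nodes[node.get("id")] = {
                  "text": node.get("text"),
                  "choices": [
                      {"text": c.get("text"), "next": c.get("next"),
                       "money": int(c.get("money"))}
                      for c in node.findall("choice")
                  ],
              }
          return nodes

      tree = load_scenario(SCENARIO_XML)
      print(tree["start"]["choices"][0])   # {'text': 'Pay rent', 'next': 'week2', 'money': -250}

    The same node/choice structure maps directly onto a two-table database design (nodes and choices keyed by scenario), which covers the "db design to store decision tree" segment as well.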

    Read the article

  • Self hosted PHP shopping cart with no storefront?

    - by Question
    I am looking for a shopping cart to implement on a simple website instead of the default paypal cart that is used with their add-to-cart buttons (I don't like the non-styleable new tab/window cart). However, I really like the ability to simply add the buttons to existing pages. I do not have a lot of products and do not want to deal with a storefront and complex templates. The main features I need: self-hosted; easy to implement with an existing website (copy and paste button code, etc.); the ability to have variations on one button with different prices (dropdown with sizes, colors, etc.); the ability to track inventory and disallow out-of-stock orders; the ability to pass cart details to PayPal Website Payments Standard. I have seen most of the large storefront options: oscommerce, zencart, cubecart, opencart, prestashop, magento, cs-cart, lemonstand, etc., but these are way more than I need. I don't need the storefront or customer accounts or templated pages, etc. I have seen e-junkie, which is not far off from what I would like, but it is not self-hosted, and I would prefer an in-site cart (or dynamic overlay cart) rather than a lightbox or new tab/window cart. I also love the paypal minicart and its implementation, but there is no way to track inventory. So, does anyone have any recommendations that might meet these requests?

    Read the article

  • Upgrading/Installing Demantra 7.3.1.1? Check this out!

    - by user702295
    Here is a summary of release 7.3.1.1 install/upgrade/features topics: Data Preservation Setting for General Levels; Deploying Demantra Application Server 10g; Important Upgrade Information; Known Upgrade Issues; Mozilla Firefox Browser; Installer Issues; Reviewing / Simulating General Level Data Such as CTO Base Model Demand; Failure Rate Calculation; Demantra SSL Client Authentication and Java 6; CTO functionality does not work in release 7.3.1.1 after upgrading from 7.3.0 using the ‘Platform Upgrade Only’ option; User Privileges and Export Worksheet to Excel; Cookie Attribute Causes Logging Issue in Worksheet; List of bugs fixed in 7.3.1.1. See the following for details: Demantra 7.3.1.1 Install / Upgrade Known Issues, Notes, Guidance, Defects, Workarounds (Doc ID 1370518.1); Related Documents For Demantra Version 7.3.1.1 And If Demantra Supports The Required Stacks (Doc ID 1367141.1)

    Read the article

  • Partner Webcasts: EMEA Alliances & Channels Hardware Webinars, July 2012

    - by swalker
    Dear partner, Oracle is pleased to invite you to the following webcasts, dedicated to our EMEA partner community and designed to provide you with important news on our SPARC and Storage product portfolios. Please ensure you don't miss these unique learning opportunities! 1. How to Make Money Selling SPARC! TOMORROW: 3PM CET (2pm UKT), Tuesday, July 10, 2012. The webcast will be hosted by Rob Ludeman, from SPARC Product Management. Agenda: to bring our partners timely, valuable information focused on increasing their success in selling SPARC systems. The webcast will be focused and targeted on specific topics and will last approximately 30 minutes. You can submit your questions via WebEx chat and there will be a live Q&A session at the end of the webcast. REGISTER 2. Introduction to Oracle’s New StorageTek SL150 Modular Tape Library, 3pm CET (2pm UK), Thursday, July 12, 2012. This webcast will help you to understand Oracle's new StorageTek SL150 Modular Tape Library, which is the first scalable tape library designed for small and midsized companies that are experiencing high growth. Built from Oracle software and StorageTek library technology, it delivers a cost-effective combination of ease of use and scalability, resulting in overall TCO savings. During the webcast Cindy McCurley, from Tape Product Management, will introduce you to the latest addition to the Oracle Tape Storage product portfolio, the SL150 Modular Tape Library. This 60-minute webcast will cover the product’s features, positioning, unique selling points and a competitive overview on StorageTek. You can submit your questions via WebEx chat and there will be a live Q&A session at the end of the webcast. REGISTER Delivery Format: this FREE online LIVE eSeminar will be delivered over the Web and Conference Call. Note: Please join the call 10 minutes before the scheduled start time. We look forward to your participation. Best regards, Giuseppe Facchetti, EMEA Partner Business Development Manager, Oracle Hardware Sales; Sasan Moaveni, EMEA Storage Sales Manager, Oracle Hardware Sales

    Read the article

  • Should GeeksWithBlogs move to the Wordpress Platform?

    - by Martin Hinshelwood
    Geekswithblogs was my first ever blog and my first post was on 22nd June 2006. Since then very little functionality has been added. This is not a complaint, but rather an observation that it is very hard to keep up with all of the blogging capabilities that people want. My point would be: “Why bother!” Vote now for a migration of the awesome Geekswithblogs content from SubText to Wordpress. Having been a long time user of GWB I have been worried of late by my envy of other blogging platforms. I made a number of requests around 10 months ago for things that almost all blogging platforms provide, but which are not available on GWB. Support other comment frameworks – The current comment system is so antiquated that it does not even have the common filters like Facebook, Twitter or Google login to help prevent spam. Tags are not listed in the RSS – This can prevent you from getting the Google juice that you deserve. 301 redirects to a single URL – If you use a custom URL then all your posts are split between both the GWB URL and the custom one. This is VERY bad for SEO. I realise that it is difficult to find time to add all of the features that all the uber geeks on this site want, so why bother… let's move to the most popular and most modded platform available and allow everyone to add whatever widgets they like. Figure: Vote for Geekswithblogs moving to Wordpress. Why would I want this? There are over 13k plugins available; an easy augmentation model; full mobile support; regular releases; lots more… This could be a turning point in the legendary history of Geeks With Blogs, be part of it… What can I do to make this happen? I need your help to make this happen: vote for it; discuss it.

    Read the article

  • How-To Geek Gets the Microsoft MVP Award, Thanks to You

    - by The Geek
    The How-To Geek has won a Microsoft MVP award for the second year in a row, and it’s all thanks to you, our great readers that keep the site going. Join us for some mutual back-patting and some terrible photography of all the award stuff. Of course, if you’re familiar with the MVP award you’ll probably know that it’s actually for a single person, but in my opinion the award belongs to the entire How-To Geek community, without which this site would be nothing.

    Read the article

  • How to practice object oriented programming?

    - by user1620696
    I've always programmed in procedural languages and currently I'm moving towards object orientation. The main problem I've faced is that I can't see a way to practice object orientation in an effective way. I'll explain my point. When I learned PHP and C it was pretty easy to practice: it was just a matter of choosing something and thinking about an algorithm for that thing. In PHP, for example, it was a matter of sitting down and thinking: "well, just to practice, let me build one application with an administration area where people can add products". This was pretty easy; it was a matter of thinking of an algorithm to register a user, to log the user in, and to add the products. Combining these with PHP features, it was a good way to practice. Now, in object orientation we have lots of additional things. It's not just a matter of thinking about an algorithm, but of analysing requirements more deeply, writing use cases, figuring out class diagrams, properties and methods, setting up dependency injection and lots of things. The main point is that in the way I've been learning object orientation it seems that a good design is crucial, while in procedural languages one vague idea was enough. I'm not saying that in procedural languages we can write good software without design, just that for the sake of practicing it is feasible, while in object orientation it seems not feasible to go without a good design, even for practicing. This seems to be a problem, because if each time I'm going to practice I need to figure out tons of requirements, use cases and so on, it seems to become not a good way to get better at object orientation, because this requires me to have one whole idea for an app every time I'm going to practice. Because of that, what's a good way to practice object orientation?

    Read the article

  • Oracle Enterprise Manager Extensibility News - June 2014

    - by Joe Diemer
    Introducing Extensibility Exchange Version 2 On the heels of Enterprise Manager 12c Release 4 this week comes version 2.0 of the Extensibility Exchange.  A new theme allows optimal viewing on a number of different computing devices, from large monitor displays to tablets to smartphones.  One of the first things you'll notice is a scrollable banner with the latest news related to Enterprise Manager and extensibility.  Along with the "slider" and the latest entries from Oracle and the Partner community, new features like a tag cloud and an auto-complete search box provide a better way to find the plug-in, connector or other Enterprise Manager entity you are looking for.  Once you find it, a content details page with specific info related to that particular entity will enable you to access it at the provider's site and also rate and comment on that particular item. You can also send an email from the content details page, which is routed to the developer.  And if you want to use version 1 of the Extensibility Exchange instead, you will be able to do so via the "Classic" option.  Check it out today at http://www.oracle.com/goto/emextensibility. Recent Additions from Oracle's Partner Community A number of important 3rd party plug-ins have been contributed by Oracle's partner community, which can be accessed via the Extensibility Exchange or by clicking the links in this blog: Dell Open Manage; Fusion I-O ION Accelerator; NetApp SANtricity E-Series; PostgreSQL by Blue Medora. You can also check out the following best practices and labs available via the Exchange: Riverbed Stingray Traffic Manager Reference Architecture Datavail Alert Optimizer Custom Templates Apps Associates' Oracle Enterprise Manager "Test Drives" for Oracle Database 12c Management Oracle Enterprise Manager Monitoring Essentials Oracle Application Management Suite for Oracle E-Business Suite

    Read the article

  • How to See Your Current Wi-Fi Connection Speed in Mac OS X

    - by The Geek
    Ever since I’ve been using my new MacBook Air, I’ve been befuddled by how to do some of the simplest tasks in Mac OS X that I would normally do from my Windows laptop—like show the connection speed for the current Wi-Fi network. So am I using Wireless-N or not? Normally, on my Windows 7 laptop, all I’d have to do is hover over the icon, or pop up the list—you can even go into the network details and see just about every piece of data about the network, all from the system tray. Here’s how to see your current connection information on your Mac.

    Read the article

  • Watch a Machine Get Upgraded from MS-DOS to Windows 7 [Video]

    - by ETC
    What happens if you try to upgrade a machine from MS-DOS to Windows 7? One curious geek ran the experiment using VMWare and recorded the whole, surprisingly fluid, ride for our enjoyment. Andrew Tait was curious, what would happen if you followed the entire upgrade arc for Windows from the 1980s to the present all on one machine? Thanks to VMWare he was able to find out, following the upgrade path all the way from MS-DOS to Windows 7. Check out the video below to see what happens: Chain of Fools: Upgrading Through Every Version of Windows [YouTube via WinRumors]

    Read the article

  • Leveraging Code in Ever Bigger Games

    - by ashes999
    Summary: in the same way that I continually build complex engines and libraries within a single platform and technology to allow me to build increasingly bigger and better games, how do I continue this when development crosses into different platforms? If I switch platforms, how do I leverage past code and experience? Games are hard to build. Big games are even harder to build. I've decided that to be able to make big games, I need to start building smaller games, and building up an asset base of code, assets (graphics, sounds), tools, and most importantly, game engines, so that I can eventually get there. One game at a time. Let me give an analogy. To build an MMO 3D RPG, I would approach this by building and releasing small games with increasingly more features. This could entail, for example: a simple 2D game; a tile-based game; a game with RPG elements (items, equipment, monsters, battle); a full-fledged RPG; a 3D RPG. The problem now is that if I have to change platforms or tools, I don't know how to leverage past code bases (and experience) to start with a mature product. Right now, I'm writing Silverlight (FlatRedBall) games. Let's say I stick with this for ten years, and then suddenly decide to write a PS6 game, which is in a different programming language entirely. Granted, I have ten years of game-development experience (and correspondingly ten years of professional software development experience from my day job) to back me up. But I would still like some way to transplant that 2D RPG engine into the new programming language, or else leverage it somehow. Is this even possible? What are my options?

    Read the article

  • How to Change the Default Application for a File Type in Mac OS X

    - by The Geek
    If you’re a recent Mac OS X convert, you might be wondering how to force a particular file type to open in a different application than the default. No? Well, we’re going to explain it anyway. This is most useful when you’ve installed something like VLC and want to open your video files in that instead of the default, which is QuickTime Player.

    Read the article

  • Book Review: Professional WCF 4

    - by Sam Abraham
    My investigation of WCF internals has set the right stage to revisit Professional WCF 4 by Pablo Cibraro, Kurt Claeys, Fabio Cozzolino and Johann Grabner. In this book, the authors dive deep into all aspects of the WCF API in a reading targeted towards intermediate and advanced developers. Book quality, so far as presentation, code completeness, content clarity and organization go, was superb. The authors have taken a hands-on approach to thoroughly covering the WCF 4.0 API, with three chapters totaling 100+ pages completely dedicated to business cases and downloadable source code readily available. Chapter 1 outlines SOA best-practice considerations. The next three chapters take a top-down approach to the WCF API, covering service and data contracts, bindings, clients, instancing and Workflow Services, followed by another carefully thought-out three chapters covering the security options available via the WCF API. In conclusion, Professional WCF 4.0 provides thorough coverage of the WCF API and is a recommended read for anybody looking to reinforce their understanding of the various features available in the WCF framework. Many thanks to the Wiley/Wrox User Group Program for their support of our West Palm Beach Developers’ Group. All the best, --Sam

    Read the article

  • Exadata X3 Sales/Pre-Sales/Support Resell Rights Enablement Day

    - by mseika
    Dear Partner, Avnet and Oracle would like to invite you to a FREE Exadata X3 Sales/Pre-Sales/Support Resell Rights Enablement Day which is taking place on Wednesday 16th January 2013 at the Oracle London City offices in Moorgate. We will give you the opportunity to get an in-depth understanding of how hardware and software is engineered to work together to create the power and scalability of Exadata. The session will focus on Oracle Exadata fundamentals, features, components and capabilities. The event will last a full day and will give you the opportunity to put any questions to the presenters that will help you understand how to spot an Exadata opportunity and position Exadata in an opportunity. Register Now: register now or call our Hotline on 01925 856999. When: Wednesday 16th January 2013. Duration: 9.30am to 5.00pm. Where: Oracle Corporation UK Ltd., One South Place, London EC2M 2RB. For directions please see the location map. Cost: no charge. Contact Us: Avnet Technology Solutions Limited, Clarity House, 103 Dalton Avenue, Birchwood Park, Warrington, WA3 6YB, UK. T: 01925 856900, F: 01925 856901, E: [email protected]. Or find us online: Avnet website, LinkedIn, Twitter, Facebook

    Read the article

  • Logarithmic spacing of FFT bins

    - by Mykel Stone
    I'm trying to do the examples within the GameDev.net Beat Detection article ( http://archive.gamedev.net/archive/reference/programming/features/beatdetection/index.html ). I have no issue with performing an FFT and getting the frequency data, and doing most of the article. I'm running into trouble, though, in section 2.B, Enhancements and beat decision factors. In this section the author gives 3 equations, numbered R10-R12, to be used to determine how many bins go into each subband: R10 - linear increase of the width of the subband with its index; R11 - we can choose, for example, the width of the first subband; R12 - the sum of all the widths must not exceed 1024. He says the following in the article: "Once you have equations (R11) and (R12) it is fairly easy to extract 'a' and 'b', and thus to find the law of the 'wi'. This calculus of 'a' and 'b' must be made manually and 'a' and 'b' defined as constants in the source; indeed they do not vary during the song." However, I cannot seem to understand how these values are calculated... I'm probably missing something simple, but learning Fourier analysis in a couple of weeks has left me Decimated-in-Mind and I cannot seem to see it.
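
    One way to read the article's constraints: if "linear increase" (R10) means the subband widths follow w(i) = a*i + b for i = 1..N, you pick w(1) yourself (R11), and the widths must sum to the 1024 FFT bins (R12), then you have two equations in the two unknowns a and b, and they fall out directly. A small sketch under that assumption (the linear form and the example numbers are mine, not taken from the article):

      # Solve for a and b in w(i) = a*i + b  given  w(1) = w1  and  sum(w(i), i=1..N) = total.
      # Assumption: the "linear increase" of R10 means exactly w(i) = a*i + b.
      def subband_widths(n_subbands, first_width, total_bins=1024):
          n, w1 = n_subbands, first_width
          # sum_{i=1..N} (a*i + b) = a*N*(N+1)/2 + b*N = total,  and  a + b = w1  =>  b = w1 - a
          a = (total_bins - w1 * n) / (n * (n - 1) / 2.0)
          b = w1 - a
          widths = [round(a * i + b) for i in range(1, n + 1)]
          return a, b, widths

      a, b, widths = subband_widths(n_subbands=32, first_width=2)
      print(a, b)            # the constants to hard-code in the source, per the article
      print(sum(widths))     # should be (close to) 1024 after rounding

    Rounding each w(i) to an integer bin count can leave the total a few bins off 1024; a common fix is to hand the leftover bins to the last subband.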

    Read the article

  • Message Driven Bean JMS integration

    - by Anthony Shorten
    In Oracle Utilities Application Framework V4.1 and above, the product introduced the concept of real-time JMS integration within the Framework for interfacing. Customers familiar with older versions of the Framework will recall that we used a component called the Multi-purpose Listener (MPL), which was a very light service bus for calling interface channels (including JMS). The MPL is not supplied with all products, and customers prefer to use Oracle SOA Suite and native methods rather than the MPL. In Oracle Utilities Application Framework V4.1 (and for Oracle Utilities Application Framework V2.2 via Patches 9454971, 9256359, 9672027 and 9838219) we introduced real-time JMS integration natively: outbound JMS integration, and Message Driven Beans (MDBs) for incoming integration. The outbound integration has not changed a lot between releases: you create an Outbound Message Type to indicate the record types to send out, create a JMS sender (though now you use the Real Time Sender), and then create an External System definition to complete the configuration. When an outbound message appears in the table of the type and external system configured (via a business event such as an algorithm or plug-in script), the Oracle Utilities Application Framework will place the message on the configured Queue linked to the JMS Sender. The inbound integration has changed. In the past you created XAI Receivers and specified configuration about what types of transactions to process. This is now all configuration file driven. The configuration files for the Business Application Server (ejb-jar.xml and weblogic-ejb-jar.xml) define Message Driven Beans and the queues to monitor. When a message appears on the queue, the MDB processes it through our web services interface. Configuration of the MDB can be native (via editing the configuration files) or through the new user exit capabilities (which are aimed at maintaining custom configuration across upgrades). The latter is better, as you build fragments of configuration which are easier to maintain. In the next few weeks a number of new whitepapers will be released to illustrate the features of the Oracle WebLogic JMS and Oracle SOA Suite integration capabilities.

    Read the article

  • How can I generate a view or projection matrix for OpenGL 3.+

    - by Ken
    I'm transitioning from OpenGL 2 to OpenGL 3.+ and to GLSL 1.5. I'm trying to avoid using the deprecated features. My question is: how do we now generate the view or projection matrix? I was using the matrix stack to calculate the projection matrix for me: GLfloat ptr[16]; gluPerspective(...); glGetFloatv(GL_MODELVIEW_MATRIX, ptr); //then pass ptr via a uniform to the shader But obviously the matrix stack is deprecated, so this approach is not the best option going forward. I have the 'Red Book', 7th ed., which covers 3.0 & 3.1, and it still uses the deprecated matrix functions in its examples. I could write some utility code myself to generate the matrices. But I don't want to re-invent this particular wheel, especially when this functionality is required for every 3D graphics program. What is the accepted way to generate world, view & projection matrices for OpenGL? Is there an emerging 'standard' library for this? Or is there some other hidden (to me) functionality in OpenGL/GLSL which I have overlooked?
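
    One common answer (not the only one): core-profile OpenGL ships no matrix utilities at all, so most C++ projects either pull in a small header-only library such as GLM (glm::perspective() and glm::lookAt()) or hand-roll the few matrices they need and upload them with glUniformMatrix4fv. The math behind gluPerspective is small enough to write yourself; here it is sketched in Python/NumPy purely to show the arithmetic. It is written row-major for readability, while OpenGL expects column-major storage, so transpose it (or pass transpose=GL_TRUE) when uploading.

      # Perspective projection matrix equivalent to gluPerspective(fovy, aspect, znear, zfar).
      # Row-major for readability; transpose before handing the 16 floats to OpenGL.
      import math
      import numpy as np

      def perspective(fovy_degrees, aspect, znear, zfar):
          f = 1.0 / math.tan(math.radians(fovy_degrees) / 2.0)   # cotangent of half the fov
          return np.array([
              [f / aspect, 0.0,  0.0,                              0.0],
              [0.0,        f,    0.0,                              0.0],
              [0.0,        0.0,  (zfar + znear) / (znear - zfar),  (2.0 * zfar * znear) / (znear - zfar)],
              [0.0,        0.0, -1.0,                              0.0],
          ], dtype=np.float32)

      proj = perspective(60.0, 16.0 / 9.0, 0.1, 100.0)
      print(proj)   # the 16 values to pass to the shader via a uniform

    A view matrix is built the same way from the gluLookAt math: a rotation formed from the camera's basis vectors plus a translation by the negated eye position.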

    Read the article
