Search Results

Search found 3574 results on 143 pages for 'difficult'.


  • IT Optimization Plan Pays Off For UK Retailer

    - by Brian Dayton
    I caught this article in ComputerworldUK yesterday. The headline notes that UK-based supermarket chain Morrisons is increasing its IT spend... OK, sounds good. Even nicer that Oracle is a big part of that. But three things caught my eye:
    1) Morrisons truly has a long-term strategy for IT - in this case, modernizing and optimizing how they use IT for business advantage.
    2) Even in a tough economic climate, Morrisons views IT investments as contributing to and improving the bottom line. Specifically, "The investment in IT contributed to a 21 percent increase in Morrison's underlying profit."
    3) The phased, 3-year "Optimization Plan" took a holistic approach to their business - from CRM and Supply Chain systems to the underlying application infrastructure. On the infrastructure front, adopting a more flexible Service-Oriented Architecture made them more agile in adapting their business, and Identity Management helped with sometimes mundane (but costly) issues like lost passwords and documenting who has access to what.
    Things don't always turn out so rosy, and I know it was a long and difficult process... but it's nice to see a happy ending every once in a while.

    Read the article

  • Mission critical embedded language

    - by Moe
    Maybe the question sounds a bit strange, so I'll explain the background a little. Currently I'm working on a project at my university, which will be the complete on-board software for a satellite. The system is programmed in C++ on top of a real-time operating system. However, some subsystems, like the attitude control system, the fault detection and a space simulation, are currently only implemented in Matlab/Simulink, to prototype the algorithms efficiently. After their verification, they will be translated into C++. The complete on-board software has grown very complex, and only a handful of people know the whole system. Furthermore, many of the students haven't programmed in C++ yet, and the manual memory management of C++ makes it even more difficult to write mission-critical software. Of course the main system has to be implemented in C++, but I asked myself whether it might be possible to use an embedded language to implement the subsystems which are currently written in Matlab. This embedded language should feature:
    - static/strong typing and compiler checks to minimize runtime errors
    - small memory usage and a relatively fast runtime
    - good numeric support (attitude control algorithms are mainly numerical computations)
    - maybe some sort of functional programming, since Matlab/Simulink encourages that style too
    I googled a bit, but only found Lua. It looks nice, but I would not use it in mission-critical software. Have you ever encountered a situation like this, or do you know any language which could satisfy these conditions? EDIT: To clarify some things: embedded means it should be possible to embed the language into the existing C++ environment, as in the sketch below. So no standalone compiled languages like Ada or Haskell ;)
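    For illustration only, here is a minimal sketch of what that kind of embedding looks like, using Lua (mentioned above) purely because its C API is the best-known example, not as a recommendation; everything shown is the standard Lua 5.x API, and the host C++ code stays in control:

        // Sketch: hosting an embedded interpreter inside the C++ application.
        #include <lua.hpp>      // Lua's C API (C++-friendly header)
        #include <iostream>

        int main() {
            lua_State* L = luaL_newstate();   // one interpreter instance
            luaL_openlibs(L);                 // load the standard libraries

            // Load a "subsystem" written in the embedded language.
            if (luaL_dostring(L, "function gain(x) return 2.5 * x end") != 0) {
                std::cerr << lua_tostring(L, -1) << std::endl;
                return 1;
            }

            lua_getglobal(L, "gain");         // push the function...
            lua_pushnumber(L, 4.0);           // ...and its argument
            lua_pcall(L, 1, 1, 0);            // call with 1 arg, expect 1 result
            std::cout << lua_tonumber(L, -1) << std::endl;   // prints 10

            lua_close(L);
            return 0;
        }

    A statically typed candidate would be wired in the same way: the C++ side owns the interpreter's lifecycle and exchanges values through a small, controlled interface.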

    Read the article

  • How to Attach Sticky-Note Reminders to Windows and Applications

    - by Erez Zukerman
    Some applications come with a boatload of keyboard shortcuts; these can make you very fast, but they can be difficult to remember, especially if you've customized some of them. What if you could have your own little cheat sheet that would pop up next to the application every time you ran it? Read on to see how you can make one. We're going to be using an excellent (and free) application called Stickies. If you don't have it yet, go to the Stickies homepage, download it, and install it.

    Read the article

  • Enum.HasFlag

    - by Scott Dorman
    An enumerated type, also called an enumeration (or just an enum for short), is simply a way to create a numeric type restricted to a predetermined set of valid values with meaningful names for those values. While most enumerations represent discrete values, or well-known combinations of those values, sometimes you want to combine values in an arbitrary fashion. These enumerations are known as flags enumerations because the values represent flags which can be set or unset. To combine multiple enumeration values, you use the logical OR operator. For example, consider the following:

        public enum FileAccess
        {
            None = 0,
            Read = 1,
            Write = 2,
        }

        class Program
        {
            static void Main(string[] args)
            {
                FileAccess access = FileAccess.Read | FileAccess.Write;
                Console.WriteLine(access);
            }
        }

    The output of this simple console application is 3, the numeric value associated with the combination of FileAccess.Read and FileAccess.Write. Clearly, this isn't the best representation. What you really want is for the output to read "Read, Write". To achieve this result, you simply add the Flags attribute to the enumeration. The Flags attribute changes how the string representation of the enumeration value is displayed when using the ToString() method. Although the .NET Framework does not require it, enumerations that will be used to represent flags should be decorated with the Flags attribute since it provides a clear indication of intent. One "problem" with flags enumerations is determining when a particular flag is set. The code to do this isn't particularly difficult, but unless you use it regularly it can be easy to forget. To test if the access variable has the FileAccess.Read flag set, you would use the following code:

        (access & FileAccess.Read) == FileAccess.Read

    Starting with .NET 4, a HasFlag method has been added to the Enum class which allows you to easily perform these tests:

        access.HasFlag(FileAccess.Read)

    This method follows one of the "themes" for the .NET Framework 4, which is to simplify and reduce the amount of boilerplate code like this you must write. Technorati Tags: .NET, C# 4
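    Putting the pieces the post describes together, a complete sketch of the corrected example (the Flags attribute plus both flag tests; FileAccess here is the post's own toy enum, not System.IO.FileAccess) looks like this:

        using System;

        [Flags]
        public enum FileAccess
        {
            None  = 0,
            Read  = 1,
            Write = 2,
        }

        class Program
        {
            static void Main()
            {
                FileAccess access = FileAccess.Read | FileAccess.Write;

                Console.WriteLine(access);   // "Read, Write" thanks to [Flags]

                // The manual test...
                Console.WriteLine((access & FileAccess.Read) == FileAccess.Read);  // True

                // ...and its .NET 4 equivalent.
                Console.WriteLine(access.HasFlag(FileAccess.Read));                // True
            }
        }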

    Read the article

  • How to deal with static utility classes when designing for testability

    - by Benedikt
    We are trying to design our system to be testable and, in most parts, developed using TDD. Currently we are trying to solve the following problem: in various places it is necessary for us to use static helper methods like ImageIO and URLEncoder (both standard Java API) and various other libraries that consist mostly of static methods (like the Apache Commons libraries). But it is extremely difficult to test the methods that use such static helper classes. I have several ideas for solving this problem:
    1. Use a mock framework that can mock static classes (like PowerMock). This may be the simplest solution but somehow feels like giving up.
    2. Create instantiable wrapper classes around all those static utilities so they can be injected into the classes that use them (sketched below). This sounds like a relatively clean solution, but I fear we'll end up creating an awful lot of those wrapper classes.
    3. Extract every call to these static helper classes into a function that can be overridden, and test a subclass of the class I actually want to test.
    But I keep thinking that this just has to be a problem that many people face when doing TDD - so there must already be solutions for it. What is the best strategy to keep classes that use these static helpers testable?
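    As a sketch of option 2 (the names ImageReader and ThumbnailService are made up for illustration), the wrapper is trivial to write, and the class under test receives it through its constructor, so a test can substitute a mock or fake:

        import javax.imageio.ImageIO;
        import java.awt.image.BufferedImage;
        import java.io.File;
        import java.io.IOException;

        // Thin, instantiable wrapper around the static utility.
        public class ImageReader {
            public BufferedImage read(File input) throws IOException {
                return ImageIO.read(input);   // the only direct static call left
            }
        }

        // The class under test depends on the wrapper, not on ImageIO itself.
        class ThumbnailService {
            private final ImageReader images;

            ThumbnailService(ImageReader images) {  // injected; tests pass a mock
                this.images = images;
            }

            int widthOf(File file) throws IOException {
                return images.read(file).getWidth();
            }
        }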

    Read the article

  • Ambiguous in the namespace problem

    - by Agha Usman Ahmed
    For the last few days, I had been ignoring an error that kept coming up at compile time. I spent some two hours on it but didn't get it to work. The error is quite confusing and, of course, difficult to manage:
    'ApplicationSettingsBase' is ambiguous in the namespace 'System.Configuration'
    'MailMessage' is ambiguous in the namespace 'System.Net.Mail'
    There are a couple of other similar errors pointing to ambiguous references in my project. The confusing part is that the MailMessage object throws a similar error when you import both the old and the new email namespaces. For example:
    Imports System.Web.Mail
    Imports System.Net.Mail
    So if you are only encountering the ambiguity on the MailMessage object, it is likely that you have imported both namespaces in your code-behind, which confuses the compiler about which type you are referencing. The quick fix for this problem is to remove Imports System.Web.Mail, and it should work smoothly (or keep both and qualify the type, as in the sketch at the end of this post). But in my case, I never used the old ASP.NET mail namespace in my project. So I started looking at my references, and luckily I found the problem there. Follow the steps below to investigate the issue:
    1. Go to your project.
    2. Then References.
    3. Right-click on "System" and see Properties. It should point to the following path: x:\Windows\Microsoft.NET\Framework\v2.0.50727\System.dll, where x is the drive holding your operating system directory.
    This was the problem with my project. I had my operating system installed on the "D" drive, and somehow the reference was pointing to the "C" drive, which was the root cause of this problem. After that I verified all my references, found 5-6 assemblies that were pointing to the wrong path, and got it working. Also note, the problem can occur in any type of project, whether it is a website, a web application, etc.
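    For completeness, a sketch of the alternative when you really do need both namespaces: fully qualify the type, or give one namespace an alias (the addresses below are placeholders):

        ' Alias one namespace so the compiler knows which MailMessage you mean.
        Imports NetMail = System.Net.Mail

        Module MailExample
            Sub SendTest()
                Dim message As New NetMail.MailMessage("from@example.com", "to@example.com")
                message.Subject = "Disambiguated"
                ' ... configure an SmtpClient and send as usual ...
            End Sub
        End Module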

    Read the article

  • Dynamic Forms: Pattern or AntiPattern?

    - by Segfault
    I'm sure you've seen it. The database has a bunch of tables called Forms, Controls, FormsControls, ControlSets and Actions, and the program that queries these tables has a dynamically generated user interface. It will read all the forms, load a home page that has links to them all (or embed them in some tabbed or paged home page), and for each of those forms it will read the various text boxes, check boxes, radio buttons, submit buttons, combo boxes, labels and whatnot from the controls and form-to-control join tables, lay those elements out according to the database, and link all the controls to logic according to other rules in the database. To me, this is an anti-pattern. It actually makes the application more difficult to maintain, because its design is now spread out across multiple different systems. Also, the database is not source controlled. Sure, it may make one or two changes go more quickly - after you've analyzed the program to understand how to change the data, and as long as you don't stray from the sorts of changes that were anticipated and accounted for - but that's often just not sustainable. What say you?

    Read the article

  • Keep Your Eye on the Ball

    - by [email protected]
    With the FIFA World Cup 2010 in South Africa almost a week underway, soccer fans all around the world are talking about at least two things: that typical vuvuzela sound, and the new Jabulani ball - saying it moves unpredictably, is difficult to handle, and that the altitude of the World Cup stadiums somehow seems to be a contributing factor. (Picture taken from http://www.flickr.com/photos/warrenski/4143923059/ under a Creative Commons license.) Although FIFA states that it hasn't received any official complaints, the end users don't seem to be very happy with this new ball. This brings me to a comparison with IT management and testing. When you're introducing a new product - in IT terms, a new application - you would like to test all possible scenarios that your end users could be using and experiencing. However, that's a very time- and resource-intensive process to repeat for every application change or update. It's like getting ready for the big game with no game plan. That's why a new approach has been developed, one based on the 80/20 rule: testing 80% of the application will cost about 20% of the effort. The remaining 20% of your application will not be tested before deployment, but monitored with a real user monitoring solution immediately after deployment. These tools track all user experiences, including error messages and the performance and availability metrics from an end user perspective. Should any anomaly occur, you would be able to repair it quickly so you and your end users can get back into the game. These real user sessions can be easily converted into testing scripts, so the 80% of the application testing can be complemented by the remaining 20%. The Oracle Enterprise Manager 11g group of products offers both the real user monitoring solution, with Oracle Real User Experience Insight, and the required testing solution, with Oracle Application Testing Suite. Visit our Oracle Enterprise Manager 11g resource center and find out how its Business-Driven IT Management approach will help you keep your eye on your business ball. Happy World Cup.

    Read the article

  • Git-based storage and publishing, infrastructure advice

    - by Joel Martinez
    I wanted to get some advice on moving a system to "the cloud"... specifically, I'm looking to move into some of Windows Azure's managed services, as right now I'm managing a VM. Basically, the system operates on some data stored in a GitHub git repository. I'll describe the current architecture (all hosted on a single server):
    1. GitHub - configured with a webhook pointing at...
    2. an ASP.NET MVC application - which accepts the webhook from git and pushes a message onto...
    3. an Azure Service Bus queue - which is drained by...
    4. a Windows service - which pulls the message from the queue, ...
    5. fetches the latest data from the git repository (using LibGit2Sharp) onto the local disk, and finally...
    6. operates on the data in git to produce a static HTML website hosted/served by IIS.
    The system works really well, actually... but I would like to get out of the business of managing the VM and move to some combination of Azure web and worker roles. But because the system relies so heavily on the git repository on the local filesystem, I'm finding it difficult to figure out how to architect this in the cloud. I know you can get file system access, so in theory I could just fetch the repository if there's nothing on disk (see the sketch below)... but the performance/responsiveness of the system sort of depends on the repository being available and only having to fetch diffs, which is relatively quick - as opposed to periodically having to fetch the entire (somewhat large) git repository if the web or worker role was recycled. So I would love some advice on how you would architect such a system :) Ultimately, the only real requirement is to be able to serve HTML content that's been produced from the contents of a git repository (in a relatively responsive manner, from a publishing perspective)... please feel free to ask any clarifying questions if there's something I omitted. Thanks!
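    For what it's worth, a sketch of the fetch-or-clone logic in question, written against LibGit2Sharp (exact API names have varied between versions of the library, so treat this as an assumption-laden outline rather than drop-in code):

        using System.IO;
        using LibGit2Sharp;

        static class RepoCache
        {
            public static void EnsureUpToDate(string url, string localPath)
            {
                if (!Directory.Exists(Path.Combine(localPath, ".git")))
                {
                    // Cold start, e.g. a freshly recycled worker role:
                    // pay the cost of a full clone once.
                    Repository.Clone(url, localPath);
                    return;
                }

                // Warm path: fetch only the new objects, which is what
                // keeps publishing responsive.
                using (var repo = new Repository(localPath))
                {
                    Commands.Fetch(repo, "origin", new string[0], null, null);
                }
            }
        }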

    Read the article

  • Remembering September 11 - 11 Years Later

    - by user12613380
    It's September 11 again, and time to reminisce about that fateful day when the world came together as one. The attacks of that day touched everyone around the world, as almost 3,000 people from the United States and 38 other countries were killed. This year, I am finding it difficult to say anything other than what I have said in previous years. So, I will not try to "wax loquacious." Instead, I will simply say that I will never forget. I will not forget where I was on that day. I will not forget the people who died. I will not forget the people who gave their lives so that others might live. And I will not forget how our world changed on that day. And with that remembrance, we again return to our lives, using tragedy to drive us to build a world of peace and opportunity. My thanks go out again to the men and women, uniformed or not, who continue to protect us from harm. May we never again experience such human tragedy, on U.S. soil or elsewhere.

    Read the article

  • Turning on the wireless card freezes Ubuntu 12.04

    - by Ryan Schram
    I've been using Ubuntu 12.04 for close to a year. Occasionally, after booting into GNOME 3, the desktop is unresponsive to mouse clicks: the cursor moves but can't open menus. Also, the network icon displays a red error badge instead of the usual gray x indicating that networking is off. The only way I have found to recover is to switch to a terminal shell (Ctrl-Alt-F5), log in and kill gnome-session. This is an intermittent, occasional problem. Recently I moved house and set up this computer in my new apartment. Now, for the first time, I have a new and more difficult problem: switching on the wireless card (wlan0) to connect to the wifi router causes the desktop to freeze. Switching to a terminal via Ctrl-Alt-F5 doesn't work. The only way out is a hard reboot from the power switch. How can I diagnose this second, more recent problem? Is there a solution? Cheers, Ryan

    Read the article

  • Emacs: methods for debugging python

    - by Tom Willis
    I use Emacs for all my code editing needs. Typically I will use M-x compile to run my test runner, which I would say gets me about 70% of what I need to keep the code on track. Lately, though, I've been wondering how I might use M-x pdb on occasions where it would be nice to hit a breakpoint and inspect things. In my googling I've found some things suggesting this is useful/possible, but I have not managed to get it working in a way that I fully understand. I don't know if it's the combination of buildout + App Engine that makes it more difficult, but when I try something like M-x pdb, Run pdb (like this):

        /Users/twillis/projects/hydrant/bin/python /Users/twillis/bin/pdb /Users/twillis/projects/hydrant/bin/devappserver /Users/twillis/projects/hydrant/parts/hydrant-app/

    where .../bin/python is the interpreter buildout makes with the path set for all the eggs, and ~/bin/pdb is a simple script to call into pdb.main using the current Python interpreter:

        #! /usr/bin/env python
        if __name__ == "__main__":
            import sys
            sys.version_info
            import pdb
            pdb.main()

    .../bin/devappserver is the dev_appserver script that the buildout recipe makes for the GAE project, and .../parts/hydrant-app is the path to the app.yaml. I am first presented with a prompt:

        Current directory is /Users/twillis/bin/

    On C-c C-f nothing happens, but

        HellooKitty:hydrant twillis$ ps aux | grep pdb
        twillis  469 100.0  1.6 168488 67188 s002 Rs+ 1:03PM 0:52.19 /usr/local/bin/python2.5 /Users/twillis/projects/hydrant/bin/python /Users/twillis/bin/pdb /Users/twillis/projects/hydrant/bin/devappserver /Users/twillis/projects/hydrant/parts/hydrant-app/
        twillis  477   0.0  0.0 2435120  420 s000 R+  1:05PM 0:00.00 grep pdb

    so something is happening, and C-x [space] will report that a breakpoint has been set. But I can't manage to get things going. It feels like I am missing something obvious here. Am I? So: is interactive debugging in Emacs worthwhile? Is interactively debugging a Google App Engine app possible? Any suggestions on how I might get this working?

    Read the article

  • Microsoft's new technical computing initiative

    - by Randy Walker
    I made a mental note from earlier in the year: Microsoft literally buys computers by the truckload. From what I understand, it's a typical practice amongst large software vendors. You plug a few wires in, you test it, and you instantly have mega tera tera flops (don't hold me to that number). Microsoft has been plugging away at its cloud services (named Azure), which, for the layman, means Microsoft runs your software on their computers, and as demand increases you can allocate more computing power on the fly.
    With this in mind, it doesn't surprise me that I was recently sent an executive email concerning Microsoft's new technical computing initiative. I find it to be a great marketing idea with actual substance behind the real work. From the programmer's academic perspective, in college we dreamed about this type of processing power. It has decades of computer science theory behind it.
    A copy of the email received (note that I almost deleted this email, thinking it was spam due to its length):

    We don't often think about how complex life really is. Take the relatively simple task of commuting to and from work: it is, in fact, a complicated interplay of variables such as weather, train delays, accidents, traffic patterns, road construction, etc. You can however, take steps to shorten your commute - using a good, predictive understanding of a few of these variables. In fact, you probably are already taking these inputs and instinctively building a predictive model that you act on daily to get to your destination more quickly.
    Now, when we apply the same method to very complex tasks, this modeling approach becomes much more challenging. Recent world events clearly demonstrated our inability to process vast amounts of information and variables that would have helped to more accurately predict the behavior of global financial markets or the occurrence and impact of a volcano eruption in Iceland. To make sense of issues like these, researchers, engineers and analysts create computer models of the almost infinite number of possible interactions in complex systems. But, they need increasingly more sophisticated computer models to better understand how the world behaves and to make fact-based predictions about the future. And, to do this, it requires a tremendous amount of computing power to process and examine the massive data deluge from cameras, digital sensors and precision instruments of all kinds. This is the key to creating more accurate and realistic models that expose the hidden meaning of data, which gives us the kind of insight we need to solve a myriad of challenges.
    We have made great strides in our ability to build these kinds of computer models, and yet they are still too difficult, expensive and time consuming to manage. Today, even the most complicated data-rich simulations cannot fully capture all of the intricacies and dependencies of the systems they are trying to model. That is why, across the scientific and engineering world, it is so hard to say with any certainty when or where the next volcano will erupt and what flight patterns it might affect, or to more accurately predict something like a global flu pandemic. So far, we just cannot collect, correlate and compute enough data to create an accurate forecast of the real world. But this is about to change. Innovations in technology are transforming our ability to measure, monitor and model how the world behaves.
    The implication for scientific research is profound, and it will transform the way we tackle global challenges like health care and climate change. It will also have a huge impact on engineering and business, delivering breakthroughs that could lead to the creation of new products, new businesses and even new industries.
    Because you are a subscriber to executive e-mails from Microsoft, I want you to be the first to know about a new effort focused specifically on empowering millions of the world's smartest problem solvers. Today, I am happy to introduce Microsoft's Technical Computing initiative. Our goal is to unleash the power of pervasive, accurate, real-time modeling to help people and organizations achieve their objectives and realize their potential. We are bringing together some of the brightest minds in the technical computing community across industry, academia and science at www.modelingtheworld.com to discuss trends, challenges and shared opportunities. New advances provide the foundation for tools and applications that will make technical computing more affordable and accessible where mathematical and computational principles are applied to solve practical problems. One day soon, complicated tasks like building a sophisticated computer model that would typically take a team of advanced software programmers months to build and days to run, will be accomplished in a single afternoon by a scientist, engineer or analyst working at the PC on their desktop. And as technology continues to advance, these models will become more complete and accurate in the way they represent the world. This will speed our ability to test new ideas, improve processes and advance our understanding of systems.
    Our technical computing initiative reflects the best of Microsoft's heritage. Ever since Bill Gates articulated the then far-fetched vision of "a computer on every desktop" in the early 1980's, Microsoft has been at the forefront of expanding the power and reach of computing to benefit the world. As someone who worked closely with Bill for many years at Microsoft, I am happy to share with you that the passion behind that vision is fully alive at Microsoft and is carried out in the creation of our new Technical Computing group.
    Enabling more people to make better predictions
    We have seen the impact of making greater computing power more available firsthand through our investments in high performance computing (HPC) over the past five years. Scientists, engineers and analysts in organizations of all sizes and sectors are finding that using distributed computational power creates societal impact, fuels scientific breakthroughs and delivers competitive advantages. For example, we have seen remarkable results from some of our current customers:
    - Malaria strikes 300,000 to 500,000 people around the world each year. To help in the effort to eradicate malaria worldwide, scientists at Intellectual Ventures use software that simulates how the disease spreads and would respond to prevention and control methods, such as vaccines and the use of bed nets. Technical computing allows researchers to model more detailed parameters for more accurate results and receive those results in less than an hour, rather than waiting a full day.
    - Aerospace engineering firm a.i. solutions, Inc. needed a more powerful computing platform to keep up with the increasingly complex computational needs of its customers: NASA, the Department of Defense and other government agencies planning space flights. To meet that need, it adopted technical computing. Now, a.i. solutions can produce detailed predictions and analysis of the flight dynamics of a given spacecraft, from optimal launch times and orbit determination to attitude control and navigation, up to eight times faster. This enables them to avoid mistakes in any areas that can cause a space mission to fail and potentially result in the loss of life and millions of dollars.
    - Western & Southern Financial Group faced the challenge of running ever larger and more complex actuarial models as its number of policyholders and products grew and regulatory requirements changed. The company chose an actuarial solution that runs on technical computing technology. The solution is easy for the company's IT staff to manage and adjust to meet business needs. The new solution helps the company reduce modeling time by up to 99 percent - letting the team fine-tune its models for more accurate product pricing and financial projections.
    Our Technical Computing direction
    Collaborating closely with partners across industry and academia, we must now extend the reach of technical computing even further to help predictive modelers and data explorers make faster, more accurate predictions. As we build the Technical Computing initiative, we will invest in three core areas:
    - Technical computing to the cloud: Microsoft will play a leading role in bringing technical computing power to scientists, engineers and analysts through the cloud. Existing high-performance computing users will benefit from the ability to augment their on-premises systems with cloud resources that enable 'just-in-time' processing. This platform will help ensure processing resources are available whenever they are needed - reliably, consistently and quickly.
    - Simplify parallel development: Today, computers are shipping with more processing power than ever, including multiple cores, but most modern software only uses a small amount of the available processing power. Parallel programs are extremely difficult to write, test and troubleshoot. However, a consistent model for parallel programming can help more developers unlock the tremendous power in today's modern computers and enable a new generation of technical computing. We are delivering new tools to automate and simplify writing software through parallel processing from the desktop... to the cluster... to the cloud.
    - Develop powerful new technical computing tools and applications: We know scientists, engineers and analysts are pushing common tools (i.e., spreadsheets and databases) to the limits with complex, data-intensive models. They need easy access to more computing power and simplified tools to increase the speed of their work. We are building a platform to do this. Our development efforts will yield new, easy-to-use tools and applications that automate data acquisition, modeling, simulation, visualization, workflow and collaboration. This will allow them to spend more time on their work and less time wrestling with complicated technology.
    Thinking bigger
    There is so much left to be discovered and so many questions yet to be answered in the fascinating world around us. We believe the technical computing community will show us that we have not seen anything yet. Imagine just some of the breakthroughs this community could make possible:
    - Better predictions to help improve the understanding of pandemics, contagion and global health trends.
    - Climate change models that predict environmental, economic and human impact, accessible in real-time during key discussions and debates.
    - More accurate prediction of natural disasters and their impact to develop more effective emergency response plans.
    With an ambitious charter in hand, this new team is ready to build on our progress to date and execute Microsoft's technical computing vision over the months and years ahead. We will steadily invest in the right technologies, tools and talent, and work to bring together the technical computing community. I invite you to visit www.modelingtheworld.com today. We welcome your ideas and feedback. I look forward to making this journey with you and others who want to answer the world's biggest questions, discover solutions to problems that seem impossible and uncover a host of new opportunities to change the world we live in for the better.
    Bob

    Read the article

  • The Strange History of the Honeywell Kitchen Computer

    - by Jason Fitzpatrick
    In 1969 the Honeywell corporation released a $10,000 kitchen computer that weighed 100 pounds, was as big as a table, and required advanced programming skills to use. Shockingly, they failed to sell a single one. Read on to be dumbfounded by how ahead of (and out of touch with) its time the Honeywell Kitchen Computer was. Wired delves into the history of the device, including how difficult it was to use: Now try to imagine all that in a late-1960s kitchen. A full H316 system wouldn't have fit in most kitchens, says design historian Paul Atkinson of Britain's Sheffield Hallam University. Plus, it would have looked entirely out of place. The thought that an average person, like a housewife, could have used it to streamline chores like cooking or bookkeeping was ridiculous, even if she aced the two-week programming course included in the $10,600 price tag. If the lady of the house wanted to build her family's dinner around broccoli, she'd have to code in the green veggie as 0001101000. The kitchen computer would then suggest foods to pair with broccoli from its database by "speaking" its recommendations as a series of flashing lights. Think of a primitive version of KITT, without the sexy voice. Hit up the link below for the full article.

    Read the article

  • Friday Fun: Wake Up the Box

    - by Mysticgeek
    Another Friday, and it's time to waste the rest of it playing a fun flash game online. Today we take a look at a relaxing physics-based puzzle game called Wake Up the Box. The goal of this game is to wake up the box character by attaching parts to existing wood objects in each stage. You can start a new game or continue your progress from where you left off. At the beginning you get a tutorial showing what you need to do to wake the box. You get wood parts and can attach them to other wood pieces, but not to metal or brick. After successfully waking up Mr. Box, you can go to the next level, or restart a level at any time if you're having problems figuring out the puzzle. Each level gets more difficult and the puzzles more challenging. Wake Up the Box is a relaxing yet challenging game that will let you have fun instead of working on TPS reports until the whistle blows. Play Wake Up the Box at FreeWebArcade.

    Read the article

  • Is meta description still relevant?

    - by Jeff Atwood
    I received this bit of advice about the meta description tag recently: "Meta descriptions are used by Google probably 80% of the time for the snippet. They don't help with rankings, but you should probably use them. You could just auto-generate them from the first part of the question." The description tag lives in the header, like so:

        <meta name="Description" content="A brief summary of the content on the page.">

    I'm not sure why we would need this field, as Google seems perfectly capable of showing the relevant search terms in context in the search result pages; when I searched for "c# list performance", the snippets showed my search terms in context, pulled straight from the pages. In other words, where would a meta description summary improve these results? We want the page to show context around the actual search hits, not a random summary we inserted! Google Webmaster Central has this advice: For some sites, like news media sources, generating an accurate and unique description for each page is easy: since each article is hand-written, it takes minimal effort to also add a one-sentence description. For larger database-driven sites, like product aggregators, hand-written descriptions are more difficult. In the latter case, though, programmatic generation of the descriptions can be appropriate and is encouraged - just make sure that your descriptions are not "spammy." Good descriptions are human-readable and diverse, as we talked about in the first point above. The page-specific data we mentioned in the second point is a good candidate for programmatic generation. I'm struggling to think of any scenario where I would want the Google-generated summary - that is, actual context from the page for the search terms - to be replaced by a hard-coded meta description summary of the question itself.

    Read the article

  • How do I make complex SQL queries easier to write?

    - by DragonLord
    I'm finding it very difficult to write complex SQL queries involving joins across many (at least 3-4) tables and several nested conditions. The queries I'm being asked to write are easily described in a few sentences, but can require a deceptive amount of code to complete. I often find myself using temporary views to write these queries, which feels like a bit of a crutch. What tips can you provide that would make these complex queries easier to write? More specifically, how do I break these queries down into the steps I need to follow to actually write the SQL code? Note that the SQL I'm being asked to write is part of homework assignments for a database course, so I don't want software that will do the work for me; I want to actually understand the code I'm writing. More technical details: the database is hosted on a PostgreSQL server running on the local machine; the database is very small (no more than seven tables, and the largest table has fewer than about 50 rows); and the SQL queries are passed unchanged to the server, via LibreOffice Base.
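    Not to do the homework for the asker, but a concrete illustration of the decomposition idea (the tables here are invented for the example): instead of a temporary view, each logical step can be named inline with a common table expression, which PostgreSQL supports:

        -- Hypothetical schema: students, enrollments, courses.
        WITH active_students AS (
            SELECT s.id, s.name
            FROM students s
            WHERE s.enrolled
        ),
        course_loads AS (
            SELECT e.student_id, COUNT(*) AS courses
            FROM enrollments e
            JOIN courses c ON c.id = e.course_id
            GROUP BY e.student_id
        )
        SELECT a.name, l.courses
        FROM active_students a
        JOIN course_loads l ON l.student_id = a.id
        WHERE l.courses >= 3;

    Each WITH block answers one of the "few sentences" in the problem statement, and the final SELECT then reads almost like the English description.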

    Read the article

  • Writing a "Hello World" Device Driver for kernel 2.6 using Eclipse

    - by Isaac
    Goal: I am trying to write a simple device driver on Ubuntu. I want to do this using Eclipse (or a better IDE that is suitable for driver programming). Here is the code:

        #include <linux/module.h>

        static int __init hello_world( void )
        {
            printk( "hello world!\n" );
            return 0;
        }

        static void __exit goodbye_world( void )
        {
            printk( "goodbye world!\n" );
        }

        module_init( hello_world );
        module_exit( goodbye_world );

    My effort: After some research, I decided to use Eclipse CDT for developing the driver (while I am still not sure if it supports multi-threaded debugging tools). So I:
    1. Installed Ubuntu 11.04 desktop x86 on a VMWare virtual machine,
    2. Installed eclipse-cdt and linux-headers-2.6.38-8 using Synaptic Package Manager,
    3. Created a C project named TestDriver1 and copy-pasted the above code into it,
    4. Changed the default build command, make, to the following customized build command: make -C /lib/modules/2.6.38-8-generic/build M=/home/isaac/workspace/TestDriver1
    The problem: I get an error when I try to build this project using Eclipse. Here is the log for the build:

        **** Build of configuration Debug for project TestDriver1 ****
        make -C /lib/modules/2.6.38-8-generic/build M=/home/isaac/workspace/TestDriver1 all
        make: Entering directory `/usr/src/linux-headers-2.6.38-8-generic'
        make: *** No rule to make target `vmlinux', needed by `all'. Stop.
        make: Leaving directory `/usr/src/linux-headers-2.6.38-8-generic'

    Interestingly, I get no error when I use the shell instead of Eclipse to build this project. To use the shell, I just create a Makefile containing obj-m += TestDriver1.o (the full conventional form is sketched below) and use the above make command to build. So something must be wrong with the Eclipse Makefile. Maybe it is looking for the vmlinux architecture (?) or something while the current architecture is x86. Maybe it's because of VMWare? As I understand it, Eclipse creates the makefiles automatically, and modifying one manually would cause errors in the future OR make managing the makefile difficult. So, how can I compile this project in Eclipse?
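    For reference, the shell build described above works against a conventional out-of-tree module Makefile like the following sketch; note that kbuild needs the explicit modules target, which is one common reason an IDE's default all target fails:

        # Conventional out-of-tree kernel module Makefile (sketch).
        # Recipe lines below must be indented with a real tab.
        obj-m += TestDriver1.o

        all:
        	make -C /lib/modules/$(shell uname -r)/build M=$(PWD) modules

        clean:
        	make -C /lib/modules/$(shell uname -r)/build M=$(PWD) clean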

    Read the article

  • Building a common syntax and scoping framework.

    - by Ben DeMott
    Hello fellow programmers, I was discussing a project the other day with a colleague of mine, and I was curious to see what others had to say or whether such a thing already exists. Background: There are many programming languages, and many IDEs and source editors that highlight and edit source code. Following the rules of a language perfectly and exactly, to present auto-complete options and understand scopes in the code, is rather complex. This task is complex enough that most IDEs implement different source editors as plugins, which often re-implement the same features over and over but in a different way (NetBeans). From what I can tell, most IDEs and source editors re-implement parsers that use regular expressions, or some meta-syntax in Backus-Naur Form to describe the language's grammar generically. These parsers are implemented over and over and over again. Question: Has anyone attempted to unify or describe a set of features through an API and have a consistent interface to parsing various programming languages and dialects? I'm not describing an IDE, but a consistent API for any program to use to parse and obtain meta-information from source code. I realize various programming languages offer many different features which are difficult to 'abstract' into a set of features, but I feel this would be a worthwhile venture. It seems to me that this could allow the authors of interpreters to help maintain a central grammar interpreter for their language: the Python foundation could maintain the Python grammar API, ANSI the C grammar API, Oracle the Java grammar API, etc. Example usage: If this API existed, code documentation generators could theoretically work across all dialects and languages to some level. It wouldn't matter if your project used 5 different languages; a single application could document all of them, including the comments and doc-tags within. Has anyone attempted this comprehensively?

    Read the article

  • How to improve battery life on Samsung 13.3” Series 7 Ultra (NP730U3E-S01AU)?

    - by beam022
    Recently I bought a Samsung Series 7 Ultra ultrabook and decided to change the OS from the originally installed Windows 8 to Ubuntu 14.04 LTS. However, it's difficult not to notice a great decrease in battery life: on pre-installed Windows 8 the battery would last for about 6 hours, while on Ubuntu it's almost empty after 2 hours of the same kind of work (wi-fi, web, VLC, Spotify, IntelliJ IDEA). I'm not here to say that Ubuntu's battery performance is worse than Windows', but to ask for suggestions on how to improve the situation (2 hours of work is pretty poor battery life). Can you recommend some sources, applications or tips/tricks that would improve battery life on my ultrabook? I really like the Ubuntu experience, but this makes my machine much less reliable. I suspect that the graphics card might be one of the issues here. Here are the tech specs of the ultrabook:
    Processor: Intel® Core™ i5 Processor 3337U (1.80GHz, 3MB L3 Cache)
    Chipset: Intel HM76
    Graphics: AMD Radeon™ HD 8570M with 1GB gDDR3 graphics memory (PowerExpress) and Intel® HD Graphics 4000
    Display: 13.3" SuperBright+ 350nit FHD LED Display (1920 x 1080), Anti-Reflective
    Memory: 10GB DDR3 System Memory at 1,600MHz
    Hard drive: 128GB Solid-state Drive
    More information is on the official page. If it's helpful to provide additional info, I'm happy to do it; just let me know what you need. Thank you.

    Read the article

  • From .psd to working HTML and CSS - help me suck less

    - by kevinmajor1
    I am not much of a designer; my strength lies in coding. That said, I'm often forced into the role of "The Man," responsible for all aspects of site creation. So I'm wondering if the pros can give me tips/solutions/links to tutorials for my main questions:
    1. Resolution. What should I aim for? What are the lower and upper edges I should be aware of? I know that systems like 960 Grid were popular recently; is that the number I should still aim for?
    2. Slicing up a .psd - are there any tricks I should know? I've always found it difficult to get my slices pixel-perfect, and I'm also really slow at it. I must be looking at it wrong, or missing something fundamental.
    3. The same goes for text. Layouts are always filled with the classic "Lorem...", but I can never seem to get real content to fit quite as well on the screen.
    4. The advanced (to me, anyway) looking things, like part of a logo/image overlaying what looks like a content area. How does one do that?
    5. How are layouts changed/informed by the decision to go fixed or liquid?
    Again, any tips/tricks/suggestions/tutorials you can share would be greatly appreciated.

    Read the article

  • SQLAuthority News – Resolution for New Year 2011

    - by pinaldave
    Today is the first day of the year, so I want to write something very light. Last Year: 2010. Last year was a blast; I really traveled a lot. My family and I went on vacation, where I enjoyed being a father, rolling on the floor and playing with my daughter. Here is the list of the countries I visited throughout 2010: Singapore (twice), Malaysia (twice), Sri Lanka (thrice), Nepal (once), United States of America (twice), United Arab Emirates (once). My daughter, who completed her first year on September 1, 2010, has so far visited three countries: Singapore, Malaysia and Sri Lanka, where I have done lots of community activities. The list containing all my activities can be found at Pinal Dave's Community Events. I wrote nearly 380 blog posts last year; it would be difficult for me to pick a few, but I keep a running list of all of my articles at All Articles on SQLAuthority.com. I have received more than 10,000 email questions during the year and have done my best to answer most of them - I strongly believe that if one searched the SQLAuthority.com blog, one would find the answer quickly. The best part of 2010 for me was working on SQL Server Health Check and SQL Server Performance Tuning. This Year: 2011. This year, I came up with two simple goals: 1. Personal goal: reduce weight. 2. Professional goal: stay busy for the entire year with SQL Server performance tuning projects (currently January 2011 is booked with performance tuning projects, and 40 other days are already booked throughout the year). Future: The future is something one cannot exactly guess and one cannot see. I just want to wish all of you the very best for this coming New Year. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: About Me, Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology

    Read the article

  • Atheros wireless not working

    - by Chandru1
    I have been struggling hard since I installed Ubuntu 10.10, but it has been difficult for me to get my wifi working. Here is what I tried. First I checked whether I have the driver, using the ifconfig command, which shows the wireless LAN interface as wlan0. Next, I tried the command iwlist wlan0 scanning as root, which gave me the output "no scan results". Next, I visited this link https://help.ubuntu.com/community/WifiDocs/Driver/Atheros to see what problem my laptop may have. I do have an ath5k chipset. As I followed the instructions in the above link, one of the files, blacklist-ath_pci.conf, had this written in it: "For some Atheros 5K RF MACs, the madwifi driver loads but fails to correctly initialize the hardware, leaving it in a state from which ath5k cannot recover. To prevent this condition, stop madwifi from loading by default. Use Jockey to select one driver or the other. (Ubuntu: #315056, #323830)" I am not that good at Linux, but I have given it a try. I am desperate to have my wifi working and I would be glad if this community could help. ADDED: If anyone would like to know what drivers I am using, this is the output:

        network
            description: Wireless interface
            product: AR2413 802.11bg NIC
            vendor: Atheros Communications Inc.
            physical id: 3
            bus info: pci@0000:0a:03.0
            logical name: wlan0
            version: 01
            serial: 00:19:7d:d3:0c:fd
            width: 32 bits
            clock: 33MHz
            capabilities: pm bus_master cap_list ethernet physical wireless
            configuration: broadcast=yes driver=ath5k driverversion=2.6.35-24-generic firmware=N/A latency=168 link=no maxlatency=28 mingnt=10 multicast=yes wireless=IEEE 802.11bg
            resources: irq:18 memory:d0000000-d000ffff

    Some more information and output:

        lsmod | grep ath
        ath5k     130083  0
        mac80211  231541  1 ath5k
        ath         8153  1 ath5k
        cfg80211  144470  3 ath5k,mac80211,ath
        led_class   2633  1 ath5k

    Read the article

  • chromium-browser uses 99.99% of disk I/O

    - by lars
    My favorite browser, Chromium, is testing my patience. For some reason it sometimes uses 99.99% of disk I/O (reading 2-3 MB/s). Other processes (updatedb.mlocate, [kswapd0], clementine, compiz) show the same behavior; however, this problem always starts and ends with Chromium. To illustrate the impact on my system: when my disk starts to spin like crazy and the LED burns continuously, the system is so slow that it takes about two to five minutes to switch to tty6, log in and execute "killall chromium-browser && killall chromium". This is way faster than starting a new terminal in X; just starting a terminal seems too heavy for Compiz under these circumstances. Waiting until it's over takes more than 30 minutes, if it ends at all. The exact circumstances are difficult to replicate: several tabs have to be open, usually 8 or more, and the chance seems to increase when more complex sites like Gmail or plugins like Flash are running. Opening several new tabs at omgubuntu.co.uk has the best chance of replicating this issue. I have no idea where to start looking for a solution. Any help would be greatly appreciated.
    Ubuntu 12.10 | 2GB | 2x 1.66GHz Intel | 32-bit | IBM Thinkpad R60e

    Read the article

  • Thoughts on iPhone, Flash, IE

    - by guybarrette
    It's interesting to see the debate sparked by Apple's stance on Flash for the iPhone. In the new version of the iPhone Developer Program License Agreement, Apple bans Flash and MonoTouch: "3.3.1 - Applications may only use Documented APIs in the manner prescribed by Apple and must not use or call any private APIs. Applications must be originally written in Objective-C, C, C++, or JavaScript as executed by the iPhone OS WebKit engine, and only code written in C, C++, and Objective-C may compile and directly link against the Documented APIs (e.g., Applications that link to Documented APIs through an intermediary translation or compatibility layer or tool are prohibited)." In Adobe's latest SEC filing (http://www.sec.gov/Archives/edgar/data/796343/000079634310000007/form_10q.htm#riskfactors), they list the iPhone/iPad as a threat to their business: "We offer our desktop application-based products primarily on Windows and Macintosh platforms. We generally offer our server-based products on the Linux platform as well as the Windows and UNIX platforms. To the extent that there is a slowdown of customer purchases of personal computers on either the Windows or Macintosh platform or in general, to the extent that we have difficulty transitioning product or version releases to new Windows and Macintosh operating systems, or to the extent that significant demand arises for our products or competitive products on other platforms before we choose and are able to offer our products on these platforms our business could be harmed. Additionally, to the extent new releases of operating systems or other third-party products, platforms or devices, such as the Apple iPhone or iPad, make it more difficult for our products to perform, and our customers are persuaded to use alternative technologies, our business could be harmed." I had a conversation recently about IE9, and people were asking why Microsoft is spending money and resources to build IE9 now that we have Silverlight. It makes no sense to put so much effort into supporting HTML5 in IE because it overlaps with Silverlight, no? Well, what if Chrome became the dominant browser and all of a sudden Google removed the object tag? Would Microsoft be in the same position as Adobe is right now on the iPhone? What do you think?

    Read the article
