Search Results

Search found 1343 results on 54 pages for 'challenge'.


  • Deliberate Practice

    - by Jeff Foster
    It’s easy to assume, as software engineers, that there is little need to “practice” writing code. After all, we write code all day long! Just by writing a little each day, we’re constantly learning and getting better, right? Unfortunately, that’s just not true. Of course, developers do improve with experience. Each time we encounter a problem we’re more likely to avoid it next time. If we’re in a team that deploys software early and often, we hone and improve the deployment process each time we practice it. However, not all practice makes perfect. To develop true expertise requires a particular type of practice, deliberate practice, the only goal of which is to make us better programmers. Everyday software development has other constraints and goals, not least the pressure to deliver. We rarely get the chance in the course of a “sprint” to experiment with potential solutions that are outside our current comfort zone. However, if we believe that software is a craft, then it’s our duty to strive continuously to raise the standard of software development. This requires specific and sustained efforts to get better at something we currently can’t do well (from Harvard Business Review, July/August 2007). One interesting way to introduce deliberate practice, in a sustainable way, is the code kata. The term kata derives from martial arts and refers to a set of movements practiced either solo or in pairs. One of the better-known examples is the Bowling Game kata by Bob Martin, the goal of which is simply to write some code to do the scoring for 10-pin bowling. It sounds too easy, right? What could we possibly learn from such a simple example? Trust me, though, that it’s not as simple as five minutes of typing and a solution. Of course, we can reach a solution in a short time, but the important thing about code katas is that we explore each technique fully and in a controlled way. We tackle the same problem multiple times, using different techniques and making different decisions, understanding the ramifications of each one, and exploring edge cases. The short feedback loop optimizes opportunities to learn. Another good example is Conway’s Game of Life. It’s a simple problem to solve, but try solving it in a functional style. If you’re used to mutability, solving the problem without mutating state will push you outside of your comfort zone. Similarly, if you try to solve it with the focus of “tell-don’t-ask”, how will the responsibilities of each object change? As software engineers, we don’t get enough opportunities to explore new ideas. In the middle of a development cycle, we can’t suddenly start experimenting on the team’s code base. Code katas offer an opportunity to explore new techniques in a safe environment. If you’re still skeptical, my challenge to you is simply to try it out. Convince a willing colleague to pair with you and work through a kata or two. It only takes an hour, and I’m willing to bet you’ll learn a few new things each time. The next step is to make it a sustainable team practice. Start with an hour every Friday afternoon (after all, who wants to commit code to production just before they leave for the weekend?) for a month and see how that works out. Finally, consider signing up for the Global Day of Code Retreat. It’s like a daylong code kata; it’s on December 8th, and there’s probably an event in your area!
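    To give a sense of what the exercise converges on, here is a minimal C# sketch of one possible end state of the Bowling Game kata, along the lines of the canonical solution usually shown for it. The class and member names are illustrative rather than taken from the article, and the real value of the kata is the path you take to get here, test by test and decision by decision, not this particular listing.

    // Minimal sketch of a 10-pin bowling scorer (illustrative names, assumed design).
    public class BowlingGame
    {
        private readonly int[] rolls = new int[21]; // at most 21 rolls in a game
        private int currentRoll;

        public void Roll(int pins)
        {
            rolls[currentRoll++] = pins;
        }

        public int Score()
        {
            int score = 0;
            int frameIndex = 0;
            for (int frame = 0; frame < 10; frame++)
            {
                if (rolls[frameIndex] == 10) // strike: 10 plus the next two rolls
                {
                    score += 10 + rolls[frameIndex + 1] + rolls[frameIndex + 2];
                    frameIndex += 1;
                }
                else if (rolls[frameIndex] + rolls[frameIndex + 1] == 10) // spare: 10 plus the next roll
                {
                    score += 10 + rolls[frameIndex + 2];
                    frameIndex += 2;
                }
                else
                {
                    score += rolls[frameIndex] + rolls[frameIndex + 1];
                    frameIndex += 2;
                }
            }
            return score;
        }
    }

    Trying to reach something like this through several different routes (test-first, object-oriented, functional) is exactly the kind of controlled repetition the kata is meant to provide.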

    Read the article

  • Know your Data Lineage

    - by Simon Elliston Ball
    An academic paper without the footnotes isn’t an academic paper. Journalists wouldn’t base a news article on facts that they can’t verify. So why would anyone publish reports without being able to say where the data has come from and be confident of its quality, in other words, without knowing its lineage (sometimes referred to as ‘provenance’ or ‘pedigree’)? The number and variety of data sources, both traditional and new, increases inexorably. Data comes clean or dirty, processed or raw, unimpeachable or entirely fabricated. On its journey to our report, from its source, the data can travel through a network of interconnected pipes, passing through numerous distinct systems, each managed by different people. At each point along the pipeline, it can be changed, filtered, aggregated and combined. When the data finally emerges, how can we be sure that it is right? How can we be certain that no part of the data collection was based on incorrect assumptions, that key data points haven’t been left out, or that the sources are good? Even when we’re using data science to give us an approximate or probable answer, we cannot have any confidence in the results without confidence in the data from which they came. You need to know what has been done to your data, where it came from, and who is responsible for each stage of the analysis. This information represents your data lineage; it is your stack-trace. If you’re an analyst, suspicious of a number, it tells you why the number is there and how it got there. If you’re a developer, working on a pipeline, it provides the context you need to track down the bug. If you’re a manager, or an auditor, it lets you know the right things are being done. Lineage tracking is part of good data governance. Most audit and lineage systems require you to buy into their whole structure. If you are using Hadoop for your data storage and processing, then tools like Falcon allow you to track lineage, as long as you are using Falcon to write and run the pipeline. It can mean learning a new way of running your jobs (or using some sort of proxy), and even a distinct way of writing your queries. Other Hadoop tools provide a lot of operational and audit information, spread throughout the many logs produced by Hive, Sqoop, MapReduce and all the various moving parts that make up the eco-system. To get a full picture of what’s going on in your Hadoop system you need to capture both Falcon lineage and the data-exhaust of other tools that Falcon can’t orchestrate. However, the problem is bigger even than that. Often, Hadoop is just one piece in a larger processing workflow. The next step of the challenge is how you bind together the lineage metadata describing what happened before and after Hadoop, where ‘after’ could be a data analysis environment like R, an application, or even directly into an end-user tool such as Tableau or Excel. One possibility is to push as much as you can of your key analytics into Hadoop, but would you give up the power and familiarity of your existing tools in return for a reliable way of tracking lineage? Lineage and auditing should work consistently, automatically and quietly, allowing users to access their data with whatever tool they choose to use.
    The real solution, therefore, is to create a consistent method by which to bring lineage data from these various disparate sources into the data analysis platform that you use, rather than being forced to use the tool that manages the pipeline for the lineage and a different tool for the data analysis. The key is to keep your logs and your audit data from every source, bring them together, and use the data analysis tools to trace the paths from raw data to the answer that the analysis provides.
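    As a very rough illustration of what “bringing the lineage together” might mean in practice, here is a small C# sketch of a consolidated lineage record and a trace over it. Nothing here comes from the article: the type, the property names and the simple parent-pointer walk are assumptions, and a real system would need to cope with cycles, fan-in from multiple sources and far richer metadata.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // One row of consolidated lineage, regardless of whether it originated in
    // Falcon, a Hive log, an R script or a BI tool's audit trail.
    public class LineageEvent
    {
        public string Dataset { get; set; }    // the dataset or report this event produced
        public string Source { get; set; }     // the upstream dataset it was derived from
        public string System { get; set; }     // "Falcon", "Hive", "Sqoop", "R", "Tableau", ...
        public string Operation { get; set; }  // "filter", "aggregate", "join", ...
        public DateTime Timestamp { get; set; }
    }

    public static class LineageTrace
    {
        // Walk back from a published number to its raw sources - the "stack-trace" for data.
        public static IEnumerable<LineageEvent> TraceBack(string dataset, ILookup<string, LineageEvent> byOutput)
        {
            foreach (var step in byOutput[dataset])
            {
                yield return step;
                foreach (var upstream in TraceBack(step.Source, byOutput))
                    yield return upstream;
            }
        }
    }

    Given a list of such events pulled from every log you keep, a call like LineageTrace.TraceBack("quarterly_report", events.ToLookup(e => e.Dataset)) would list every recorded step between the report and its raw inputs, which is exactly the question an analyst suspicious of a number wants answered.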

    Read the article

  • Finding Leaders Breakfasts - Adelaide and Perth

    - by rdatson-Oracle
    HR Executives Breakfast Roundtables: Find the best leaders using science and social media! Perth, 22nd July & Adelaide, 24th July. What is leadership in the 21st century? What does the latest research tell us about leadership? How do you recognise leadership qualities in individuals? How do you find individuals with these leadership qualities, hire and develop them? Join the NeuroLeadership Institute, the Hay Group, and Oracle to hear: 1. the latest neuroscience research about human bias, and how it applies to finding and building better leaders; 2. the latest techniques to recognise leadership qualities in people; 3. how you can harness your people and social media to find the best people for your company. Reflect on your hiring practices at this thought-provoking breakfast, where you will be challenged to consider whether you are using best practices aimed at getting the right people into your company. Speakers: Abigail Scott, Hay Group. Abigail is a UK-registered psychologist with 10 years’ international experience in the design and delivery of talent frameworks and assessments. She has delivered innovative assessment programmes across a range of organisations to identify and develop leaders. She is experienced in advising and supporting clients through new initiatives using an evidence-based approach and has published a number of research papers on fairness and predictive validity in assessment. Karin Hawkins, NeuroLeadership Institute. Karin is the Regional Director of the NeuroLeadership Institute’s Asia-Pacific region. She brings over 20 years’ experience in the financial services sector delivering cultural and commercial results across a variety of organisations and functions. As a leadership risk specialist, Karin understands the challenge of building deep bench strength in teams, and she is able to bring evidence, insight, and experience to support executives in meeting today’s challenges. Robert Datson, Oracle. Robert is a Human Capital Management specialist at Oracle, with several years as a practicing manager at IBM, learning and implementing the latest management techniques for hiring, deploying and developing staff. At Oracle he works with clients to enable best practices for HR departments and to draw the linkages between HR initiatives and bottom-line improvements. Agenda: 07:30 a.m. Breakfast and Registrations; 08:00 a.m. Welcome and Introductions; 08:05 a.m. Breaking Bias in leadership decisions - Karin Hawkins; 08:30 a.m. Identifying and developing leaders - Abigail Scott; 08:55 a.m. Finding leaders, the social way - Robert Datson; 09:20 a.m. Q&A and Closing Remarks; 09:30 a.m. Event concludes. If you are an employee or official of a government organisation, please click here for important ethics information regarding this event.
    To register for Perth, Tuesday 22nd July, please click HERE. To register for Adelaide, Thursday 24th July, please click HERE. Contact: To register, or if you have questions on the event, contact Aaron Tait on +61 2 9491 1404.

    Read the article

  • Minimum team development sizes

    - by MarkPearl
    Disclaimer - these are observations that I have had; I am not sure if this follows the philosophy of scrum, agile or whatever, but most of these insights were gained while implementing a scrum scenario. Two is a partnership, three starts a team. For a while I thought that a team was anything more than one and that scrum could be an effective methodology with even two people. I have recently adjusted my thinking to a scrum team being a minimum of three, so what happened to two and what do you call it? For me, I consider a group of two people working together a partnership - there is value in having a partnership, but some of the dynamics and value that you get from having a team is lost with a partnership. Avoidance of a one-on-one confrontation. The first dynamic I see missing in a partnership is the team motivation to do better and how this is delivered to individuals that are not performing. Take two highly motivated individuals and put them together and you will typically see them continue to perform. Now take a situation where you have two individuals, one performing and one not, and the behaviour is totally different compared to a team of three or more individuals. With two people, if one feels the other is not performing it becomes a one-on-one confrontation. Most people avoid confrontations and so nothing changes. Compare this to a situation where you have three people in a team, two performing and one not: the dynamic is totally different. It is no longer a personal one-on-one confrontation but a team concern, and people seem more willing to encourage the individual not performing and express their dissatisfaction as a team if they do not improve. Avoiding the effects of Tuckman’s Group Development Theory. If you are not familiar with Tuckman’s group development theory, give it a read (http://en.wikipedia.org/wiki/Tuckman's_stages_of_group_development). In a nutshell, with Tuckman’s theory teams go through the stages of Forming, Storming, Norming & Performing. You want your team to reach and remain in the Performing stage for as long as possible - this is where you get the most value. When you have a partnership of two and you change the individuals in the partnership, you basically do a hard reset on the partnership and go back to the beginning of Tuckman’s model each time. This has a major effect on the performance of a team and what they can deliver. What I have seen is that you reduce the effects of Tuckman's theory the more individuals you have in the team (until you hit the maximum team size, at which point other problems kick in). While you will still experience Tuckman's theory with a team of three, the impact will be greatly reduced compared to two, where it is guaranteed every time a change occurs. It's not just in the numbers, it's in the people. One final comment - while the actual numbers of a team do play a role, the individuals in the team are even more important - ideally you want to keep individuals working together for an extended period. That doesn't mean that you never change the individuals in a team, or that once someone joins a team they are stuck there - there is value in an individual moving from team to team and getting cross-pollination, but the period of time after which an individual moves should be in months or years, not days or weeks. Why? So why is it important to know this? Why is it important to know how a team works and what motivates them?
I have been asking myself this question for a while and where I am at right now is this… the aim is to achieve the stage where the sum of the total (team) is greater than the sum of the parts (team members). This is why we form teams and why understanding how they work is a challenge and also extremely stimulating.

    Read the article

  • Managing Operational Risk of Financial Services Processes – part 2/2

    - by Sanjeev Sharma
    In my earlier blog post, I had described the factors that lead to compliance complexity of financial services processes. In this post, I will outline the business implications of the increasing process compliance complexity and the specific role of BPM in addressing the operational risk reduction objectives of regulatory compliance. First, let’s look at the business implications of increasing complexity of process compliance for financial institutions:
    · Increased time and cost of compliance, due to duplication of effort in conforming to regulatory requirements as processes change in response to evolving regulatory mandates, shifting business priorities or internal/external audit requirements
    · Delays in audit reporting, due to quality issues in reconciling non-standard process KPIs and integrity concerns arising from the need to rely on multiple data sources for a given process
    Next, let’s consider some approaches to managing the operational risk of business processes. Financial institutions considering reducing operational risk of their processes, generally speaking, have two choices:
    · Rip-and-replace existing applications with new off-the-shelf applications.
    · Extend capabilities of existing applications by modeling their data and process interactions, with other applications or user-channels, outside of the application boundary using BPM.
    The benefit of the first approach is that compliance with new regulatory requirements would be embedded within the boundaries of these applications. However, pre-built compliance of any packaged or custom-built application should not be mistaken for a one-shot fix for future compliance needs. The reason is that business needs and regulatory requirements inevitably outgrow the end-to-end capabilities of even the most comprehensive packaged or custom-built business application. Thus, processes that originally resided within the application will eventually spill outside the application boundary. It is precisely at such hand-offs between applications, or between overlaying processes, that vulnerabilities to unknown and accidental faults arise, potentially resulting in errors and leading to partial or total failure. The gist of the above argument is that processes which reside outside application boundaries, in other words span multiple applications, constitute a latent operational risk that spans the end-to-end value chain. For instance, distortion of data flowing from an account-opening application to a credit-rating system, if left unchecked, renders compliance with “KYC” policies void even when the “KYC” checklist was enforced at the time of data capture by the account-opening application. Oracle Business Process Management is enabling financial institutions to lower the operational risk of such process “gaps” for Financial Services processes including “Customer On-boarding”, “Quote-to-Contract”, “Deposit/Loan Origination”, “Trade Exceptions”, “Interest Claim Tracking” etc.
If you are faced with a similar challenge and need any guidance on the same feel free to drop me a note.

    Read the article

  • Managing common code on Windows 7 (.NET) and Windows 8 (WinRT)

    - by ryanabr
    Recent announcements regarding Windows Phone 8, and the fact that it will have WinRT behind it, might make some of this less painful, but I discovered that the "XmlDocument" object is in a new location in WinRT and is almost the same as its brother in .NET:
    System.Xml.XmlDocument (.NET)
    Windows.Data.Xml.Dom.XmlDocument (WinRT)
    The problem I am trying to solve is how to work with both types in code that performs the same task on both the Windows Phone 7 and Windows 8 platforms. The first thing I did was define my own XmlNode and XmlNodeList classes that wrap the actual Microsoft objects, so that by using the "#if" compiler directive the calling code can easily work with either the WinRT version of the type or the .NET version.
    public class XmlNode
    {
    #if WIN8
        public Windows.Data.Xml.Dom.IXmlNode Node { get; set; }
        public XmlNode(Windows.Data.Xml.Dom.IXmlNode xmlNode)
        {
            Node = xmlNode;
        }
    #endif
    #if !WIN8
        public System.Xml.XmlNode Node { get; set; }
        public XmlNode(System.Xml.XmlNode xmlNode)
        {
            Node = xmlNode;
        }
    #endif
    }
    public class XmlNodeList
    {
    #if WIN8
        public Windows.Data.Xml.Dom.XmlNodeList List { get; set; }
        public int Count { get { return (int)List.Count; } }
        public XmlNodeList(Windows.Data.Xml.Dom.XmlNodeList list)
        {
            List = list;
        }
    #endif
    #if !WIN8
        public System.Xml.XmlNodeList List { get; set; }
        public int Count { get { return List.Count; } }
        public XmlNodeList(System.Xml.XmlNodeList list)
        {
            List = list;
        }
    #endif
    }
    From there I can then use my XmlNode and XmlNodeList in the calling code without having to clutter the code with all of the additional #if switches. The challenge after this was that the code that worked directly with the XmlDocument object needed to be separate on the two platforms, since the method for populating the XmlDocument object is completely different on each. To solve this issue, I made partial classes: one partial class for .NET and one for WinRT. Both projects have links to the partial class that contains the code that is the same for the majority of the class, and the platform-specific partial class contains the code that is unique to that version of the XmlDocument. The files with the little arrow in the lower left corner denote 'linked files' and are shared in multiple projects but only exist in one location in source control. You can see that the _Win7 partial class is included directly in the project since it includes code that is only for the .NET platform, whereas its cousin, the _Win8 partial class (not pictured above), has all of the code specific to the Win8 platform. In the _Win7 partial class is this code:
    public partial class WUndergroundViewModel
    {
        public static WUndergroundData GetWeatherData(double lat, double lng)
        {
            WUndergroundData data = new WUndergroundData();
            System.Net.WebClient c = new System.Net.WebClient();
            string req = "http://api.wunderground.com/api/xxx/yesterday/conditions/forecast/q/[LAT],[LNG].xml";
            req = req.Replace("[LAT]", lat.ToString());
            req = req.Replace("[LNG]", lng.ToString());
            XmlDocument doc = new XmlDocument();
            doc.Load(c.OpenRead(req));
            foreach (XmlNode item in doc.SelectNodes("/response/features/feature"))
            {
                switch (item.Node.InnerText)
                {
                    case "yesterday":
                        ParseForecast(new FishingControls.XmlNodeList(doc.SelectNodes("/response/forecast/txt_forecast/forecastdays/forecastday")),
                                      new FishingControls.XmlNodeList(doc.SelectNodes("/response/forecast/simpleforecast/forecastdays/forecastday")), data);
                        break;
                    case "conditions":
                        ParseCurrent(new FishingControls.XmlNode(doc.SelectSingleNode("/response/current_observation")), data);
                        break;
                    case "forecast":
                        ParseYesterday(new FishingControls.XmlNodeList(doc.SelectNodes("/response/history/observations/observation")), data);
                        break;
                }
            }
            return data;
        }
    }
    In the _Win8 partial class is this code:
    public partial class WUndergroundViewModel
    {
        public async static Task<WUndergroundData> GetWeatherData(double lat, double lng)
        {
            WUndergroundData data = new WUndergroundData();
            HttpClient c = new HttpClient();
            string req = "http://api.wunderground.com/api/xxxx/yesterday/conditions/forecast/q/[LAT],[LNG].xml";
            req = req.Replace("[LAT]", lat.ToString());
            req = req.Replace("[LNG]", lng.ToString());
            HttpResponseMessage msg = await c.GetAsync(req);
            string stream = await msg.Content.ReadAsStringAsync();
            XmlDocument doc = new XmlDocument();
            doc.LoadXml(stream, null);
            foreach (IXmlNode item in doc.SelectNodes("/response/features/feature"))
            {
                switch (item.InnerText)
                {
                    case "yesterday":
                        ParseForecast(new FishingControls.XmlNodeList(doc.SelectNodes("/response/forecast/txt_forecast/forecastdays/forecastday")),
                                      new FishingControls.XmlNodeList(doc.SelectNodes("/response/forecast/simpleforecast/forecastdays/forecastday")), data);
                        break;
                    case "conditions":
                        ParseCurrent(new FishingControls.XmlNode(doc.SelectSingleNode("/response/current_observation")), data);
                        break;
                    case "forecast":
                        ParseYesterday(new FishingControls.XmlNodeList(doc.SelectNodes("/response/history/observations/observation")), data);
                        break;
                }
            }
            return data;
        }
    }
    Summary: This method allows me to have common 'business' code for both platforms that is pretty clean, and I manage the technology differences separately. Thank you tostringtheory for your suggestion; I was considering that approach.

    Read the article

  • Moving abroad - Relocation advice

    - by Tim Koekkoek
    Oracle offers graduates from different European countries the opportunity to start their career abroad. Some already have experience with living abroad as they have done an exchange semester or internship in another country; for others it is the first time they will move abroad. Rui started in October 2011 as a Business Development Consultant in Dublin and moved from Portugal to Dublin, Ireland to start his career. For those planning to leave their home country and who desire to work abroad, he will share some tips and tricks in this article. When you’re faced with an opportunity like this, there are lots of things that will come to your mind. Sometimes it can be either very exciting, or even stressful. 1. First of all, try to relax. If you are certain you are moving abroad, all you need to do is some research about the country where you’re going to live: get to know its culture (gastronomy, important dates and events, its economy and effective ways to keep you in touch with your family and friends – such as mobile companies and Internet services), and start to understand what the best locations (with good access) for you to live in are. Don’t forget that initially you can be limited by transport and therefore it is important to explore the ideal places for you. During this time, Oracle provides everything you’ll need (papers, documents, etc.) to cross borders. 2. When you arrive, you understand that you are in a new country, in a new place, where all things (or most) are unknown to you. Before you panic, try to see it as a new challenge where new opportunities will come. Sometimes it’s not easy, I know, but the very best a new place has to give to you is the opportunity to understand a new culture, get to know other people, other ways of working, and grow both as a person and professionally. So, you have nothing to lose in this kind of experiment. 3. When you arrive at Oracle, there’s a fantastic team that will help you with settling in: HR, Payroll, Relocation, IT. In my case, Oracle helped me with the relocation; they supported me in arranging everything, such as helping out with all the paperwork and finding a new apartment. As you can see, they will do their best to help you to be successful! 4. Engage with your new co-workers. Going to a place where you don’t know anyone can be tough sometimes, but see it as an opportunity to meet people from all over the world and share experiences. Embrace it. 5. Plan ahead, try to get the most information possible and use it. Oracle is a multinational enterprise that will allow you to get to know a new labour market and give you the flexibility you need to understand your view of employment and occupation, giving you the very best opportunities to join different teams and working areas, so that you can work where you fit best. Good luck! If you’re thinking about starting a career abroad, read the following article: http://www.overseasdigest.com/movingtips.htm; it can be very useful to you. Interested in starting your career at Oracle like Rui has? Please have a look at https://campus.oracle.com for all of our latest vacancies.

    Read the article

  • With a little effort you can &ldquo;SEMI&rdquo;-protect your C# assemblies with obfuscation.

    - by mbcrump
    This method will not protect your assemblies from an experienced hacker. Every day we see new keygens, cracks and serials being released that contain ways around copy protection from small companies. This is a simple process that will make a lot of hackers quit because so many others use nothing. If you were a thief, would you pick the house that has security signs and an alarm, or one that has nothing? So, to begin: obfuscation is the concealment of meaning in communication, making it confusing and harder to interpret. Let's begin by looking at the cartoon below: You are probably familiar with the term and have probably ignored this, like most programmers ignore user security. Today, I’m going to show you reflection and a way to obfuscate it. Please understand that I am aware of ways around this, but I believe some security is better than no security. In the sample program below, the code appears exactly as it does in Visual Studio. When the program runs, you get either a true or false in a console window. Sample program:
    using System;
    using System.Diagnostics;
    using System.Linq;

    namespace ObfuscateMe
    {
        class Program
        {
            static void Main(string[] args)
            {
                Console.WriteLine(IsProcessOpen("notepad")); // Returns True or False depending on whether you have notepad running.
                Console.ReadLine();
            }

            public static bool IsProcessOpen(string name)
            {
                return Process.GetProcesses().Any(clsProcess => clsProcess.ProcessName.Contains(name));
            }
        }
    }
    Pretend that this is a commercial application. The hacker will only have the executable and maybe a few config files, etc. After reviewing the executable, he can determine if it was produced in .NET by examining the file in ILDASM or Red Gate’s Reflector. We are going to examine the file using Red Gate’s Reflector. Upon launch, we simply drag/drop the exe over to the application. We have the following for the Main method, and for the IsProcessOpen method. Without any other knowledge as to how this works, the hacker could export the exe to get a VS project build, or copy this code in, and our application would run. Using Reflector's output:
    using System;
    using System.Diagnostics;
    using System.Linq;

    namespace ObfuscateMe
    {
        class Program
        {
            static void Main(string[] args)
            {
                Console.WriteLine(IsProcessOpen("notepad"));
                Console.ReadLine();
            }

            public static bool IsProcessOpen(string name)
            {
                return Process.GetProcesses().Any<Process>(delegate(Process clsProcess)
                {
                    return clsProcess.ProcessName.Contains(name);
                });
            }
        }
    }
    The code is not identical, but it returns the same value. At this point, with a little bit of effort you could prevent the hacker from reverse engineering your code so quickly by using Eazfuscator.NET. Eazfuscator.NET is just one of many programs built for this. Visual Studio ships with a community version of Dotfuscator. So download and load Eazfuscator.NET and drag/drop your executable/project into the window. It will work for a few minutes, depending on whether you have a quad-core or not. After it finishes, open the executable in Red Gate Reflector and you will get the following: Main after obfuscation; IsProcessOpen method after obfuscation. As you can see from the jumbled characters, it is not as easy as the first example.
I am aware of methods around this, but it takes more effort and unless the hacker is up for the challenge, they will just pick another program. This is also helpful if you are a consultant and make clients pay a yearly license fee. This would prevent the average software developer from jumping into your security routine after you have left. I hope this article helped someone. If you have any feedback, please leave it in the comments below.

    Read the article

  • ReSharper 8.0 EAP now available

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2013/06/28/resharper-8.0-eap-now-available.aspx
    JetBrains have just released ReSharper 8.0 Beta on their Early Access Programme at http://www.jetbrains.com/resharper/whatsnew/?utm_source=resharper8b&utm_medium=newsletter&utm_campaign=resharper&utm_content=customers
    ReSharper 8.0 comes with the following new features:
    Support for Visual Studio 2013 Preview. Yes, ReSharper is known to work well with the fresh preview of Visual Studio 2013, and if you have already started digging into it, ReSharper 8.0 Beta is ready for the challenge.
    Faster code fixes. Thanks to the new Fix in Scope feature, you can choose to batch-fix some of the code issues that ReSharper detects in the scope of a project or the whole solution. Supported fixes include removing unused directives and redundant casts.
    Project dependency viewer. ReSharper is now able to visualize a project dependency graph for a bird's eye view of dependencies within your solution, all without compiling anything!
    Multifile templates. ReSharper's file templates can now be expanded to generate more than one file. For instance, this is handy for generating pairs of a main logic class and a class for extensions, or sets of partial files.
    Navigation improvements. These include a new action called Go to Everything to let you search for a file, type or method name from the same input box; support for line numbers in navigation actions; a new tool window called Assembly Explorer for browsing through assemblies; and two more contextual navigation actions: Navigate to Generic Substitutions and Navigate to Assembly Explorer.
    New solution-wide refactorings. The set of fresh refactorings is headlined by the highly requested Move Instance Method to move methods between classes without making them static. In addition, there are Inline Parameter and Pull Parameter. Last but not least, we're also introducing 4 new XAML-specific refactorings!
    Extraordinary XAML support. A plethora of new and improved functionality for all developers working with XAML code includes dedicated grid inspections and quick-fixes; Extract Style, Extract, Move and Inline Resource refactorings; atomic renaming of dependency properties; and a lot more.
    More accessible code completion. ReSharper 8 makes more of its IntelliSense magic available in automatic completion lists, including extension methods and an option to import a type. We're also introducing double completion, which gives you additional completion items when you press the corresponding shortcut for the second time.
    A new level of extensibility. With the new NuGet-based Extension Manager, discovering, installing and uninstalling ReSharper extensions becomes extremely easy in Visual Studio 2010 and higher. When we say extensions, we mean not only full-fledged plug-ins but also sets of templates or SSR patterns that can now be shared much more easily.
    CSS support improvements. Smarter usage search for CSS attributes, new CSS-specific code inspections, configurable support for CSS3 and earlier versions, compatibility checks against popular browsers - that's a rough outline of what's new for CSS in ReSharper 8.
    A command-line version of ReSharper. ReSharper 8 goes beyond Visual Studio: we now provide a free standalone tool with hundreds of ReSharper inspections and additionally a duplicate code finder that you can integrate with your CI server or version control system.
    Multiple minor improvements in areas such as decompiling and code formatting, as well as support for the Blue Theme introduced in Visual Studio 2012 Update 2.

    Read the article

  • New PeopleSoft HCM 9.1 On Demand Standard Edition provides a complete set of IT services at a low, predictable monthly cost

    - by Robbin Velayedam
    At Oracle Open World last month, Oracle announced that we are extending our On Demand offerings with the general availability of PeopleSoft On Demand Standard Edition. Standard Edition represents Oracle’s commitment to providing customers a choice of solutions, technology, and deployment options commensurate with their business needs and future growth. The Standard Edition offering complements the traditional On Demand offerings (Enterprise and Professional Editions) by focusing on a low, predictable monthly cost model that scales with the size of your business. As part of Oracle's open cloud strategy, customers can freely move PeopleSoft licensed applications between on-premise and the various On Demand options as business needs arise. In today’s business climate, aggressive and creative business objectives demand more of IT organizations. They are expected to provide technology-based solutions to streamline business processes, enable online collaboration and multi-tasking, facilitate data mining and storage, and enhance worker productivity. As IT budgets remain tight in a recovering economy, the challenge becomes how to meet these demands with limited time and resources. One way is to eliminate the variable costs of projects so that your team can focus on the high-priority functions and better predict funding and resource needs two to three years out. Variable costs and changing priorities can derail the best-laid project and capacity plans. The prime culprits of variable costs in any IT organization include disaster recovery, security breaches, technical support, and changes in business growth and priorities. Customers have an immediate need for solutions that are cheaper, predictable in cost, and flexible enough for long-term growth or capacity changes. The Standard Edition deployment option fulfills that need by allowing customers to take full advantage of the rich business functionality that is inherent to PeopleSoft HCM, while delegating all application management responsibility – such as future upgrades and product updates – to Oracle technology experts, at an affordable and expected price. Standard Edition provides the advantages of the secure Oracle On Demand hosted environment, the complete set of PeopleSoft HCM configurable business processes, and timely management of regular updates and enhancements to the application functionality and underlying technology. Standard Edition has a convenient monthly fee that is scalable by number of employees, which helps align the customer’s overall cost of ownership with its size and anticipated growth and business needs. In addition to providing PeopleSoft HCM applications' world-class business functionality and Oracle On Demand's embassy-grade security, Oracle’s hosted solution distinguishes itself from competitors by offering customers the ability to transition between different deployment and service models at any point in the application ownership lifecycle. As our customers’ business and economic climates change, they are free to transition their applications back to on-premise at any time. HCM On Demand Standard Edition is based on configurability options rather than customizations, requiring no additional code to develop or maintain. This keeps the cost of ownership low and time to production less than a month on average. Oracle On Demand offers the highest standard of security and performance by leveraging a state-of-the-art data center with dedicated databases, servers, and secured URLs, all within a private cloud.
Customers will not share databases, environments, platforms, or access portals with other customers because we value how mission critical your data are to your business. Oracle’s On Demand also provides a full breadth of disaster recovery services to provide customers the peace of mind that their data are secure and that backup operations are in place to keep their businesses up and running in the case of an emergency. Currently we have over 50 PeopleSoft customers delegating us with the management of their applications through Oracle On Demand. If you are a customer interested in learning more about the PeopleSoft HCM 9.1 Standard Edition and how it can help your organization minimize your variable IT costs and free up your resources to work on other business initiatives, contact Oracle or your Account Services Representative today.

    Read the article

  • Searching for context in Silverlight applications

    - by PeterTweed
    A common behavior in business applications that have developed through the ages is for a user to be able to get information or execute commands in relation to some information/function displayed, by right-clicking the object in question and popping up a context menu that offers relevant options to choose. The Silverlight Toolkit April 2010 release introduced the context menu object. This can be added to other UI objects and display options for the user to choose. The menu items can be enabled or disabled as per your application logic, and icons can be added to the menu items to add visual effect. This post will walk you through how to use the context menu object from the Silverlight Toolkit. Steps:
    1. Create a new Silverlight 4 application.
    2. Copy the following namespace definition to the user control object of the MainPage.xaml file:
    xmlns:my="clr-namespace:System.Windows.Controls;assembly=System.Windows.Controls.Input.Toolkit"
    3. Copy the following XAML into the LayoutRoot grid in MainPage.xaml:
    <Border CornerRadius="15" Background="Blue" Width="400" Height="100">
        <TextBlock Foreground="White" FontSize="20" Text="Context Menu In This Border...." HorizontalAlignment="Center" VerticalAlignment="Center">
        </TextBlock>
        <my:ContextMenuService.ContextMenu>
            <my:ContextMenu>
                <my:MenuItem Header="Copy" Click="CopyMenuItem_Click" Name="copyMenuItem">
                    <my:MenuItem.Icon>
                        <Image Source="copy-icon-small.png"/>
                    </my:MenuItem.Icon>
                </my:MenuItem>
                <my:Separator/>
                <my:MenuItem Name="pasteMenuItem" Header="Paste" Click="PasteMenuItem_Click">
                    <my:MenuItem.Icon>
                        <Image Source="paste-icon-small.png"/>
                    </my:MenuItem.Icon>
                </my:MenuItem>
            </my:ContextMenu>
        </my:ContextMenuService.ContextMenu>
    </Border>
    The above code associates a context menu, with two menu items and a separator between them, to the border object. The menu items have icons associated with them to add visual appeal. The menu items have click event handlers that will be added in the MainPage.xaml.cs code behind in a later step.
    4. Add two icon-sized images to the ClientBin directory of the web project hosting the Silverlight application, named copy-icon-small.png and paste-icon-small.png respectively. I used copy and paste icons as the names suggest.
    5. Add the following code to the class in the MainPage.xaml.cs file:
    private void CopyMenuItem_Click(object sender, RoutedEventArgs e)
    {
        MessageBox.Show("Copy selected");
    }

    private void PasteMenuItem_Click(object sender, RoutedEventArgs e)
    {
        MessageBox.Show("Paste selected");
    }
    This code adds the event handlers for the menu items defined in step 3.
    6. Run the application, right-click on the border, select a menu option and see the appropriate message box displayed. Congratulations, it’s that easy!   Take the Slalom Challenge at www.slalomchallenge.com!

    Read the article

  • Thinking Local, Regional and Global

    - by Apeksha Singh-Oracle
    The FIFA World Cup tournament is the biggest single-sport competition: it’s watched by about 1 billion people around the world. Every four years each national team’s manager is challenged to pull together a group of players who ply their trade across the globe. For example, of the 23 members of Brazil’s national team, only four actually play for Brazilian teams, and the rest play in England, France, Germany, Spain, Italy and Ukraine. Each country’s national league, each team and each coach has a unique style. Getting all these “localized” players to work together successfully as one unit is no easy feat. In addition to $35 million in prize money, much is at stake – not least national pride and global bragging rights until the next World Cup in four years’ time. Achieving economic integration in the ASEAN region by 2015 is a bit like trying to create the next World Cup champion by 2018. The team comprises Brunei Darussalam, Cambodia, Indonesia, Lao PDR, Malaysia, Myanmar, Philippines, Singapore, Thailand and Vietnam. All have different languages, currencies, cultures and customs, rules and regulations. But if they can pull together as one unit, the opportunity is not only great for business and the economy, but it’s also a source of regional pride. BCG expects by 2020 the number of firms headquartered in Asia with revenue exceeding $1 billion will double to more than 5,000. Their trade in the region and with the world is forecast to increase to 37% of an estimated $37 trillion of global commerce by 2020 from 30% in 2010. Banks offering transactional banking services to the emerging marketplace need to prepare to respond to customer needs across the spectrum – MSMEs, SMEs, corporates and multinational corporations. Customers want innovative, differentiated, value-added products and services that provide:
    • Pan-regional operational independence while enabling a single source of truth at a regional level
    • Regional connectivity and cash & liquidity optimization
    • A consistent experience for their customers, by offering standardized products & services across all ASEAN countries
    • Multi-channel & self-service capabilities / access to real-time information on liquidity and cash flows
    • Convergence of cash management with supply chain and trade finance
    While enabling the above to meet customer demands, a comprehensive and robust credit management solution is a must for effective regional banking operations and for managing risk. According to BCG, Asia-Pacific wholesale transaction-banking revenues are expected to triple to $139 billion by 2022 from $46 billion in 2012. To take advantage of the trend, banks will have to manage and maximize their own growth opportunities, compete on a broader scale, manage the complexity within the region and increase efficiency. They’ll also have to choose the right operating model and regional IT platform to offer:
    • Account Services
    • Cash & Liquidity Management
    • Trade Services & Supply Chain Financing
    • Payments
    • Securities services
    • Credit and Lending
    • Treasury services
    The core platform should be able to balance global needs and local nuances. Certain functions need to be performed at a regional level, while others need to be performed on a country level. Financial reporting and regulatory compliance are a case in point. The ASEAN Economic Community is in the final lap of its preparations for the ultimate challenge: becoming a formidable team in the global league.
    Meanwhile, transaction banks are designing their own hat trick: implementing a world-class IT platform, positioning themselves to respond to customer needs, and establishing a foundation for revenue generation for years to come. Anand Ramachandran, Senior Director, Global Banking Solutions Practice, Oracle Financial Services Global Business Unit

    Read the article

  • Modern Best Practice in the Cloud with Oracle Accelerate for Midsize Companies

    - by Richard Lefebvre
    See how a modern approach to best practice, enabled by innovative new technologies, can help you increase business agility and accelerate profitable growth: as the only vendor able to deliver Modern Best Practice, Oracle has a new additional competitive advantage. Here's a simple guide to help you challenge your prospects with this innovative approach. Modern Best Practice Explained: Download now! Discover why disruptive technologies are changing the face of business best practice—and how you can harness modern best practice to achieve more than ever before.

    Read the article

  • Using Content Analytics for More Effective Engagement

    - by Kellsey Ruppel
    Using Content Analytics for More Effective Engagement: Turning High-Volume Content into Templates for Success. By Mitchell Palski, Oracle WebCenter Sales Consultant. Many organizations use Oracle WebCenter Portal to develop these basic types of portals:
    Intranet portals used for collaboration, employee self-service, and company communication
    Extranet portals used by customers and partners for self-service and support
    Team collaboration portals that allow users to share documents and content, track activity, and engage in discussions
    Portals are intended to provide a personalized, single point of interaction with web-based applications and information. The user experiences that a Portal is capable of displaying should be relevant to an individual user or class of users (a group or role). The components of a Portal that would vary based on a user’s identity include:
    Web content such as images, news articles, and on-screen instruction
    Social tools such as threaded discussions, polls/surveys, and blogs
    Document management tools to upload, download, and edit files
    Web applications that present data visualizations and data entry modules
    These collections of content, tools, and applications make up valuable workspaces. The challenge that a development team may have is defining which combinations are the most effective for its users. No one wants to create and manage a workspace that goes un-used or (even worse) that is used but is ineffective. Oracle WebCenter Portal provides you with the capabilities to not only rapidly develop variations of portals, but also identify which portals are the most effective and should be re-used throughout an enterprise.
    Capturing Portal Analytics: Oracle WebCenter Portal provides an analytics service that allows administrators and business users to track and analyze portal usage. These analytics are captured in the form of usage tracking metrics, behavior tracking, and user profile correlation. The out-of-the-box task reports that come with Oracle WebCenter Portal include: WebCenter Portal Traffic, Page Traffic, Login Metrics, Portlet Traffic, Portlet Response Time, Portlet Instance Traffic, Portlet Instance Response Time, Search Metrics, Document Metrics, Wiki Metrics, Blog Metrics, Discussion Metrics, Portal Traffic, and Portal Response Time. By determining the usage and behavior tracking metrics that are associated with specific user profiles (including groups and roles), your administrators will be able to identify the components of your solution that are the most valuable. Your first step as an administrator should be to identify the specific pages and/or components that are used the most frequently. Next, determine the user(s) or user-group(s) that are accessing those high-use elements of a portal. It is also important to determine patterns in high usage and see if they correlate to a specific schedule. One of the goals of any development team (especially those that are following Agile methodologies) should be to develop reusable web components to minimize redundant development. Oracle WebCenter Portal provides you the tools to capture the successful workspaces that have already been developed and identified so that they can be reused for similar user demographics.
    Re-using Successful Portals: When creating a new Portal in Oracle WebCenter, developers have the option to base that portal on a template that includes:
    Pre-seeded data such as pages, tools, user roles, and look-and-feel assets
    Specific sub-sets of page-layouts, tools, and other resources to standardize what is added to a Portal’s pages
    Any custom components that your team creates during development cycles
    Once you have identified a successful workspace and its most valuable components, leverage Oracle WebCenter’s ability to turn that custom portal into a portal template. By creating a template from your already successful portal, you are empowering your enterprise by providing a starting point for future initiatives. Your new projects, new teams, and new web pages can benefit from lessons learned and adjustments that have already been made to optimize user experiences instead of starting from scratch. ***For a complete explanation of how to work with Portal Templates, be sure to read the Fusion Middleware documentation available online.

    Read the article

  • How'd they do it: Millions of tiles in Terraria

    - by William 'MindWorX' Mariager
    I've been working up a game engine similar to Terraria, mostly as a challenge, and while I've figured out most of it, I can't really seem to wrap my head around how they handle the millions of interactable/harvestable tiles the game has at one time. Creating around 500,000 tiles, that is 1/20th of what's possible in Terraria, in my engine causes the frame-rate to drop from 60 to around 20, even though I'm still only rendering the tiles in view. Mind you, I'm not doing anything with the tiles, only keeping them in memory. Update: Code added to show how I do things. This is part of a class which handles the tiles and draws them. I'm guessing the culprit is the "foreach" part, which iterates everything, even empty indexes.
    ...
    public void Draw(SpriteBatch spriteBatch, GameTime gameTime)
    {
        foreach (Tile tile in this.Tiles)
        {
            if (tile != null)
            {
                if (tile.Position.X < -this.Offset.X + 32) continue;
                if (tile.Position.X > -this.Offset.X + 1024 - 48) continue;
                if (tile.Position.Y < -this.Offset.Y + 32) continue;
                if (tile.Position.Y > -this.Offset.Y + 768 - 48) continue;
                tile.Draw(spriteBatch, gameTime);
            }
        }
    }
    ...
    Also here is the Tile.Draw method, which could also do with an update, as each Tile uses four calls to the SpriteBatch.Draw method. This is part of my autotiling system, which means drawing each corner depending on neighboring tiles. texture_* are Rectangles and are set once at level creation, not each update.
    ...
    public virtual void Draw(SpriteBatch spriteBatch, GameTime gameTime)
    {
        if (this.type == TileType.TileSet)
        {
            spriteBatch.Draw(this.texture, this.realm.Offset + this.Position, texture_tl, this.BlendColor);
            spriteBatch.Draw(this.texture, this.realm.Offset + this.Position + new Vector2(8, 0), texture_tr, this.BlendColor);
            spriteBatch.Draw(this.texture, this.realm.Offset + this.Position + new Vector2(0, 8), texture_bl, this.BlendColor);
            spriteBatch.Draw(this.texture, this.realm.Offset + this.Position + new Vector2(8, 8), texture_br, this.BlendColor);
        }
    }
    ...
    Any critique or suggestions to my code is welcome. Update: Solution added. Here's the final Level.Draw method. The Level.TileAt method simply checks the inputted values, to avoid OutOfRange exceptions.
    ...
    public void Draw(SpriteBatch spriteBatch, GameTime gameTime)
    {
        Int32 startx = (Int32)Math.Floor((-this.Offset.X - 32) / 16);
        Int32 endx = (Int32)Math.Ceiling((-this.Offset.X + 1024 + 32) / 16);
        Int32 starty = (Int32)Math.Floor((-this.Offset.Y - 32) / 16);
        Int32 endy = (Int32)Math.Ceiling((-this.Offset.Y + 768 + 32) / 16);
        for (Int32 x = startx; x < endx; x += 1)
        {
            for (Int32 y = starty; y < endy; y += 1)
            {
                Tile tile = this.TileAt(x, y);
                if (tile != null)
                    tile.Draw(spriteBatch, gameTime);
            }
        }
    }
    ...

    Read the article

  • Career-Defining Moments

    - by Robz / Fervent Coder
    Originally posted on: http://geekswithblogs.net/robz/archive/2013/06/25/career-defining-moments.aspx Fear holds us back from many things. A little fear is healthy, but don’t let it overwhelm you into missing opportunities. In every career there is a moment when you can either step forward and define yourself, or sit down and regret it later. Why do we hold back: is it fear, constraints, family concerns, or that we simply can't do it? I think in many cases it comes down to the unknown, and we are good at fearing the unknown. Some people hold back because they are fearful of what they don’t know. Some hold back because they are fearful of learning new things. Some hold back simply because taking on a new challenge means they have to give something else up. The phrase sometimes used is “It’s the devil you know versus the one you don’t.” That fear sometimes allows us to miss great opportunities. In many people’s case it is the opportunity to go into business for yourself, to start something that never existed. Most hold back here for fear of failing. We’ve all heard the phrase “What would you do if you knew you couldn’t fail?”, which is intended to get people to think about the opportunities they might create. A better framing I heard recently on the Ruby Rogues podcast was “What would be worth doing even if you knew you were going to fail?” I think that wording suits the intent better. If you knew (or thought) going in that you were going to fail and you didn’t care, it would open you up to the possibility of paying more attention to the journey and not the outcome. In my case it is a fear of acceptance. I am fearful that I may not learn what I need to learn or may not do a good enough job to be accepted. At the same time that fear drives me and makes me want to leap forward. Some folks would define this as “The Flinch”. I’m learning Ruby and Puppet right now. I have limited experience with both, limited to the degree that it scares me some that I don’t know much about either. Okay, it scares me quite a bit! Some people’s defining moment might be going to work for Microsoft. All of you who know me know that I am in love with automation, from low-tech to high-tech automation. So for me, my “mecca” is a little different in that regard. A while back I sat down and defined where I wanted my career to go, and it had to do more with DevOps, defined as applying developer practices to system administration operations (I could not find this definition when I searched). It’s an area that interests me and why I really want to expand chocolatey into something more awesome. I want to see Windows be as automatable and awesome as the other operating systems that are out there. Back to the career-defining moment. Sometimes these moments only come once in a lifetime. The key is to recognize when you are in one of these moments and step back to evaluate it before choosing to dive in head first. So I am about to embark on what I define as one of these “moments.” On July 1st I will be joining Puppet Labs and working to help make the Windows automation experience rock solid! I’m both scared and excited about the opportunity!

    Read the article

  • Simple Navigation In Windows Phone 7

    - by PeterTweed
    Take the Slalom Challenge at www.slalomchallenge.com! When moving to the mobile platform all applications need to be able to provide different views.  Navigating around views in Windows Phone 7 is a very easy thing to do.  This post will introduce you to the simplest technique for navigation in Windows Phone 7 apps. Steps: 1.     Create a new Windows Phone Application project. 2.     In the MainPage.xaml file copy the following xaml into the ContentGrid Grid:             <StackPanel Orientation="Vertical" VerticalAlignment="Center"  >                 <TextBox Name="ValueTextBox" Width="200" ></TextBox>                 <Button Width="200" Height="30" Content="Next Page" Click="Button_Click"></Button>             </StackPanel> This gives a text box for the user to enter text and a button to navigate to the next page. 3.     Copy the following event handler code to the MainPage.xaml.cs file:         private void Button_Click(object sender, RoutedEventArgs e)         {             NavigationService.Navigate(new Uri( string.Format("/SecondPage.xaml?val={0}", ValueTextBox.Text), UriKind.Relative));         }   The event handler uses the NavigationService.Navigate() function.  This is what makes the navigation to another page happen.  The function takes a Uri parameter with the name of the page to navigate to and the indication that it is a relative Uri to the current page.  Note also the querystring is formatted with the value entered in the ValueTextBox control – in a similar manner to a standard web querystring. 4.     Add a new Windows Phone Portrait Page to the project named SecondPage.xaml. 5.     Paste the following XAML in the ContentGrid Grid in SecondPage.xaml:             <Button Name="GoBackButton" Width="200" Height="30" Content="Go Back" Click="Button_Click"></Button>   This provides a button to navigate back to the first page. 6.     Copy the following event handler code to the SecondPage.xaml.cs file:         private void Button_Click(object sender, RoutedEventArgs e)         {             NavigationService.GoBack();         } This tells the application to go back to the previously displayed page. 7.     Add the following code to the constructor in SecondPage.xaml.cs:             this.Loaded += new RoutedEventHandler(SecondPage_Loaded); 8.     Add the following loaded event handler to the SecondPage.xaml.cs file:         void SecondPage_Loaded(object sender, RoutedEventArgs e)         {             if (NavigationContext.QueryString["val"].Length > 0)                 MessageBox.Show(NavigationContext.QueryString["val"], "Data Passed", MessageBoxButton.OK);             else                 MessageBox.Show("{Empty}!", "Data Passed", MessageBoxButton.OK);         }   This code pops up a message box displaying either the text entered on the first page or the message “{Empty}!” if no text was entered. 9.     Run the application, enter some text in the text box and click on the next page button to see the application in action:   Congratulations!  You have created a new Windows Phone 7 application with page navigation.

    Read the article

  • EPPM Is a Must-Have Capability as Global Energy and Power Industries Eye US$38 Trillion in New Investments

    - by Melissa Centurio Lopes
    “The process manufacturing industry is facing an unprecedented challenge: from now until 2035, cumulative worldwide investments of US$38 trillion will be required for drilling, power generation, and other energy projects,” Iain Graham, director of energy and process manufacturing for Oracle’s Primavera, said in a recent webcast. He adds that process manufacturing organizations such as oil and gas, utilities, and chemicals must manage this level of investment in an environment of constrained capital markets, erratic supply and demand, aging infrastructure, heightened regulations, and declining global skills. In the following interview, Graham explains how the right enterprise project portfolio management (EPPM) technology can help the industry meet these imperatives. Q: Why is EPPM so important for today’s process manufacturers? A: If the industry invests US$38 trillion without proper cost controls in place, a huge amount of resources will be put at risk, especially when it comes to cost overruns that may occur in large capital projects. Process manufacturing companies must not only control costs, but also monitor all the various contractors that will be involved in each project. If you’re not managing your own workers and all the interdependencies among the different contractors, then you’ve got problems. Q: What else should process manufacturers look for? A: It’s also important that an EPPM solution has the ability to manage more than just capital projects. For example, it’s best to manage maintenance and capital projects in the same system. Say you’re due to install a new transformer in a power station as part of a capital project, but routine maintenance in that area of the facility is scheduled for that morning. The lack of coordination could lead to unforeseen delays. There are also IT considerations that impact capital projects, such as adding servers and network cable for a control system in a power station. What organizations need is a true EPPM system that’s not just for capital projects, maintenance, or IT activities, but instead an enterprisewide solution that provides visibility into all types of projects. Read the complete Q&A here and discover the practical framework for successfully managing this massive capital spending.

    Read the article

  • A programmer who doesn't get to program - where to turn? [closed]

    - by Just an Anon
    I'm in my mid-20s, and have been working as a full-time programmer/developer for the last ~6 years, with several years of part-time freelancing before this, and three straight years of freelancing in the middle of this short career. I work mostly with PHP and the Drupal framework. By and large, I focus on programming custom pieces of functionality; these, of course, vary greatly from project to project. I've got years of solid experience with OOP (have done some Java & C# years ago, too), including intensive experience with front-end development, and even some design work. I've led small teams (2-4 people) of developers. And of course, given the large amount of freelancing, I've got decent project- & client-management skills. My problem is staying motivated at any place of employment. In the time mentioned I've worked (full-time) at six local companies. The longest I've stayed at any company was just over a year. I find that I'll get hired and be very excited and motivated for the first few months, but the work quickly gets "stale." By that I mean that the interesting components (i.e. the programming) get done, and the rest of the work turns into boring cleanup (move a button, add text, change colours, add a field). I don't get challenged, and I don't feel like I'm learning anything new. This happens time and time again, and I always end up leaving for either a new opportunity, or to freelance. I'm wondering if perhaps I've painted myself into a corner with this rather niche work market (although one with very high demand and good compensation) and need to explore other career choices. Another possibility is that I may be choosing the wrong places of employment, mostly small agencies, and need to look into working for a larger, more established firm. I find programming, writing code, and architecting solutions very rewarding. When I'm working on an interesting problem I lose all sense of time and 14-16 hours can fly by like minutes. I get the same exciting feeling when I'm doing high-level planning of a complex system, breaking up the work and figuring out how everything will tie together. I absolutely hate doing small, "stupid" changes that pose no challenge, yet they seem to make up more and more of my work. I want to find a workplace where I will get to work on challenging tasks and improve in all areas of product development. This may be a programming job, management, architecture of desktop apps, or maybe managing a taco stand on a beach in Mexico - I don't know, and I need some advice and real-world feedback. What are some job areas worth exploring? The requirements are fairly simple: working with computers; interacting with others; challenging work; decent pay (I'm making just short of 90k / year with a month of vacation & some benefits, and would like to stay in this range, but am willing to take a temporary cut in pay for a more interesting position). Any advice would be much appreciated!

    Read the article

  • Code Golf: Code 39 Bar Code

    - by gwell
    The challenge The shortest code by character count to draw an ASCII representation of a Code 39 bar code. Wikipedia article about Code 39: http://en.wikipedia.org/wiki/Code_39 Input The input will be a string of legal characters for Code 39 bar codes. This means 43 characters are valid: 0-9 A-Z (space) and -.$/+%. The * character will not appear in the input as it is used as the start and stop character. Output Each character encoded in a Code 39 bar code has nine elements: five bars and four spaces. Bars will be represented with # characters, and spaces will be represented with the space character. Three of the nine elements will be wide. The narrow elements will be one character wide, and the wide elements will be three characters wide. An inter-character space of a single space should be added between each character pattern. The pattern should be repeated so that the bar code is eight characters high. The start/stop character * (bWbwBwBwb) would be represented like this: # # ### ### # # # ### ### # # # ### ### # # # ### ### # # # ### ### # # # ### ### # # # ### ### # # # ### ### # ^ ^ ^^ ^ ^ ^ ^^^ | | || | | | ||| narrow bar -+ | || | | | ||| wide space ---+ || | | | ||| narrow bar -----+| | | | ||| narrow space ------+ | | | ||| wide bar --------+ | | ||| narrow space ----------+ | ||| wide bar ------------+ ||| narrow space --------------+|| narrow bar ---------------+| inter-character space ----------------+ The start and stop character * will need to be output at the start and end of the bar code. No quiet space needs to be included before or after the bar code. No check digit needs to be calculated. Full ASCII Code39 encoding is not required, just the standard 43 characters. No text needs to be printed below the ASCII bar code representation to identify the output contents. The character # can be replaced with another character of higher density if wanted. Using the full block character U+2588 would allow the bar code to actually scan when printed. A short, ungolfed reference sketch of this element-to-column mapping appears after the test cases below.
Test cases Input: ABC Output: # # ### ### # ### # # # ### # ### # # ### ### ### # # # # # ### ### # # # ### ### # ### # # # ### # ### # # ### ### ### # # # # # ### ### # # # ### ### # ### # # # ### # ### # # ### ### ### # # # # # ### ### # # # ### ### # ### # # # ### # ### # # ### ### ### # # # # # ### ### # # # ### ### # ### # # # ### # ### # # ### ### ### # # # # # ### ### # # # ### ### # ### # # # ### # ### # # ### ### ### # # # # # ### ### # # # ### ### # ### # # # ### # ### # # ### ### ### # # # # # ### ### # # # ### ### # ### # # # ### # ### # # ### ### ### # # # # # ### ### # Input: 1/3 Output: # # ### ### # ### # # # ### # # # # # ### ### # # # # # ### ### # # # ### ### # ### # # # ### # # # # # ### ### # # # # # ### ### # # # ### ### # ### # # # ### # # # # # ### ### # # # # # ### ### # # # ### ### # ### # # # ### # # # # # ### ### # # # # # ### ### # # # ### ### # ### # # # ### # # # # # ### ### # # # # # ### ### # # # ### ### # ### # # # ### # # # # # ### ### # # # # # ### ### # # # ### ### # ### # # # ### # # # # # ### ### # # # # # ### ### # # # ### ### # ### # # # ### # # # # # ### ### # # # # # ### ### # Input: - $ (minus space dollar) Output: # # ### ### # # # # ### ### # ### # ### # # # # # # # # ### ### # # # ### ### # # # # ### ### # ### # ### # # # # # # # # ### ### # # # ### ### # # # # ### ### # ### # ### # # # # # # # # ### ### # # # ### ### # # # # ### ### # ### # ### # # # # # # # # ### ### # # # ### ### # # # # ### ### # ### # ### # # # # # # # # ### ### # # # ### ### # # # # ### ### # ### # ### # # # # # # # # ### ### # # # ### ### # # # # ### ### # ### # ### # # # # # # # # ### ### # # # ### ### # # # # ### ### # ### # ### # # # # # # # # ### ### # Code count includes input/output (full program).
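    To make the element-to-column mapping concrete, here is a short, ungolfed Python sketch (not a competitive entry) that renders one character row from its narrow/wide pattern and repeats it for the eight-row height. The bWbwBwBwb pattern for * is taken from the description above; the helper name render_row is just illustrative.

        def render_row(pattern: str) -> str:
            # 'b'/'B' are narrow/wide bars, 'w'/'W' are narrow/wide spaces.
            # Narrow elements are one column wide, wide elements are three.
            pieces = []
            for element in pattern:
                width = 3 if element.isupper() else 1
                pieces.append(("#" if element.lower() == "b" else " ") * width)
            return "".join(pieces)

        STAR = "bWbwBwBwb"  # the start/stop character '*' from the description

        row = render_row(STAR)
        for _ in range(8):  # the bar code is eight character rows high
            print(row)

    A full entry would look up each input character's element pattern in the standard Code 39 table, join the rendered patterns with the one-column inter-character space, and wrap the whole thing in the start/stop pattern.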

    Read the article

  • Testing smart card minidriver

    - by user352792
    when testing smart card minidriver in windows 7, got the following errors: "cmck exec Reconnect" always show that Testing through CAPI calls Submitting CSP PIN for reader \.\DMWZ ESAFE 0\ CryptAcquireContext - CRYPT_NEWKEYSET CryptGenKey Reconnecting CryptAcquireContext - CRYPT_DELETEKEYSET CryptAcquireContext failed unexpectedly d:\5429t\testsrc\dstest\security\core\credentials\smartcard\cmck\cmck\fnreconnect.cpp Line: 264 WIN32 0x80090016 Keyset does not exist. in windows xp, it always passed. i have no idea! this is my log. in XP: /* P:608 T:3380 8-30-203 CardAcquireContext(): BEGIN /* P:608 T:3380 8-30-203 CardAcquireContext(): SUCCESS /* P:608 T:3380 8-30-203 CardAcquireContext(): BEGIN /* P:608 T:3380 8-30-203 CardAcquireContext(): SUCCESS /* P:608 T:3380 8-31-750 CardAcquireContext(): BEGIN /* P:608 T:3380 8-31-765 CardAcquireContext(): SUCCESS /* P:608 T:3380 8-31-765 CardDeleteContext(): BEGIN /* P:608 T:3380 8-31-765 CardDeleteContext(): SUCCESS /* P:608 T:3380 8-31-765 CardAcquireContext(): BEGIN /* P:608 T:3380 8-31-765 CardAcquireContext(): SUCCESS /* P:608 T:3380 8-31-765 CardDeleteContext(): BEGIN /* P:608 T:3380 8-31-781 CardDeleteContext(): SUCCESS /* P:608 T:3380 8-31-781 CardAcquireContext(): BEGIN /* P:608 T:3380 8-31-781 CardAcquireContext(): SUCCESS /* P:608 T:3380 8-31-781 CardGetChallenge(): BEGIN /* P:608 T:3380 CardGetChallenge(): Challenge = CE568537C1BC9318 / / P:608 T:3380 8-31-781 CardGetChallenge(): SUCCESS /* P:608 T:3380 8-31-796 CardAuthenticateChallenge(): BEGIN /* P:608 T:3380 CardAuthenticateChallenge(): Response = B99E85F50E1F5C29 / / P:608 T:3380 8-31-796 CardAuthenticateChallenge(): SUCCESS /* P:608 T:3380 8-31-812 CardDeauthenticate(): BEGIN /* P:608 T:3380 8-31-812 CardDeauthenticate(): SUCCESS /* P:608 T:3380 8-31-812 CardAuthenticatePin(): BEGIN /* P:608 T:3380 CardAuthenticatePin(): User PIN = 0000 / / P:608 T:3380 8-31-828 CardAuthenticatePin(): SUCCESS /* P:608 T:3380 8-31-828 CardDeauthenticate(): BEGIN /* P:608 T:3380 8-31-843 CardDeauthenticate(): SUCCESS /* P:608 T:3380 8-31-843 CardDeleteContext(): BEGIN /* P:608 T:3380 8-31-843 CardDeleteContext(): SUCCESS /* P:608 T:3380 8-31-859 CardAcquireContext(): BEGIN /* P:608 T:3380 8-31-859 CardAcquireContext(): SUCCESS /* P:608 T:3380 8-31-859 CardAuthenticatePin(): BEGIN /* P:608 T:3380 CardAuthenticatePin(): User PIN = 0000 / / P:608 T:3380 8-31-875 CardAuthenticatePin(): SUCCESS /* P:608 T:3380 8-31-875 CardQueryCapabilities(): BEGIN /* P:608 T:3380 8-31-875 CardQueryCapabilities(): SUCCESS /* P:608 T:3380 8-31-890 CardAuthenticatePin(): BEGIN /* P:608 T:3380 CardAuthenticatePin(): User PIN = 0000 / / P:608 T:3380 8-31-906 CardAuthenticatePin(): SUCCESS /* P:608 T:3380 8-31-906 CardDeauthenticate(): BEGIN /* P:608 T:3380 8-31-921 CardDeauthenticate(): SUCCESS /* P:608 T:3380 8-31-921 CardDeleteContext(): BEGIN /* P:608 T:3380 8-31-921 CardDeleteContext(): SUCCESS /* P:608 T:3380 8-32-0 CardAcquireContext(): BEGIN /* P:608 T:3380 8-32-0 CardAcquireContext(): SUCCESS /* P:608 T:3380 8-32-0 CardReadFile(): BEGIN /* P:608 T:3380 CardReadFile(): Dir Name = ROOT, File Name = cardid / / P:608 T:3380 CardReadFile(): cardid = 34646533393531342D643465662D3432 / / P:608 T:3380 8-32-46 CardReadFile(): SUCCESS /* P:608 T:3380 8-32-62 CardReadFile(): BEGIN /* P:608 T:3380 CardReadFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardReadFile(): cardcf = 000000000000 / / P:608 T:3380 8-32-109 CardReadFile(): SUCCESS /* P:608 T:3380 8-32-109 CardReadFile(): BEGIN /* P:608 T:3380 
CardReadFile(): Dir Name = mscp, File Name = cmapfile / / P:608 T:3380 8-32-187 CardReadFile(): BEGIN /* P:608 T:3380 CardReadFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardReadFile(): cardcf = 000000000000 / / P:608 T:3380 8-32-234 CardReadFile(): SUCCESS /* P:608 T:3380 8-32-250 CardAuthenticatePin(): BEGIN /* P:608 T:3380 CardAuthenticatePin(): User PIN = 0000 / / P:608 T:3380 8-32-265 CardAuthenticatePin(): SUCCESS /* P:608 T:3380 8-32-265 CardDeauthenticate(): BEGIN /* P:608 T:3380 8-32-281 CardDeauthenticate(): SUCCESS /* P:608 T:3380 8-32-281 CardReadFile(): BEGIN /* P:608 T:3380 CardReadFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardReadFile(): cardcf = 000000000000 / / P:608 T:3380 8-32-328 CardReadFile(): SUCCESS /* P:608 T:3380 8-32-343 CardQueryFreeSpace(): BEGIN /* P:608 T:3380 8-32-359 CardQueryFreeSpace(): SUCCESS /* P:608 T:3380 8-32-375 CardReadFile(): BEGIN /* P:608 T:3380 CardReadFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardReadFile(): cardcf = 000000000000 / / P:608 T:3380 8-32-421 CardReadFile(): SUCCESS /* P:608 T:3380 8-32-421 CardAuthenticatePin(): BEGIN /* P:608 T:3380 CardAuthenticatePin(): User PIN = 0000 / / P:608 T:3380 8-32-453 CardAuthenticatePin(): SUCCESS /* P:608 T:3380 8-32-453 CardWriteFile(): BEGIN /* P:608 T:3380 CardWriteFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardWriteFile(): cardcf = 000000000100 / / P:608 T:3380 8-32-531 CardWriteFile(): SUCCESS /* P:608 T:3380 8-32-531 CardWriteFile(): BEGIN /* P:608 T:3380 CardWriteFile(): Dir Name = mscp, File Name = cmapfile / / P:608 T:3380 CardWriteFile(): cmapfile = 660031006500300035003000300030002D0031003600380038002D0034006200380063002D0039006500300066002D003000310061006200300066006200340062003800660037000000000000000000010000000000 / / P:608 T:3380 8-32-921 CardWriteFile(): SUCCESS /* P:608 T:3380 8-32-921 CardWriteFile(): BEGIN /* P:608 T:3380 CardWriteFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardWriteFile(): cardcf = 000000000200 / / P:608 T:3380 8-33-0 CardWriteFile(): SUCCESS /* P:608 T:3380 8-33-0 CardWriteFile(): BEGIN /* P:608 T:3380 CardWriteFile(): Dir Name = mscp, File Name = cmapfile / / P:608 T:3380 CardWriteFile(): cmapfile = 660031006500300035003000300030002D0031003600380038002D0034006200380063002D0039006500300066002D003000310061006200300066006200340062003800660037000000000000000000030000000000 / / P:608 T:3380 8-33-109 CardWriteFile(): SUCCESS /* P:608 T:3380 8-33-125 CardQueryCapabilities(): BEGIN /* P:608 T:3380 8-33-125 CardQueryCapabilities(): SUCCESS /* P:608 T:3380 8-33-125 CardWriteFile(): BEGIN /* P:608 T:3380 CardWriteFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardWriteFile(): cardcf = 000001000200 / / P:608 T:3380 8-33-203 CardWriteFile(): SUCCESS /* P:608 T:3380 8-33-203 CardCreateContainer(): BEGIN /* P:608 T:3380 8-35-515 CardCreateContainer(): SUCCESS /* P:608 T:3380 8-35-531 CardWriteFile(): BEGIN /* P:608 T:3380 CardWriteFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardWriteFile(): cardcf = 000001000300 / / P:608 T:3380 8-35-609 CardWriteFile(): SUCCESS /* P:608 T:3380 8-35-609 CardWriteFile(): BEGIN /* P:608 T:3380 CardWriteFile(): Dir Name = mscp, File Name = cmapfile / / P:608 T:3380 CardWriteFile(): cmapfile = 660031006500300035003000300030002D0031003600380038002D0034006200380063002D0039006500300066002D003000310061006200300066006200340062003800660037000000000000000000030000040000 / / P:608 T:3380 8-35-734 CardWriteFile(): SUCCESS /* P:608 
T:3380 8-35-734 CardGetContainerInfo(): BEGIN /* P:608 T:3380 8-35-796 CardGetContainerInfo(): SUCCESS /* P:608 T:5764 8-37-296 CardDeauthenticate(): BEGIN /* P:608 T:5764 8-37-312 CardDeauthenticate(): SUCCESS /* P:608 T:3380 8-37-312 CardReadFile(): BEGIN /* P:608 T:3380 CardReadFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardReadFile(): cardcf = 000001000300 / / P:608 T:3380 8-37-375 CardReadFile(): SUCCESS /* P:608 T:3380 8-37-375 CardReadFile(): BEGIN /* P:608 T:3380 CardReadFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardReadFile(): cardcf = 000001000300 / / P:608 T:3380 8-37-437 CardReadFile(): SUCCESS /* P:608 T:3380 8-37-437 CardAuthenticatePin(): BEGIN /* P:608 T:3380 CardAuthenticatePin(): User PIN = 0000 / / P:608 T:3380 8-37-468 CardAuthenticatePin(): SUCCESS /* P:608 T:3380 8-37-484 CardWriteFile(): BEGIN /* P:608 T:3380 CardWriteFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardWriteFile(): cardcf = 000001000400 / / P:608 T:3380 8-37-546 CardWriteFile(): SUCCESS /* P:608 T:3380 8-37-562 CardDeleteFile(): BEGIN /* P:608 T:3380 CardDeleteFile(): Dir Name = mscp, File Name = ksc00 / / P:608 T:3380 8-37-625 CardDeleteFile(): SCARD_E_FILE_NOT_FOUND (0x80100024) /* P:608 T:3380 CardDeleteFile(): FAILED /* P:608 T:3380 8-37-625 CardReadFile(): BEGIN /* P:608 T:3380 CardReadFile(): Dir Name = mscp, File Name = cmapfile / / P:608 T:3380 CardReadFile(): cmapfile = 660031006500300035003000300030002D0031003600380038002D0034006200380063002D0039006500300066002D003000310061006200300066006200340062003800660037000000000000000000030000040000 / / P:608 T:3380 8-37-718 CardReadFile(): SUCCESS /* P:608 T:3380 8-37-718 CardWriteFile(): BEGIN /* P:608 T:3380 CardWriteFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardWriteFile(): cardcf = 000001000500 / / P:608 T:3380 8-37-796 CardWriteFile(): SUCCESS /* P:608 T:3380 8-37-796 CardDeleteFile(): BEGIN /* P:608 T:3380 CardDeleteFile(): Dir Name = mscp, File Name = kxc00 / / P:608 T:3380 8-37-875 CardDeleteFile(): SCARD_E_FILE_NOT_FOUND (0x80100024) /* P:608 T:3380 CardDeleteFile(): FAILED /* P:608 T:3380 8-37-875 CardWriteFile(): BEGIN /* P:608 T:3380 CardWriteFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardWriteFile(): cardcf = 000002000500 / / P:608 T:3380 8-37-953 CardWriteFile(): SUCCESS /* P:608 T:3380 8-37-953 CardDeleteContainer(): BEGIN /* P:608 T:3380 8-38-578 CardDeleteContainer(): SUCCESS /* P:608 T:3380 8-38-593 CardReadFile(): BEGIN /* P:608 T:3380 CardReadFile(): Dir Name = mscp, File Name = cmapfile / / P:608 T:3380 CardReadFile(): cmapfile = 660031006500300035003000300030002D0031003600380038002D0034006200380063002D0039006500300066002D003000310061006200300066006200340062003800660037000000000000000000030000040000 / / P:608 T:3380 8-38-687 CardReadFile(): SUCCESS /* P:608 T:3380 8-38-687 CardWriteFile(): BEGIN /* P:608 T:3380 CardWriteFile(): Dir Name = ROOT, File Name = cardcf / / P:608 T:3380 CardWriteFile(): cardcf = 000002000600 / / P:608 T:3380 8-38-781 CardWriteFile(): SUCCESS /* P:608 T:3380 8-38-781 CardWriteFile(): BEGIN /* P:608 T:3380 CardWriteFile(): Dir Name = mscp, File Name = cmapfile / / P:608 T:3380 CardWriteFile(): cmapfile = 0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000 / / P:608 T:3380 8-38-906 CardWriteFile(): SUCCESS /* P:608 T:5764 8-40-406 CardDeauthenticate(): BEGIN /* P:608 T:5764 8-40-421 
CardDeauthenticate(): SUCCESS /* P:608 T:3380 8-40-671 CardDeleteContext(): BEGIN /* P:608 T:3380 8-40-687 CardDeleteContext(): SUCCESS in windows 7: /* P:3368 T:3800 17-39-515 CardAcquireContext(): BEGIN /* P:3368 T:3800 17-39-515 CardAcquireContext(): SUCCESS /* P:3368 T:3800 17-39-515 CardAcquireContext(): BEGIN /* P:3368 T:3800 17-39-515 CardAcquireContext(): SUCCESS /* P:3368 T:3800 17-39-531 CardAcquireContext(): BEGIN /* P:3368 T:3800 17-39-531 CardAcquireContext(): SUCCESS /* P:3368 T:3800 17-39-531 CardAcquireContext(): BEGIN /* P:3368 T:3800 17-39-531 CardAcquireContext(): SUCCESS /* P:3368 T:3800 17-41-187 CardAcquireContext(): BEGIN /* P:3368 T:3800 17-41-187 CardAcquireContext(): SUCCESS /* P:3368 T:3800 17-41-187 CardDeleteContext(): BEGIN /* P:3368 T:3800 17-41-187 CardDeleteContext(): SUCCESS /* P:3368 T:3800 17-41-187 CardAcquireContext(): BEGIN /* P:3368 T:3800 17-41-187 CardAcquireContext(): SUCCESS /* P:3368 T:3800 17-41-187 CardDeleteContext(): BEGIN /* P:3368 T:3800 17-41-203 CardDeleteContext(): SUCCESS /* P:3368 T:3800 17-41-203 CardAcquireContext(): BEGIN /* P:3368 T:3800 17-41-203 CardAcquireContext(): SUCCESS /* P:3368 T:3800 17-41-203 CardDeleteContext(): BEGIN /* P:3368 T:3800 17-41-203 CardDeleteContext(): SUCCESS /* P:3368 T:3800 17-41-203 CardAcquireContext(): BEGIN /* P:3368 T:3800 17-41-203 CardAcquireContext(): SUCCESS /* P:3368 T:3800 17-41-218 CardDeleteContext(): BEGIN /* P:3368 T:3800 17-41-218 CardDeleteContext(): SUCCESS /* P:3368 T:3800 17-41-218 CardAcquireContext(): BEGIN /* P:3368 T:3800 17-41-218 CardAcquireContext(): SUCCESS /* P:3368 T:3800 17-41-218 CardGetChallenge(): BEGIN /* P:3368 T:3800 CardGetChallenge(): Challenge = BF830855CDCA4F0D / / P:3368 T:3800 17-41-234 CardGetChallenge(): SUCCESS /* P:3368 T:3800 17-41-234 CardAuthenticateChallenge(): BEGIN /* P:3368 T:3800 CardAuthenticateChallenge(): Response = A2DB6F882D402D94 / / P:3368 T:3800 17-41-234 CardAuthenticateChallenge(): SUCCESS /* P:3368 T:3800 17-41-234 CardDeauthenticate(): BEGIN /* P:3368 T:3800 17-41-250 CardDeauthenticate(): SUCCESS /* P:3368 T:3800 17-41-250 CardAuthenticatePin(): BEGIN /* P:3368 T:3800 CardAuthenticatePin(): User PIN = 0000 / / P:3368 T:3800 17-41-265 CardAuthenticatePin(): SUCCESS /* P:3368 T:3800 17-41-265 CardDeauthenticate(): BEGIN /* P:3368 T:3800 17-41-265 CardDeauthenticate(): SUCCESS /* P:3368 T:3800 17-41-265 CardDeleteContext(): BEGIN /* P:3368 T:3800 17-41-281 CardDeleteContext(): SUCCESS /* P:3368 T:3800 17-41-281 CardAcquireContext(): BEGIN /* P:3368 T:3800 17-41-281 CardAcquireContext(): SUCCESS /* P:3368 T:3800 17-41-281 CardAuthenticatePin(): BEGIN /* P:3368 T:3800 CardAuthenticatePin(): User PIN = 0000 / / P:3368 T:3800 17-41-296 CardAuthenticatePin(): SUCCESS /* P:3368 T:3800 17-41-296 CardQueryCapabilities(): BEGIN /* P:3368 T:3800 17-41-296 CardQueryCapabilities(): SUCCESS /* P:3368 T:3800 17-41-296 CardAuthenticatePin(): BEGIN /* P:3368 T:3800 CardAuthenticatePin(): User PIN = 0000 / / P:3368 T:3800 17-41-312 CardAuthenticatePin(): SUCCESS /* P:3368 T:3800 17-41-312 CardDeauthenticate(): BEGIN /* P:3368 T:3800 17-41-328 CardDeauthenticate(): SUCCESS /* P:3368 T:3800 17-41-328 CardDeleteContext(): BEGIN /* P:3368 T:3800 17-41-328 CardDeleteContext(): SUCCESS /* P:3368 T:3800 17-41-359 CardAcquireContext(): BEGIN /* P:3368 T:3800 17-41-359 CardAcquireContext(): SUCCESS /* P:3368 T:3800 17-41-359 CardReadFile(): BEGIN /* P:3368 T:3800 CardReadFile(): Dir Name = ROOT, File Name = cardid / / P:3368 T:3800 CardReadFile(): cardid = 
34363438653733652D346430342D3463 / / P:3368 T:3800 17-41-406 CardReadFile(): SUCCESS /* P:3368 T:3800 17-41-406 CardReadFile(): BEGIN /* P:3368 T:3800 CardReadFile(): Dir Name = ROOT, File Name = cardcf / / P:3368 T:3800 CardReadFile(): cardcf = 000000000000 / / P:3368 T:3800 17-41-453 CardReadFile(): SUCCESS /* P:3368 T:3800 17-41-453 CardReadFile(): BEGIN /* P:3368 T:3800 CardReadFile(): Dir Name = mscp, File Name = cmapfile / / P:3368 T:3800 17-41-531 CardReadFile(): BEGIN /* P:3368 T:3800 CardReadFile(): Dir Name = ROOT, File Name = cardcf / / P:3368 T:3800 CardReadFile(): cardcf = 000000000000 / / P:3368 T:3800 17-41-593 CardReadFile(): SUCCESS /* P:3368 T:3800 17-41-593 CardAuthenticatePin(): BEGIN /* P:3368 T:3800 CardAuthenticatePin(): User PIN = 0000 / / P:3368 T:3800 17-41-609 CardAuthenticatePin(): SUCCESS /* P:3368 T:3800 17-41-609 CardDeauthenticate(): BEGIN /* P:3368 T:3800 17-41-609 CardDeauthenticate(): SUCCESS /* P:3368 T:3800 17-41-609 CardDeleteContext(): BEGIN /* P:3368 T:3800 17-41-625 CardDeleteContext(): SUCCESS /* P:3368 T:3800 17-41-625 CardAcquireContext(): BEGIN /* P:3368 T:3800 17-41-625 CardAcquireContext(): SUCCESS /* P:3368 T:3800 17-41-625 CardReadFile(): BEGIN /* P:3368 T:3800 CardReadFile(): Dir Name = ROOT, File Name = cardid / / P:3368 T:3800 CardReadFile(): cardid = 34363438653733652D346430342D3463 / / P:3368 T:3800 17-41-671 CardReadFile(): SUCCESS /* P:3368 T:3800 17-41-687 CardReadFile(): BEGIN /* P:3368 T:3800 CardReadFile(): Dir Name = ROOT, File Name = cardcf / / P:3368 T:3800 CardReadFile(): cardcf = 000000000000 / / P:3368 T:3800 17-41-734 CardReadFile(): SUCCESS /* P:3368 T:3800 17-41-734 CardQueryFreeSpace(): BEGIN /* P:3368 T:3800 17-41-750 CardQueryFreeSpace(): SUCCESS /* P:3368 T:3800 17-41-750 CardAuthenticatePin(): BEGIN /* P:3368 T:3800 CardAuthenticatePin(): User PIN = 0000 / / P:3368 T:3800 17-41-765 CardAuthenticatePin(): SUCCESS /* P:3368 T:3800 17-41-765 CardWriteFile(): BEGIN /* P:3368 T:3800 CardWriteFile(): Dir Name = ROOT, File Name = cardcf / / P:3368 T:3800 CardWriteFile(): cardcf = 000000000100 / / P:3368 T:3800 17-41-828 CardWriteFile(): SUCCESS /* P:3368 T:3800 17-41-828 CardWriteFile(): BEGIN /* P:3368 T:3800 CardWriteFile(): Dir Name = mscp, File Name = cmapfile / / P:3368 T:3800 CardWriteFile(): cmapfile = 370062003800640030006200390031002D0063003600650064002D0034003000650033002D0062006100610037002D006200620032003800640063003800610035003300330032000000000000000000010000000000 / / P:3368 T:3800 17-42-218 CardWriteFile(): SUCCESS /* P:3368 T:3800 17-42-234 CardWriteFile(): BEGIN /* P:3368 T:3800 CardWriteFile(): Dir Name = ROOT, File Name = cardcf / / P:3368 T:3800 CardWriteFile(): cardcf = 000000000200 / / P:3368 T:3800 17-42-296 CardWriteFile(): SUCCESS /* P:3368 T:3800 17-42-296 CardWriteFile(): BEGIN /* P:3368 T:3800 CardWriteFile(): Dir Name = mscp, File Name = cmapfile / / P:3368 T:3800 CardWriteFile(): cmapfile = 370062003800640030006200390031002D0063003600650064002D0034003000650033002D0062006100610037002D006200620032003800640063003800610035003300330032000000000000000000030000000000 / / P:3368 T:3800 17-42-390 CardWriteFile(): SUCCESS /* P:3368 T:3800 17-42-406 CardQueryCapabilities(): BEGIN /* P:3368 T:3800 17-42-406 CardQueryCapabilities(): SUCCESS /* P:3368 T:3800 17-42-406 CardWriteFile(): BEGIN /* P:3368 T:3800 CardWriteFile(): Dir Name = ROOT, File Name = cardcf / / P:3368 T:3800 CardWriteFile(): cardcf = 000001000200 / / P:3368 T:3800 17-42-468 CardWriteFile(): SUCCESS /* P:3368 T:3800 17-42-468 
CardCreateContainer(): BEGIN /* P:3368 T:3800 17-48-421 CardCreateContainer(): SUCCESS /* P:3368 T:3800 17-48-437 CardWriteFile(): BEGIN /* P:3368 T:3800 CardWriteFile(): Dir Name = ROOT, File Name = cardcf / / P:3368 T:3800 CardWriteFile(): cardcf = 000001000300 / / P:3368 T:3800 17-48-484 CardWriteFile(): SUCCESS /* P:3368 T:3800 17-48-500 CardWriteFile(): BEGIN /* P:3368 T:3800 CardWriteFile(): Dir Name = mscp, File Name = cmapfile / / P:3368 T:3800 CardWriteFile(): cmapfile = 370062003800640030006200390031002D0063003600650064002D0034003000650033002D0062006100610037002D006200620032003800640063003800610035003300330032000000000000000000030000040000 / / P:3368 T:3800 17-48-593 CardWriteFile(): SUCCESS /* P:3368 T:3800 17-48-593 CardGetContainerInfo(): BEGIN /* P:3368 T:3800 17-48-640 CardGetContainerInfo(): SUCCESS /* P:3368 T:288 17-50-140 CardDeauthenticate(): BEGIN /* P:3368 T:288 17-50-140 CardDeauthenticate(): SUCCESS /* P:3368 T:3800 17-50-140 CardReadFile(): BEGIN /* P:3368 T:3800 CardReadFile(): Dir Name = ROOT, File Name = cardid / / P:3368 T:3800 CardReadFile(): cardid = 34363438653733652D346430342D3463 / / P:3368 T:3800 17-50-187 CardReadFile(): SUCCESS /* P:3368 T:3800 17-50-187 CardReadFile(): BEGIN /* P:3368 T:3800 CardReadFile(): Dir Name = ROOT, File Name = cardcf / / P:3368 T:3800 CardReadFile(): cardcf = 000001000300 / / P:3368 T:3800 17-50-234 CardReadFile(): SUCCESS /* P:3368 T:3800 17-50-234 CardReadFile(): BEGIN /* P:3368 T:3800 CardReadFile(): Dir Name = ROOT, File Name = cardid / / P:3368 T:3800 CardReadFile(): cardid = 34363438653733652D346430342D3463 / / P:3368 T:3800 17-50-296 CardReadFile(): SUCCESS /* P:3368 T:3800 17-50-296 CardReadFile(): BEGIN /* P:3368 T:3800 CardReadFile(): Dir Name = ROOT, File Name = cardid / / P:3368 T:3800 CardReadFile(): cardid = 34363438653733652D346430342D3463 / / P:3368 T:3800 17-50-343 CardReadFile(): SUCCESS Comparing the two logs, it seems that in win 7 cmck always read file, read file, read file... and fail, never get into CardDeleteContainer or CardWriteFile :( Please help me!!!! Many thanks!

    Read the article

  • Using PHP substr() and strip_tags() while retaining formatting and without breaking HTML

    - by Peter
    I have various HTML strings to cut to 100 characters (of the stripped content, not the original) without stripping tags and without breaking HTML. Original HTML string (288 characters): $content = "<div>With a <span class='spanClass'>span over here</span> and a <div class='divClass'>nested div over <div class='nestedDivClass'>there</div> </div> and a lot of other nested <strong><em>texts</em> and tags in the air <span>everywhere</span>, it's a HTML taggy kind of day.</strong></div>"; When trimming to 100 characters the HTML breaks and the stripped content comes to only about 40 characters: $content = substr($content, 0, 100)."..."; /* output: <div>With a <span class='spanClass'>span over here</span> and a <div class='divClass'>nested div ove... */ Stripping HTML gives the correct character count but obviously loses formatting: $content = substr(strip_tags($content), 0, 100)."..."; /* output: With a span over here and a nested div over there and a lot of other nested texts and tags in the ai... */ Challenge: output the first 100 characters as counted by strip_tags while retaining the HTML formatting, and when cutting the string off, close any tags that are still open: /* <div>With a <span class='spanClass'>span over here</span> and a <div class='divClass'>nested div over <div class='nestedDivClass'>there</div> </div> and a lot of other nested <strong><em>texts</em> and tags in the ai</strong></div>..."; Similar question (less strict on the solution provided so far)
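    One way to attack this (sketched here in Python rather than PHP, since the approach is language-agnostic): walk the markup with an HTML parser, emit tags unchanged, count only the text content, stop once 100 characters of text have been kept, and then close whatever tags are still open. This assumes reasonably well-formed markup; the class and function names below are illustrative, not part of the original question. A PHP port could follow the same shape using DOMDocument or a SAX-style parser.

        from html.parser import HTMLParser

        class HtmlTruncator(HTMLParser):
            # Keeps `limit` characters of text content, preserves the surrounding
            # tags, and remembers which tags are still open at the cut-off point.

            VOID_TAGS = {"br", "img", "hr", "input", "meta", "link"}

            def __init__(self, limit):
                super().__init__(convert_charrefs=True)
                self.limit = limit
                self.count = 0
                self.parts = []
                self.open_tags = []

            def handle_starttag(self, tag, attrs):
                if self.count >= self.limit:
                    return
                rendered = "".join(
                    f' {name}="{value}"' if value is not None else f" {name}"
                    for name, value in attrs
                )
                self.parts.append(f"<{tag}{rendered}>")
                if tag not in self.VOID_TAGS:
                    self.open_tags.append(tag)

            def handle_endtag(self, tag):
                if self.count >= self.limit:
                    return  # tags still open here get closed in truncated()
                self.parts.append(f"</{tag}>")
                if self.open_tags and self.open_tags[-1] == tag:
                    self.open_tags.pop()  # assumes properly nested markup

            def handle_data(self, data):
                if self.count >= self.limit:
                    return
                keep = data[: self.limit - self.count]
                self.parts.append(keep)
                self.count += len(keep)

            def truncated(self):
                closers = "".join(f"</{tag}>" for tag in reversed(self.open_tags))
                # Ellipsis appended after the closing tags, as in the sample output.
                return "".join(self.parts) + closers + "..."

        def truncate_html(html, limit=100):
            parser = HtmlTruncator(limit)
            parser.feed(html)
            return parser.truncated()

    Feeding the 288-character sample above through truncate_html(content, 100) keeps 100 characters of visible text and appends the closing tags for any elements that were still open at the cut, so the markup stays balanced.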

    Read the article

  • Frame Buster Buster ... buster code needed

    - by Jeff Atwood
    Let's say you don't want other sites to "frame" your site in an <iframe>: <iframe src="http://yourwebsite.com"></iframe> So you insert anti-framing, frame-busting JavaScript into all your pages: /* break us out of any containing iframes */ if (top != self) { top.location.replace(self.location.href); } Excellent! Now you "bust" or break out of any containing iframe automatically. Except for one small problem. As it turns out, your frame-busting code can be busted, as shown here: <script type="text/javascript"> var prevent_bust = 0 window.onbeforeunload = function() { prevent_bust++ } setInterval(function() { if (prevent_bust > 0) { prevent_bust -= 2 window.top.location = 'http://server-which-responds-with-204.com' } }, 1) </script> This code does the following: it increments a counter every time the browser attempts to navigate away from the current page, via the window.onbeforeunload event handler; it sets up a timer that fires every millisecond via setInterval(), and if it sees the counter incremented, changes the current location to a server under the attacker's control; and that server serves up a page with HTTP status code 204, which does not cause the browser to navigate anywhere. My question is -- and this is more of a JavaScript puzzle than an actual problem -- how can you defeat the frame-busting buster? I had a few thoughts, but nothing worked in my testing: attempting to clear the onbeforeunload event via onbeforeunload = null had no effect; adding an alert() stopped the process and let the user know it was happening, but did not interfere with the code in any way (clicking OK lets the busting continue as normal); and I can't think of any way to clear the setInterval() timer. I'm not much of a JavaScript programmer, so here's my challenge to you: hey buster, can you bust the frame-busting buster?

    Read the article

  • Code Golf: 1x1 black pixel

    - by Joey Adams
    Recently, I used my favorite image editor to make a 1x1 black pixel (which can come in handy when you want to draw solid boxes in HTML cheaply). Even though I made it a monochrome PNG, it came out to be 120 bytes! I mean, that's kind of steep. 120 bytes. For one pixel. I then converted it to a GIF, which dropped the size down to 43 bytes. Much better, but still... Challenge The shortest image file or program that is or generates a 1x1 black pixel. A submission may be: An image file that represents a 1x1 black pixel. The format chosen must be able to represent larger images than 1x1, and cannot be ad-hoc (that is, it can't be an image format you just made up for code golf). Image files will be ranked by byte count. A program that generates such an image file. Programs will be ranked by character count, as usual in code golf. As long as an answer falls into one of these two categories, anything is fair game.
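    Not a competitive entry, but as a quick sanity check on those byte counts, here is a small Python sketch that writes the same 1x1 black pixel with the Pillow imaging library and prints the resulting file sizes. It assumes Pillow is installed, and the exact sizes will vary with the encoder and the save options used.

        import os
        from PIL import Image  # assumes the Pillow package is available

        # A 1x1, 1-bit image whose single pixel is 0 (black).
        pixel = Image.new("1", (1, 1), 0)

        # Write it in a couple of formats and compare the file sizes.
        for fmt in ("png", "gif"):
            name = f"pixel.{fmt}"
            pixel.save(name, optimize=True)
            print(name, os.path.getsize(name), "bytes")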

    Read the article

  • Can a masterpage reference another masterpage with the same content and contentplaceholder tags?

    - by Peach
    Current Setup I currently have three masterpages and content pages in the following hierarchy: One root-level masterpage that displays the final result. Call this "A". Two sibling pages that don't reference each other but contain all the same contentplaceholder elements, just in a different order with different <div>'s surrounding them. Both reference the root-level masterpage. Call these "B1" and "B2". Several content pages that reference one or the other sibling master page above (not both). Call these "C1" through "C-whatever". Basically I have: Cn = B1 = A Cm = B2 = A This hierarchy works fine. Desired Setup What I want to do is add a new level to this hierarchy (a new master page) between the content pages and the sibling masterpages. Basically so it's like this: One root-level masterpage that displays the final result. Two sibling pages plus a third sibling. Call it B3. A new middle masterpage that dynamically 'chooses' one of the sibling masterpages. The desired behaviour is to pass the content given by C straight through to Bn without modifying it. The only thing D actively does is choose which Bn. Call this new masterpage D. Several content pages that reference the new middle master page instead of the old siblings. The challenge is that I'm working within the confines of a rather complex product and I cannot change the original two sibling masterpages (B1 and B2) or the content pages (C) in any meaningful way. I want: Cn = D = B1 = A Cm = D = B2 = A Ck = D = B3 = A Essentially, D should "pass through" all its content to whichever B-level masterpage it chooses. I can't put this logic in the C-level pages. Additional Details All B-level pages have the same content/contentplaceholder tags, just ordered and styled differently. D can be as convoluted as it has to be, so long as it doesn't require modifying C or B. I'm using ASP.Net 2.0. Is this possible?

    Read the article

< Previous Page | 36 37 38 39 40 41 42 43 44 45 46 47  | Next Page >