Search Results

Search found 2145 results on 86 pages for 'training'.


  • Agile Testing Days 2012 – Day 3 – Agile or agile?

    - by Chris George
    Another early start for my last Lean Coffee of the conference, and again it was not wasted. We had some really interesting discussions around how to determine what test automation is useful, around "if agile is not faster, why do it?", and a rather existential discussion on whether unicorns exist!

    The first keynote of the day was entitled "Fast Feedback Teams" by Ola Ellnestam. Again this relates nicely to the releasing-faster talk on day 2, and to something that we are looking at and some teams are actively trying. Introducing the notion of feedback, Ola described a game he wrote for his eldest child. It was a simple game where every time he clicked a button, it displayed "You've Won!". He then changed it to a Win-Lose-Win-Lose pattern and watched the feedback from his son, who twigged the pattern and got his younger brother to play, alternating turns… genius! (Must do that with my children.) The idea behind this was that you need that feedback loop to learn and progress. If you are not getting the feedback, you need to close that loop.

    An interesting point Ola made was to solve problems BEFORE writing software. It may be that you don't have to write anything at all; perhaps it's a communication/training issue, or perhaps the problem can be solved another way. Writing software, although it's the business we are in, is expensive, and this should be taken into account. He again mentioned frequent releases, and how they should be made as soon as stuff is ready to be released. Don't leave stuff on the shelf, because it's not earning you anything, money or data. I totally agree with this, and it's something that we will be aiming for moving forwards.

    "Exceptions, Assumptions and Ambiguity: Finding the truth behind the story" by David Evans started off very promisingly by making references to it being 'grim up North', referring to the north of England. Not sure it was appreciated by most of the audience, but it made me laugh! David explained how there are always risks associated with exceptions, giving the example of a one-way road near where he lives, with an exception sign giving coaches the right to go the wrong way. Therefore you could merrily swing around the corner of the one-way road straight into a coach! David showed the danger in making assumptions with lyrical quotes from "Lola" by The Kinks ("I'm glad I'm a man, and so is Lola") and with a picture of a toilet flush that needed instructions to operate the full and half flush. With this particular flush, you pulled the handle all the way down for half flush, and half way down for full flush! Hmmm, a bit of a crappy user experience, methinks!

    Then, through a clever use of a passage from the Jabberwocky, David went on to show how mis-translation and ambiguity can completely distort the original meaning of something, and how this is a real enemy of software development. This was all helping to demonstrate that the term Story is often heavily overloaded in the Agile world, and should really be stripped back to what it is really for: stating a business problem and offering a technical solution. Therefore a story could be worded as "In order to {make some improvement}, we will {do something}". The first 'in order to' statement is stakeholder-neutral, and states the problem by requesting an improvement to the software/process etc. The second part of the story is the verb, the doing bit. So to achieve the 'improvement', which is not currently true, we will do something to make it true in the future.
    My PM is very interested in this, and he's observed some of the problems of overloading stories, so I'm hoping that between us we can use some of David's suggestions to clarify our stories better.

    The second keynote of the day (and our last) proved to be the most entertaining and exhausting of the conference for me: "The ongoing evolution of testing in agile development" by Scott Barber. I've never had the pleasure of seeing Scott before… OMG, I would love to have even half of the energy he has! What struck me during this presentation was Scott's explanation of how testing has become the role/job that it is (largely) today, and how this has led to the need for 'methodologies' to make dev and test work! The argument that we should be trying to converge the roles again is a very valid one, and one that a couple of the teams at work are actively pursuing with great results. Making developers as responsible for quality as testers is something that has been lost over the years, but something that we are now striving to achieve. The idea that we (testers) should be testing experts/specialists, not testing 'union members', supports this, so the entire team works on all aspects of a feature/product, with the 'specialists' taking the lead and advising/coaching the others. This leads to better propagation of information around the team and a greater holistic understanding of the project, and it allows the team to continue functioning if some of its members are off sick, for example.

    Feeling somewhat drained from Scott's keynote (but at the same time excited that a lot of the points he raised supported actions we are taking at work), I headed into my last presentation for Agile Testing Days 2012 before having to make my way to Tegel to catch the flight home. "Thinking and working agile in an unbending world" with Pete Walen was a talk I was not going to miss! Having spoken to Pete several times during the past few days, I was looking forward to hearing what he had to say, and I was not disappointed.

    Pete started off by trying to separate the definitions of 'Agile' as in the methodology and 'agile' as in the adjective, by pronouncing them the 'English' and 'American' ways: Agile pronounced 'Ajyle', and agile pronounced 'ajul'. There was much confusion around what the hell he was talking about, although I thought it was quite clear. Agile: software development methodology. agile: marked by ready ability to move with quick, easy grace; having a quick, resourceful and adaptable character. Anyway, that aside (although it provided a few laughs during the presentation), the point was that many teams claim to be 'Agile' but are not, in fact, 'agile' by nature. Implementing 'Agile' methodologies that are so prescriptive actually goes against the very nature of Agile development, where a team should anticipate, adapt and explore. Pete made a valid point that very few companies intentionally put up roadblocks to impede work, so if work is being blocked/delayed, why? This is where being agile as a team pays off, because the team can inspect what's going on, explore options and adapt their processes. It is through experimentation (and that means trying and failing as well as trying and succeeding) that a team will improve and grow, leading to focusing on what really needs to be done to achieve X. So, that was it, the last talk of our conference.
    I was gutted that we had to miss the closing keynote from Matt Heusser, as Matt was another person I had spoken to a few times during the conference, but the flight would not wait, and just as well we left when we did, because the traffic was a nightmare!

    My Takeaway Triple from Day 3:
    • Release often and release small; don't leave stuff on the shelf.
    • Keep the meaning of the word 'agile' in mind when working in 'Agile'.
    • Look at testing as more of a skill than a role.


  • It was a figure of speech!

    - by Ratman21
    Yesterday I posted the following as an attention getter / advertisement (as well as my feelings) in the groups I am in on the social networking site LinkedIn, and boy did I get responses.

    "I am fighting mad (a figure of speech, really) about not having a job! Look, just because I am over 55 and have gray hair, it does not mean my brain is dead, or that I can no longer troubleshoot a router or circuit or LAN issue, or do 'IT' work at all. And I could prove this if someone would give me a job. Come on, try me for 90 days at minimum wage. I know you will end up keeping me around (hopefully at normal pay). Is anyone hearing me… come on, take up the challenge!"

    These were the responses I got.

    "I hear you. We just need to retrain and get our skills up to speed is all. That is what I am doing. I have not given up. Just got to stay on top of the game. Experience is on our side; if we have the credentials and we are reasonable about our salaries, this should not be an issue."

    Already on it: I am going back to school and have got three certifications (CompTIA A+, Security+ and Network+), and I am now studying for my Cisco CCNA certification. As to my salary, I am willing to work at a very reasonable rate.

    "You need to re-brand yourself like a product, market and sell yourself. You need to smarten up, look and feel a million dollars, re-energize yourself, regain your confidence. Either start your own business, or re-write your CV so it stands out from the rest (get the template off the internet). Contact every recruitment agent in your town, state, country and overseas, and on the web. Apply to every job you think you could do; you may not get it, but you will make a contact for your network, which may lead to a job at the end of the tunnel. Get in touch with everyone you know from past jobs. Do charity work. I maintain the IT network, stage electrical and the telecom equipment in my church."

    Again, already on it. I have emailed the world, it seems, with my resume and cover letters. So far, I have rewritten my resume and cover letters (or had them rewritten) over seven times. Re-energize? I never lost my energy level or my self-confidence in my work (now if I could get some HR personnel to see the same). I also volunteer at my church; I created and maintain the church web site.

    "I share your frustration. Sucks being over 50 and looking for work. Please don't sell yourself short at min wage because the employer will think that's your worth. Keep trying!!"

    I never stop trying, and minimum wage is only for 90 days, if someone takes up the challenge.

    Some posts asked if I am keeping up with technology:

    "Do you keep up with the latest technology and can speak the language fluidly?"

    Yep to that, and as to speaking it, also a yep! I am a geek, you know. I heard from others over the 50-year mark, and younger too.

    "I'm with you! I keep getting told that I don't have enough experience because I just recently completed a Masters level course in Microsoft SQL Server, which gave me a project-intensive equivalent of between 2 and 3 years of experience. On top of that training, I have 19 years as an applications programmer and database administrator. I can normalize rings around experienced DBAs and churn out effective code with the best of them. But my 19 years is worthless as far as most recruiters and HR people are concerned because it is not the specific experience for which they're looking. HR AND RECRUITERS TAKE NOTE: Experience, whatever the language, translates across platforms and technology!
    By the way, I'm also over 55 and still have 'got it'!"

    I never lost it, and I can also work rings round younger techs.

    "I'm 52 and female and seem to be having the same issues. I have over 10 years experience in tech support (with a BS in CIS) and can't get hired either."

    Ow, and I only have an AS in computer science along with my certifications.

    "Keep the faith. I have been unemployed since August of 2008. I agree with you... I am willing to return to the beginning of my retail career and work myself back through the ranks, if someone will look past the grey and realize the knowledge I would bring to the table."

    I also would like someone to look past the gray.

    "Interesting approach, volunteering to work for minimum wage for 90 days. I'm in the same situation as you, being 55 & balding w/white hair, so I know where you're coming from. I've been out of work now for a year. I'm in Michigan, where the unemployment rate is estimated to be 15% (the worst in the nation) & even though I've got 30+ years of IT experience ranging from mainframe to PC desktop support, it's difficult to even get a face-to-face interview. I had one prospective employer tell me flat out that I 'didn't have the energy required for this position'. Mostly I never get any feedback. All I can say is good luck & try to remain optimistic."

    He said WHAT! Yes, remaining optimistic is key, along with faith in God.

    Then there was this (for lack of a better word) jerk:

    "Give it up already. You were too old to work in high tech 10 years ago. Scratch that, 20 years ago! Try selling hot dogs in front of Fry's Electronics. At least you would get a chance to eat lunch with your previous colleagues..."

    The funny thing about this person is that I checked out his profile. He is older than I am.


  • Oracle Partner Store (OPS) New Enhancements

    - by Kristin Rose
    Effective June 29th, Oracle Partner Store (OPS) will release the enhancements listed below to improve your overall ordering experience.

    • Online Transactional Oracle Master Agreement (Online TOMA): The Online TOMA enables end users to execute a transactional end user license agreement with Oracle. The new Online TOMA in OPS will replace the need for you to obtain a signed hard copy of the TOMA from the end user. You will now initiate the Online TOMA via OPS. Navigation: OPS Home > Order Tools > Online TOMA Query > Request Online TOMA > End User Contact, click "Select for TOMA" > Select Language > Submit (an automated email is sent immediately to the requestor and the end user). The Online TOMA can also be initiated from the 'My OPS' tab. Under the Online TOMA Query section, partners can track Online TOMA request details submitted to end users. The status of the Online TOMA request and the OMA Key generated (once the Ts&Cs of the Online TOMA are accepted by an end user) are also displayed in this table. There is also the ability to resend pending Online TOMA requests by clicking 'Resend'. Navigation: OPS Home > Order Tools > Online TOMA Query. For more details on the Transactional OMA, please click here.

    • Convert Deals to Carts: The partner deal registration system within OPS will now allow you to convert approved deals into carts with a simple click of a button. VADs can use Deal to Cart on all of their partners' registrations, regardless of whether they submitted on their partner's behalf or the partner submitted themselves. Navigation: Login > Deal Registrations > Deal Registration List > Open the approved deal > Click the Deal Reg ID number link to open > Click the 'Create Cart' link. You can locate your newly created cart in the Saved Carts section of OPS. Links are also available from within an open deal or from the Deal Registration List. Click on the cart number to proceed.

    • Partner Opportunity Management: Deal Registration on OPS now allows you to see updated information on your opportunities from Oracle's Fusion CRM opportunity management system. Key fields such as close date, sales stage, products and status can be viewed by clicking the opportunity ID associated with the deal registration. This new feature allows you to see regular updates to your opportunities after registrations are approved. Through ongoing communication with Oracle Channel Managers and Sales Reps, you can ensure that Oracle has the latest information on your active registered deals.

    • Product Recommendations: When adding products on the Deal Registrations tab, OPS will now show additional products that you can try to include to maximize your sale and rebate.

    • Advanced Customer Support (ACS) Services (available from July 9th): Initiate the purchase of the complete stack (HW/SW/Services) online with one single OPS order. More ACS services are now supported online, with the exception of the Start-Up Pack: new SW installation services for Standard Configurations & standalone System Software; new Pre-production & Go-Live services for Standard & Engineered Systems; new SW configuration & Platinum Pre-Production & Go-Live services for Engineered Systems; new Travel & Expenses Estimate included; new Partner & VAD volume discount supported.

    • Software as a Service (SaaS) for Independent Software Vendors (ISVs): Oracle SaaS ISVs can now use OPS to submit their monthly usage reports to Oracle within 20 days after the end of every month. Navigation: OPS Home > Cart > Transaction Type: Partner SaaS for ISVs > Add Eligible Products > Check out.

    • Existing Approvals: In an effort to reduce the processing time of discount approvals, we have added a new section on the Request Approval page for you to communicate pre-existing approvals without having to attach the DAT. Just enter the Approval ID and submit your request. In the case of existing software approvals, you will be required to submit the DAT with the Contact Information section filled out.

    • Additional data for Shipping Box Labels and Packing Slips: OPS now has additional fields in the Shipping Notes section for you to add PO details. This will help you easily identify shipments as they arrive. Partners will have an End User PO field, whereas VADs will have VAR and End User PO fields.

    • Shipping Notes on OPS: Hardware delivery Shipping Notes will now have multiple options to better suit your requirements.

    • Reminders for Royalty Reporting Partners: If you have not submitted your royalty report online, OPS will now send an automated alert to remind you.

    • Order Tracker Changes: Order Tracker will now have a deal reg flag (Yes/No), so you can clearly distinguish orders that have registered opportunities. All lines of the order will be visible in the order details list.

    • Changes in Terminology: You will notice textual changes on some of our labels and messages relating to approval requests. "Discount Requests" has been replaced with "Approval Requests" to cater to some of our other offerings. The First Line Support (FLS) transaction type has been renamed to Support Provider Partner (SPP).

    OPS Support: For more details on these enhancements, please request training here. For assistance on the Oracle Partner Store, please contact the OPS support team in your region. NAMER: [email protected] LAD: [email protected] EMEA: [email protected] APAC: [email protected] Japan: [email protected] You can even call us on our Hotline! Find your local number here.

    Thank you, Oracle Partner Store Support Team


  • TGIF: Engagement Wrap-up

    - by Michael Snow
    We've had a very busy week here at Oracle, and as we build up to Oracle OpenWorld starting in less than 10 days, it doesn't look like things will be slowing down. Engagement is definitely in the air this week. Our friend John Mancini published a great article entitled "The World of Engagement" on his Digital Landfill blog yesterday, and we hosted a great webcast with R "Ray" Wang from Constellation Research yesterday on the "9 C's of Engagement".

    I wanted to wrap up the week with some key takeaways from our webcast yesterday with Ray Wang. If you missed the webcast, fear not: it is now available On-Demand. We'll leave you this week with lots of questions about how to navigate these churning waters of engagement. Stay tuned to the Oracle WebCenter Social Business Thought Leaders Webcast Series as we fuel this dialogue.

    Company Culture
    • Does company support a culture of putting customer satisfaction ahead of profits?
    • Does culture promote creativity and cross-functional employee collaboration?
    • Does culture accept the different views of a multi-generational workforce?
    • Does culture promote employee training and skills development?
    • Does culture support upward mobility and long-term retention?
    • Does culture support work-life balance?
    • Does the culture provide rewards for employees for outstanding customer support?

    Channels
    • What are the current primary channels for customer communications?
    • What do you think will be the primary channels in two years?
    • Is company developing a support model for emerging channels?
    • Do all channels consistently deliver the same level of customer support?
    • Do you know the cost per transaction across all channels?
    • Do you engage customers proactively across multiple channels?
    • Do all channels have access to the same customer information?

    Community
    • Does company extend customer support into virtual communities of interest?
    • Does company facilitate educating users through its virtual communities?
    • Does company mine its customers' experience into useful data?
    • Does company increase the value for customers through using data to deliver new products and services?
    • Does company support two-way interactions with its customers through communities of interest?
    • Does company actively support social CRM, online communities and social media markets?

    Credibility
    • Does company market its trustworthiness through external certificates such as business licenses, BBB certificates or other validations?
    • Does company promote trust through customer testimonials and case studies on ethical business practices?
    • Does company promote truthful marketing campaigns?
    • Does company make it easy for customers to complain?
    • Does company build its reputation for standing behind its products with guarantees for satisfaction?
    • Does company protect its customer data with high security measures?

    Content
    • What sources do you use to create customer content?
    • Does company mine social media and blogs for customer content?
    • How does your company sort, store and retain its customer content?
    • How frequently does content get updated?
    • What external sources do you use for customer content?
    • How many responses are typically received from a knowledge management system inquiry?
    • Does your company use customer content to design and develop new products and services?

    Context
    • Does your company market to customers in clusters or individually?
    • Does your company customize its messages and personalize them to the specific needs of each individual customer?
    • Does your company store customer data based on their past behaviors, purchases, sentiment analysis and current activities?
    • Does your company manage customer context according to the channels used? For example, identify personal-use channels versus business channels?
    • What is your frequency of collecting customer activities across various touch points?
    • How is your customer data stored and analyzed?
    • Is contextual data used for future customer outreach?

    Cadence
    • Which channels does your company measure: web site visits, phone calls, IVR, store visits, face to face, social media?
    • Does company make effective use of cross-channel marketing to promote more frequent customer engagement?
    • Does your company rate the patterns relevant for your product or service and monitor usage against this pattern?
    • Does your company measure the frequency of both online and offline channels?
    • Does your company apply metrics to the frequency of customer engagements with product or services revenues?
    • Does your company consolidate data for customer engagement across various channels for a complete view of its customer?

    Catalyst
    • Does company offer coupon discounts?
    • Does company have a customer loyalty program or a VIP membership program?
    • Does company mine customer data to target specific groups of buyers?
    • Do internal employees serve as ambassadors for customer programs?
    • Does company drive loyalty through social media loyalty programs?
    • Does company build rewards based on using loyalty data?
    • Does company offer an employee incentive program to drive customer loyalty?


  • Windows Azure – Write, Run or Use Software

    - by BuckWoody
    Windows Azure is a platform that has you covered, whether you need to write software, run software that is already written, or install and use "canned" software, whether you or someone else wrote it. Like any platform, it's a set of tools you can use where it makes sense to solve a problem. The primary location for Windows Azure information is http://windowsazure.com. You can find everything there, from the development kits for writing software to pricing, licensing and tutorials on all of that. I have a few links here for learning to use Windows Azure, although it's best if you focus not on the tools but on what you want to solve. I've broken it down here into various sections so you can quickly locate the things you want to know, and I'll include resources from Microsoft and elsewhere; I use these same resources in the Architectural Design Sessions (ADS) I do with my clients worldwide.

    Write Software

    Also called "Platform as a Service" (PaaS), Windows Azure has lots of components you can use together or separately that allow you to write software in .NET or various Open Source languages to work completely online, in partnership with code you have on-premises, or both, even if you're using other cloud providers. Keep in mind that all of the features you see here can be used together or independently. For instance, you might only use a Web Site, or only use Storage, but you can also use both together. You can access all of these components through standard REST API calls, or using our Software Development Kits' APIs, which are a lot easier. In any case, you simply use Visual Studio, Eclipse, Cloud9 IDE, or even a text editor to write your code from a Mac, PC or Linux machine.

    Components you can use:

    • Azure Web Sites: Windows Azure Web Sites allow you to quickly write and deploy websites, without setting up a Virtual Machine, installing a web server or configuring complex settings. They work alone, with other Windows Azure Web Sites, or with other parts of Windows Azure.

    • Web and Worker Roles: Windows Azure Web Roles give you a full stateless computing instance with Internet Information Services (IIS) installed and configured. Windows Azure Worker Roles give you a full stateless computing instance without IIS installed, often used in a "services" mode. Scale-out is achieved either manually or programmatically, under your control.

    • Storage: Windows Azure Storage types include Blobs to store raw binary data, Tables for key/value pair data (like NoSQL data structures), Queues that allow interaction between stateless roles, and a relational SQL Server database.

    • Other Services: Windows Azure has many other services, such as a security mechanism, a Cache (memcached compliant), a Service Bus, a Traffic Manager and more. Once again, these features can be used with a Windows Azure project, or alone, based on your needs.

    • Various Languages: Windows Azure supports the .NET stack of languages, as well as many Open-Source languages like Java, Python, PHP, Ruby, NodeJS, C++ and more.

    Use Software

    Also called "Software as a Service" (SaaS), this often means consumer or business-level software like Hotmail or Office 365. In other words, you simply log on, use the software, and log off; there's nothing to install, and little to even configure. For the Information Technology professional, however, it's not quite the same. We want software that provides services, but in a platform. That means we want things like Hadoop or other software we don't want to have to install and configure ourselves.
    Components you can use:

    • Kits: Various software "kits" or packages are supported with just a few clicks, such as Umbraco, WordPress, and others.

    • Windows Azure Media Services: A suite of services that allows you to upload media for encoding, processing and even streaming, or any one or more of those functions. We can add DRM and even commercials to your media if you like. Windows Azure Media Services is used to stream everything from large events all the way down to small training videos.

    • High Performance Computing and "Big Data": Windows Azure allows you to scale to huge workloads, using a few clicks to deploy Hadoop Clusters or High Performance Computing (HPC) nodes, accepting HPC jobs, Pig and Hive jobs, and even interfacing with Microsoft Excel.

    • Windows Azure Marketplace: Offers data and programs you can quickly implement and use, some free, some for-fee.

    Run Software

    Also known as "Infrastructure as a Service" (IaaS), this offering allows you to build or simply choose a Virtual Machine to run server-based software.

    Components you can use:

    • Persistent Virtual Machines: You can choose to install Windows Server, Windows Server with Active Directory, with SQL Server, or even SharePoint from a pre-configured gallery. You can configure your own server images with standard Hyper-V technology and load them yourself, and even bring them back when you're done. As a new offering, we also allow you to select various distributions of Linux, a first for Microsoft.

    • Windows Azure Connect: You can connect your on-premises networks to Windows Azure instances.

    • Storage: Windows Azure Storage can be used as a remote backup, a hybrid storage location and more, using software or even hardware appliances.

    Decision Matrix

    With all of these options, you can use Windows Azure to solve just about any computing problem. It's often hard to know when to use something on-premises, when to use the cloud, and what kind of service to use. I've used a decision matrix over the last couple of years to take a particular problem and choose the proper technology to solve it. It's all about options; there is no "silver bullet", whether that's Windows Azure or any other set of functions. I take the problem, decide which particular components I want to own and control, and choose the column that has those boxes darkened. For instance, if I have to control the wiring for a solution (a requirement in some military and government installations), that means the "Networking" component needs to be dark, and so I select the "On Premises" column for that particular solution. If I just need the solution provided and I want no control at all, I can look at "Software as a Service" solutions.

    Security, Pricing, and Other Info

    • Security: Security is one of the first questions you should ask in any distributed computing environment. We have certification info, coding guidelines and more, even a general "Request for Information" (RFI) response already created for you.

    • Pricing: Are there licenses? How much does this cost? Is there a way to estimate the costs in this new environment?

    • New Features: Many new features were added to Windows Azure; a good roundup of those changes can be found here.

    • Support: Software support on Virtual Machines, general support.
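
    As a taste of what those "standard REST API calls" look like in practice, here is a minimal sketch (mine, not from the Azure documentation) that uploads a blob to Windows Azure Storage over raw REST from Python. It assumes you have already generated a Shared Access Signature (SAS) with write access; the account, container and token values below are placeholders.

      # Minimal sketch: create a block blob with a single REST call (Put Blob).
      # ACCOUNT, CONTAINER and SAS_TOKEN are placeholders, not real values;
      # the SAS is assumed to have been generated elsewhere with write access.
      import requests

      ACCOUNT = "mystorageaccount"      # hypothetical storage account
      CONTAINER = "mycontainer"         # hypothetical container
      SAS_TOKEN = "sv=...&sig=..."      # SAS query string made elsewhere

      url = (f"https://{ACCOUNT}.blob.core.windows.net/"
             f"{CONTAINER}/hello.txt?{SAS_TOKEN}")

      # The x-ms-blob-type header tells the service what kind of blob to create.
      resp = requests.put(
          url,
          headers={"x-ms-blob-type": "BlockBlob"},
          data=b"Hello from the REST API",
      )
      resp.raise_for_status()           # expect 201 Created on success

    The SDKs wrap this same operation in a one-line method call, which is usually the easier route.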


  • Simplifying Human Capital Management with Mobile Applications

    - by HCM-Oracle
    By Aaron Green

    If you're starting to think 'mobility' is a recurring theme in your reading, you'd be right. For those who haven't started to build organisational capabilities to leverage it, it's fair to say you're late to the party. The good news: better late than never. Research firm eMarketer says the worldwide smartphone audience will total 1.75 billion this year, while communications technology and services provider Ericsson suggests smartphones will triple to 5.6 billion globally by 2019. It should be no surprise that smartphone adoption is reaching the farthest corners of the globe; the subsequent impact of enterprise applications enabled by these devices is driving business performance improvement and will continue to do so.

    Companies using advanced workforce analytics can add significantly to the bottom line, while impacting customer satisfaction, quality and productivity. It's a statement that makes most business leaders sit forward in their chairs. Achieving these three standards is like sipping The Golden Elixir for the business world. No one would argue their importance. So what are 'advanced workforce analytics'? Simply, they're unprecedented access to workforce trends and performance markers. Many are made possible by a mobile world and the enterprise applications that come with it on smart devices. Some refer to it as 'the consumerisation of IT'. As this phenomenon has matured and become more widely appreciated, it has impacted the spectrum of functional units within an enterprise differently, but powerfully. Whether it's sales, HR, marketing, IT or operations, all have benefited from a more mobile approach. It has been the catalyst for improvement in, and management of, the employee experience, the net result of which is happier customers.

    The obvious benefits, and the lesser realised impact

    Most people understand that mobility allows for greater efficiency and productivity, collaboration and flexibility, but how that translates into business outcomes within the various functional groups is lesser known. In actuality, mobility has helped galvanise partnerships between cross-functional groups within the enterprise. Where in some quarters it was once feared mobility could fragment a workforce, its rallying cry of support is coming from what you might describe as an unlikely source: HR. As the bedrock of an enterprise, it is conceivable HR might contemplate the possible negative impact of a mobile workforce that no longer sits in an office, at the same desks, every day. After all, who would know what they were doing or saying? How would they collaborate? It's reasonable to see why HR might have a legitimate claim to try and retain as much 'perceived control' as possible. The reality, however, is that mobility has emancipated human capital and its management. Mobility and enterprise applications are expediting decision making. Google calls it the Zero Moment of Truth, or ZMOT. It enables smoother operation and can contribute to faster growth. From a collaborative perspective, with the growing use of enterprise social media, which in many cases is being driven by HR, workforce planning and the tangible impact of change are much easier to map. This in turn provides a platform from which individuals and teams can thrive. With more agility and ability to anticipate, staff satisfaction and retention are higher, and real-time feedback is constant.
    The management team can save time, energy and costs with more accurate data, which is then intelligently applied across the workforce to truly engage with staff, customers and partners. From a human capital management (HCM) perspective, mobility can help you close the loop on true talent management. It can enhance what managers can offer and what employees can provide in return. It can create nested relationships and powerful partnerships.

    IT and HR: partners and stewards of mobility

    One effect of enterprise mobility is an evolution in the nature of the relationship between HR and IT, from one of service provision to partnership. The reason for the dynamic shift is largely the 'bring your own device' (BYOD) movement, which is transitioning to a 'bring your own application' (BYOA) scenario. As enterprise technology has in some ways reverse-engineered its solutions to help manage this situation, the partnership between IT (the functional owner) and HR (the strategic enabler) is deeply entrenched. And it has to be. The CIO and the HR leader are faced with compliance and regulatory issues and concerns around information security and personal privacy on a daily basis, complicated by global reach and varied domestic legislation. There are tens of thousands of new mobile apps entering the market each month and, unlike many consumer applications which get downloaded but are often never opened again after initial perusal, enterprise applications are being relied upon by functional groups, not least by HR, to enhance people management. It requires a systematic approach across all applications in use within the enterprise in order to ensure they're used to best effect.

    No turning back, and no desire to

    With real-time analytics on performance and the ability for immediate feedback, there is no turning back for managers. In my experience with Oracle, our customers' operational efficiency is at record levels. It's clear that, as a result of the combination of individual KPIs and organisational goals, CIOs have been able to give HR leaders the ability to build predictive models that feed into an enterprise organisation's evolving strategy. It also helps them ensure regulatory compliance much more easily. Once an arduous task, compliance is simpler with mobile-enabled automation and quality data. Their world has changed for the better.

    For the CIO, mobility also assists in optimising performance. While it doesn't come without challenges, mobile-enabled applications and the native experience users have with them mean employees don't need high-level technical expertise or training. This reduces the training and engagement required from the IT team, so they can focus on other things that deliver value to the bottom line, all the while lowering the cost of assets and related maintenance work by simplifying processes.

    Rewards of a mobile enterprise outweigh risks

    With mobile tools allowing us to increasingly integrate our personal and professional lives, terms like "office hours" are becoming irrelevant, so work/life balance is a cultural must. Enterprises are expected to offer tools that enable workers to access information from anywhere, at any time, from any device. Employees want simplicity and convenience, but it doesn't stop at private enterprise. This is a societal shift. Governments, which traditionally have been known to be slower to adopt newer technology, are also offering support for local businesses to go mobile.
    Several state government websites have advice on how to create mobile apps and more. And as recently as last week, the Victorian Minister for Technology, Gordon Rich-Phillips, unveiled his state government's ICT roadmap for the next two years, which details an increased use of the public cloud, as well as mobile communications and improved access to online datasets. Tech giants are investing significantly in solutions designed to simplify mobile deployment and enablement. The mobility trend is creating a wave of change in the industry and driving transformation in the enterprise. If you're not on that wave, the business risk continues to rise as your competitiveness drops.

    Aaron is the Vice President of HCM Strategy at Oracle Corporation, where he is responsible for researching and identifying emerging trends in the practice of Human Resources and works to deliver industry-leading technology solutions. His other responsibilities include ownership of Oracle's innovative HCM solutions across JAPAC and enabling organisations to transform and modernise their workforce tools. Follow him on Twitter @aaronjgreen


  • ANTS CLR and Memory Profiler In Depth Review (Part 2 of 2 – Memory Profiler)

    - by ToStringTheory
    One of the things that people might not know about me is my obsession with making my code as efficient as possible. Many people might not realize how much of an undertaking this can be, but it is surely a task as monumental as climbing Mount Everest, except this time it is a challenge for the mind… In trying to make code efficient, there are many different factors that play a part: size of the project or solution, tiers, language used, experience and training of the programmer, technologies used, maintainability of the code; the list can go on for quite some time. I spend quite a bit of time when developing trying to determine the best way to implement a feature to achieve the efficiency I am looking for. One program that I have recently come to learn about, the Red Gate ANTS Performance (CLR) and Memory Profiler, gives me tools to accomplish that job more efficiently as well. In this review, I am going to cover some of the features of the ANTS Memory Profiler by compiling some hideous example code to test against.

    Notice

    As a member of the Geeks With Blogs Influencers program, one of the perks is the ability to review products in exchange for a free license to the program. I have not let this affect my opinions of the product in any way, and neither Red Gate nor Geeks With Blogs has tried to influence my opinion regarding this product in any way.

    Introduction – Part 2

    In my last post, I reviewed the feature-packed Red Gate ANTS Performance Profiler. Separate from the Performance Profiler is the Red Gate ANTS Memory Profiler: a simple, easy-to-use utility for checking how your application is handling memory management… a tool that I wish I had had many times in the past. This post will be focusing on the ANTS Memory Profiler and its tool set. The memory profiler has a large assortment of features just like the Performance Profiler, with the new session screen looking nearly exactly alike.

    ANTS Memory Profiler

    Memory profiling is not something that I have to do very often… In the past, for the few cases where I've had to find a memory leak in an application, I have usually just had to trace the code of the operations being performed to look for oddities… Sadly, I have come across more undisposed/non-using'ed IDisposable objects, usually from ADO.NET, than I would ever like to see. Support is not fun; however, using ANTS Memory Profiler makes this task easier. For this round of testing, I am going to use the same code from my previous example, using the WPF application.

    This time, I will choose the 'Profile Memory' option from the ANTS menu in Visual Studio, which launches the solution in its currently configured state/start-up project, and then launches the ANTS Memory Profiler to help. It prepopulates all of the fields with the current project information, and all I have to do is select the 'Start Profiling' option. When the window comes up, it is actually quite barren, just giving ideas on how to work the profiler. You start by getting to the point in your application that you want to profile, and then taking a 'Memory Snapshot'. This performs a full garbage collection and snapshots the managed heap. Using the same WPF app as before, I will go ahead and take a snapshot now. As you can see, ANTS is already giving me lots of information regarding the snapshot; however, this is just one snapshot.
    The whole point of the profiler is to perform an action, usually one where a memory problem is being noticed, then take another snapshot and perform a diff between the two to see what has changed. I am going to go ahead and generate 5000 primes, and then take another snapshot. As you can see, ANTS is already giving me a lot of new information about this snapshot compared to the last: the difference in memory usage, fragmentation, class usage, etc… If you take more snapshots, you can use the dropdown at the top to set your actual comparison snapshots.

    If you look beneath the timeline, you will see a breadcrumb trail showing how best to approach profiling memory using ANTS. When you first do the comparison, you start on the Summary screen. You can either use the charts at the bottom, or switch to the class list screen to get to the next step. The class list screen lists information about all of the instances between the snapshots, and at the bottom gives you a way to filter by telling ANTS what your problem is. I am going to go ahead and select the Int16[] to look at the Instance Categorizer.

    Using the Instance Categorizer, you can travel backwards to see where all of the instances are coming from. It may be hard to see in this image, but hopefully the lightbox (click on it) will help: I can see that all of these instances are rooted to the application through the UI TextBlock control. This image will probably be even harder to see; however, using the 'Instance Retention Graph', you can trace an object's memory inheritance up the chain to see its roots as well. This is a simple example, as this is simply a known element. Usually you would be profiling an actual problem and comparing those differences. I know in the past I spotted a problem where a new context was created per page load, and it was rooted into the application through an event. As the application began to grow, performance and reliability problems started to emerge. A tool like this would have been a great way to identify the problem quickly.

    Overview

    Overall, I think that the Red Gate ANTS Memory Profiler is a great utility for debugging those pesky leaks.

    3 Biggest Pros:
    • Easy-to-use interface with lots of options for configuring the profiling session
    • Intuitive and helpful interface for drilling down from summary, to instance, to root graphs
    • ANTS provides an API for controlling the profiler. Not many options, but still helpful.

    2 Biggest Cons:
    • Inability to automatically snapshot the memory at an interval
    • Lack of complete integration with Visual Studio via an extension panel

    Ratings

    Ease of Use (9/10) – I really do believe that they have brought simplicity to the once difficult task of memory profiling. I especially liked how it stepped you further into the drilldown by directing you towards the best options.
    Effectiveness (10/10) – I believe that the profiler does EXACTLY what it purports to do.
    Features (7/10) – A really great set of features all around in the application; however, I would like to see some ability to automatically trigger snapshots based on intervals or framework-level items such as events.
    Customer Service (10/10) – My entire experience with Red Gate personnel has been nothing but good. Their people are friendly, helpful, and happy!
    UI / UX (9/10) – The interface is very easy to get around, and all of the options are easy to find. With a little bit of poking around, you'll be optimizing Hello World in no time flat!
    Overall (9/10) – Overall, I am happy with the Memory Profiler and its features, as well as with the service I received when working with the Red Gate personnel.

    Thank you for reading up to here, or skipping ahead – I told you it would be shorter! Please, if you do try the product, drop me a message and let me know what you think! I would love to hear any opinions you may have on the product.

    Code

    Feel free to download the code I used above – download via DropBox
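
    The snapshot, act, diff workflow that ANTS walks you through is a general memory-debugging technique, not something specific to .NET. As a rough analogue (my own sketch, not a Red Gate feature), Python's standard-library tracemalloc module supports the identical loop: take a baseline, exercise the suspect code, take another snapshot, and compare.

      # Sketch of the same snapshot-diff idea using Python's tracemalloc.
      import tracemalloc

      tracemalloc.start()

      # Snapshot 1: the baseline, before the suspect operation.
      before = tracemalloc.take_snapshot()

      # Exercise the code under suspicion (a stand-in allocation here).
      suspect = [bytes(1000) for _ in range(10000)]

      # Snapshot 2: after the operation.
      after = tracemalloc.take_snapshot()

      # Diff: which source lines allocated the most new memory?
      for stat in after.compare_to(before, "lineno")[:5]:
          print(stat)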


  • squid3 auth thru samba using ntlm to AD doesn't work

    - by derty
    Some users here are spending too much time exploring the WWW, so the big boss wants to get this under control. We use squid3 for some security reasons and cache benefits, and now I'm trying to set up a new proxy on a different server (Debian 6). Permissions are defined in AD, and squid3 should get the auth through samba/winbind using the NTLM protocol. But all I get is "Access Denied". It only works using LDAP, but that's not the way I need it. Here are some logs and confs.

    squid access.log:

    1326878095.784 1 192.168.15.27 TCP_DENIED/407 4049 GET http://at.msn.com/? - NONE/- text/html
    1326878095.791 1 192.168.15.27 TCP_DENIED/407 4294 GET http://at.msn.com/? - NONE/- text/html
    1326878095.803 9 192.168.15.27 TCP_DENIED/403 4028 GET http://at.msn.com/? kavan NONE/- text/html
    1326878095.848 0 192.168.15.27 TCP_DENIED/403 3881 GET http://www.squid-cache.org/Artwork/SN.png kavan NONE/- text/html
    1326878100.279 0 192.168.15.27 TCP_DENIED/403 3735 GET http://www.google.at/ kavan NONE/- text/html
    1326878100.296 0 192.168.15.27 TCP_DENIED/403 3870 GET http://www.squid-cache.org/Artwork/SN.png kavan NONE/- text/html
    1326878155.700 0 192.168.15.27 TCP_DENIED/407 4072 GET http://ie9cvlist.ie.microsoft.com/IE9CompatViewList.xml - NONE/- text/html
    1326878155.705 2 192.168.15.27 TCP_DENIED/407 4317 GET http://ie9cvlist.ie.microsoft.com/IE9CompatViewList.xml - NONE/- text/html
    1326878155.709 3 192.168.15.27 TCP_DENIED/403 4026 GET http://ie9cvlist.ie.microsoft.com/IE9CompatViewList.xml kavan NONE/- text/html

    squid cache.log:

    2012/01/18 10:12:49| Creating Swap Directories
    2012/01/18 10:12:49| Starting Squid Cache version 3.1.6 for x86_64-pc-linux-gnu...
    2012/01/18 10:12:49| Process ID 17236
    2012/01/18 10:12:49| With 65535 file descriptors available
    2012/01/18 10:12:49| Initializing IP Cache...
    2012/01/18 10:12:49| DNS Socket created at [::], FD 7
    2012/01/18 10:12:49| DNS Socket created at 0.0.0.0, FD 8
    2012/01/18 10:12:49| Adding nameserver 192.168.15.2 from /etc/resolv.conf
    2012/01/18 10:12:49| Adding nameserver 192.168.15.19 from /etc/resolv.conf
    2012/01/18 10:12:49| Adding nameserver 192.168.15.1 from /etc/resolv.conf
    2012/01/18 10:12:49| Adding domain schoenbrunn.local from /etc/resolv.conf
    2012/01/18 10:12:49| helperOpenServers: Starting 5/5 'squid_ldap_auth' processes
    2012/01/18 10:12:49| helperOpenServers: Starting 10/10 'ntlm_auth' processes
    2012/01/18 10:12:49| helperOpenServers: Starting 10/10 'squid_kerb_auth' processes
    2012/01/18 10:12:49| squid_kerb_auth: INFO: Starting version 1.0.5
    2012/01/18 10:12:49| squid_kerb_auth: INFO: Starting version 1.0.5
    2012/01/18 10:12:49| squid_kerb_auth: INFO: Starting version 1.0.5
    2012/01/18 10:12:49| squid_kerb_auth: INFO: Starting version 1.0.5
    2012/01/18 10:12:49| squid_kerb_auth: INFO: Starting version 1.0.5
    2012/01/18 10:12:49| squid_kerb_auth: INFO: Starting version 1.0.5
    2012/01/18 10:12:49| squid_kerb_auth: INFO: Starting version 1.0.5
    2012/01/18 10:12:49| squid_kerb_auth: INFO: Starting version 1.0.5
    2012/01/18 10:12:49| helperOpenServers: Starting 5/5 'squid_ldap_group' processes
    2012/01/18 10:12:49| squid_kerb_auth: INFO: Starting version 1.0.5
    2012/01/18 10:12:49| squid_kerb_auth: INFO: Starting version 1.0.5
    2012/01/18 10:12:49| Unlinkd pipe opened on FD 73
    2012/01/18 10:12:49| Local cache digest enabled; rebuild/rewrite every 3600/3600 sec
    2012/01/18 10:12:49| Store logging disabled
    2012/01/18 10:12:49| Swap maxSize 0 + 262144 KB, estimated 20164 objects
    2012/01/18 10:12:49| Target number of buckets: 1008
    2012/01/18 10:12:49| Using 8192 Store buckets
    2012/01/18 10:12:49| Max Mem size: 262144 KB
    2012/01/18 10:12:49| Max Swap size: 0 KB
    2012/01/18 10:12:49| Using Least Load store dir selection
    2012/01/18 10:12:49| Set Current Directory to /var/spool/squid3
    2012/01/18 10:12:49| Loaded Icons.
    2012/01/18 10:12:49| Accepting HTTP connections at [::]:3128, FD 74.
    2012/01/18 10:12:49| HTCP Disabled.
    2012/01/18 10:12:49| Squid modules loaded: 0
    2012/01/18 10:12:49| Adaptation support is off.
    2012/01/18 10:12:49| Ready to serve requests.
    2012/01/18 10:12:50| storeLateRelease: released 0 objects

    smb.conf:

    # Domain Authentication Settings
    workgroup = <WORKGROUP>
    security = ads
    password server = <DOMAINNAME>.LOCAL
    realm = <DOMAINNAME>.LOCAL
    ldap ssl = no
    # logging
    log level = 5
    max log size = 50
    # logs split per machine
    log file = /var/log/samba/%m.log
    # max 50KB per log file, then rotate
    ; max log size = 50
    # User settings
    username map = /etc/samba/smbusers
    idmap uid = 10000-20000000
    idmap gid = 10000-20000000
    idmap backend = ad
    ; template primary group = <ad group>
    template shell = /sbin/nologin
    # Winbind Settings
    winbind separator = +
    winbind enum users = Yes
    winbind enum groups = Yes
    winbind netsted groups = Yes
    winbind nested groups = Yes
    winbind cache time = 10
    winbind use default domain = Yes
    # Other Globals
    unix charset = LOCALE
    server string = <SERVERNAME>
    load printers = no
    printing = cups
    cups options = raw
    ; printcap name = /etc/printcap
    # obtain list of printers automatically on SystemV
    ; printcap name = lpstat
    ; printing = cups

    squid.conf:

    auth_param ntlm program /usr/bin/ntlm_auth --require-membership-of=<DOMAINNAME>\\INTERNETZ --helper-protocol=squid-2.5-ntlmssp
    auth_param ntlm children 10
    auth_param basic program /usr/lib/squid3/squid_ldap_auth -R -b "dc=<dcname>,dc=local" -D "cn=administrator,cn=Users,dc=<domainname>,dc=local" -w "******" -f sAMAccountName=%s -h 192.168.15.19:3268
    auth_param basic realm "Proxy Authentifizierung. Bitte geben Sie Ihren Benutzername und Ihr Passwort ein!"
    # (the realm string just asks for username and password, in German)
    external_acl_type InetGroup %LOGIN /usr/lib/squid3/squid_ldap_group -R -b "dc=<domainname>,dc=local" -D "cn=administrator,cn=Users,dc=<domainname>,dc=local" -w "******" -f "(&(objectclass=person)(sAMAccountName=%v)(memberof=cn=%a,cn=internetz,dc=<domainname>,dc=local))" -h 192.168.15.19:3268
    auth_param negotiate program /usr/lib/squid3/squid_kerb_auth -d
    auth_param negotiate children 10
    auth_param negotiate keep_alive on
    acl localnet proxy_auth REQUIRED
    acl InetAccess external InetGroup Internetz
    http_access allow InetAccess
    http_access deny all
    acl auth proxy_auth REQUIRED
    http_access allow auth

    And something very suspicious: after adding the proxy server to the domain, I see two new entries in AD for the PC, one with the original computer name "leopoldine" and one with "leopoldine CNF:f8efa4c4-ff0e-4217-939d-f1523b43464d" ?!?

    I have tried a lot, really... but I am stuck on this problem... I actually even reinstalled all dependent programs and reconfigured them from default. The group exists and has me in it. Firefox is running through the old proxy, and I use IE for testing the new one. But I get "Access Denied" all the time. To be honest, I'm quite a beginner, so please don't be too harsh. I'm interested in improving, and I'll get whatever information we need to fix this, but I only started working 2 months ago with 1 1/2 years' training and not a single second in Linux ;)
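
    For what it's worth, one quick way to double-check which authentication schemes the proxy is actually offering is to look at the Proxy-Authenticate headers on the 407 response. A minimal sketch in Python (the proxy address below is a placeholder for my server):

      # Ask for any page through the proxy without credentials and inspect
      # the 407 challenge; squid should advertise the configured schemes
      # (e.g. Negotiate, NTLM, Basic).
      import requests

      proxies = {"http": "http://192.168.15.20:3128"}  # placeholder host:port

      resp = requests.get("http://example.com/", proxies=proxies)
      print(resp.status_code)                          # expect 407
      print(resp.headers.get("Proxy-Authenticate"))    # offered schemes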


  • DDNS Not Creating Journal (Dhcpd and Named)

    - by user130094
    * EDIT 1 *

    After monkeying with additional debug logging, I see some log entries of interest:

    27-Jul-2012 23:45:26.537 general: error: zone example.lan/IN/internal: journal rollforward failed: no more
    27-Jul-2012 23:45:26.537 general: error: zone example.lan/IN/internal: not loaded due to errors.

    ^^^ If I can remedy the above messages I think I'll be good to go ^^^

    * EDIT 2 *

    Grasping at straws, I touched a forward and a reverse zone journal file and restarted named. Boom! Works. Despite the documentation stating the files are created automatically, and despite what I have seen before... I don't know why, but that did the trick. I also re-checked perms on the dir the files live in. As certain as I was, they were correct, with named having rw.

    CentOS 6 (final)
    dhcpd 4.1.1-P1
    named BIND 9.8.2rc1-RedHat-9.8.2-0.10.rc1.el6

    Basic DHCP and DNS functionality are in place on 192.168.111.2. Clients are assigned addresses as intended and can resolve local DNS names as well as Internet names. My problem is that named's zone journal files are not created. chroot: /var/named/chroot

    I tried placing the zone files in various directories (/var/named/data, /var/named, /var/named/dynamic); no matter which dir, with named owning it and wide-open perms, I now get nowhere. Along the way I, at one point, got a permission denied error when named tried to create the journal. I resolved that issue by:

    chown --recursive named:named /var/named
    chmod --recursive 777 /var/named

    The journal was then created, and here's where things fell apart. I attempted to tame permissions to something more sane and broke it. Once changed, and having restarted named, it threw an error indicating the journal was out of sync (or something to that effect)... that didn't matter, since this is a new setup, so I deleted it, and now it is not recreated. Now, though, I see no errors in /var/log/messages, my chrooted /var/log/named.log, or the chrooted /var/log/named.debug. I increased the debug level with 'rndc trace': no love. Increased trace to 10: still nothing. SELinux is disabled...

    [root@server temp]# sestatus
    SELinux status: disabled

    dhcpd.conf:

    allow client-updates;
    ddns-update-style interim;
    subnet 192.168.111.0 netmask 255.255.255.224 {
      ...
      key dhcpudpate {
        algorithm hmac-md5;
        secret LDJMdPdEZED+/nN/AGO9ZA==;
      }
      zone example.lan. {
        primary 192.168.111.2;
        key dhcpudpate;
      }
    }

    named.conf:

    key dhcpudpate {
      algorithm hmac-md5;
      secret "LDJMdPdEZED+/nN/AGO9ZA==";
    };
    zone "example.lan" {
      type master;
      file "/var/named/dynamic/example.lan.db";
      allow-transfer { none; };
      allow-update { key dhcpudpate; };
      notify false;
      check-names ignore;
    };

    The following shows /var/log/named.log output of named starting up; no errors:
    27-Jul-2012 21:33:39.349 general: info: zone 111.168.192.in-addr.arpa/IN/internal: loaded serial 2012072601
    27-Jul-2012 21:33:39.349 general: info: zone example.lan/IN/internal: loaded serial 2012072501
    27-Jul-2012 21:33:39.350 general: info: zone example2.lan/IN/internal: loaded serial 2012072501
    27-Jul-2012 21:33:39.350 general: info: zone example3.lan/IN/internal: loaded serial 2012072601
    27-Jul-2012 21:33:39.350 general: info: zone example4.lan/IN/internal: loaded serial 2012072501
    27-Jul-2012 21:33:39.351 general: info: zone example5.lan/IN/internal: loaded serial 2012072501
    27-Jul-2012 21:33:39.351 general: info: managed-keys-zone ./IN/internal: loaded serial 0
    27-Jul-2012 21:33:39.351 general: info: zone example.lan/IN/external: loaded serial 2012072501
    27-Jul-2012 21:33:39.352 general: info: zone example1.lan/IN/external: loaded serial 2012072501
    27-Jul-2012 21:33:39.352 general: info: zone example2.lan/IN/external: loaded serial 2012072501
    27-Jul-2012 21:33:39.352 general: info: zone example3.lan/IN/external: loaded serial 2012072501
    27-Jul-2012 21:33:39.353 general: info: managed-keys-zone ./IN/external: loaded serial 0
    27-Jul-2012 21:33:39.353 general: notice: running
    27-Jul-2012 21:34:03.825 general: info: received control channel command 'trace 10'
    27-Jul-2012 21:34:03.825 general: info: debug level is now 10

    ...and /var/log/messages for a named start...

    Jul 27 23:02:04 server named[9124]: ----------------------------------------------------
    Jul 27 23:02:04 server named[9124]: BIND 9 is maintained by Internet Systems Consortium,
    Jul 27 23:02:04 server named[9124]: Inc. (ISC), a non-profit 501(c)(3) public-benefit
    Jul 27 23:02:04 server named[9124]: corporation. Support and training for BIND 9 are
    Jul 27 23:02:04 server named[9124]: available at https://www.isc.org/support
    Jul 27 23:02:04 server named[9124]: ----------------------------------------------------
    Jul 27 23:02:04 server named[9124]: adjusted limit on open files from 4096 to 1048576
    Jul 27 23:02:04 server named[9124]: found 2 CPUs, using 2 worker threads
    Jul 27 23:02:04 server named[9124]: using up to 4096 sockets
    Jul 27 23:02:04 server named[9124]: loading configuration from '/etc/named.conf'
    Jul 27 23:02:04 server named[9124]: using default UDP/IPv4 port range: [1024, 65535]
    Jul 27 23:02:04 server named[9124]: using default UDP/IPv6 port range: [1024, 65535]
    Jul 27 23:02:04 server named[9124]: listening on IPv4 interface eth0, 192.168.111.2#53
    Jul 27 23:02:04 server named[9124]: generating session key for dynamic DNS
    Jul 27 23:02:04 server named[9124]: sizing zone task pool based on 12 zones
    Jul 27 23:02:04 server named[9124]: set up managed keys zone for view internal, file 'dynamic/3bed2cb3a3acf7b6a8ef408420cc682d5520e26976d354254f528c965612054f.mkeys'
    Jul 27 23:02:04 server named[9124]: set up managed keys zone for view external, file 'dynamic/3c4623849a49a53911c4a3e48d8cead8a1858960bccdea7a1b978d73ec2f06d7.mkeys'
    Jul 27 23:02:04 server named[9124]: command channel listening on 127.0.0.1#953

    What can I do to troubleshoot this further? It almost seems as though dhcpd is not triggering the update. Maybe I should troubleshoot here and, if so, how? Many thanks.
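    A hedged diagnostic, not part of the original question: nsupdate can submit a dynamic update signed with the same TSIG key, which separates "named refuses updates" from "dhcpd never sends them". The test record name and address below are made up.

        # send a signed update by hand (key name/secret from the configs above)
        nsupdate -y hmac-md5:dhcpudpate:LDJMdPdEZED+/nN/AGO9ZA== <<'EOF'
        server 192.168.111.2
        zone example.lan.
        update add ddnstest.example.lan. 300 A 192.168.111.200
        send
        EOF

    If this succeeds (and the journal file appears), the zone and key are fine and suspicion shifts to dhcpd; if it fails, named's allow-update or file permissions are still the culprit.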

    Read the article

  • Performance in backpropagation algorithm

    - by Taban
    I've written a MATLAB program for the standard backpropagation algorithm. It is my homework and I am not allowed to use the MATLAB toolbox, so I wrote the entire code myself. This link helped me with the backpropagation algorithm. I have a data set of 40 random numbers and I initialize the weights randomly. As output, I want to see a diagram that shows the performance. I used the mse and plot functions to see the performance for 20 epochs, but the result is this: I heard that performance should go up through backpropagation, so I want to know whether there is any problem with my code or this result is normal because of local minima. This is my code:

    Hidden_node=inputdlg('Enter the number of Hidden nodes');
    a=0.5; %initialize learning rate
    hiddenn=str2num(Hidden_node{1,1});
    randn('seed',0);
    %creating data set
    s=2;
    N=10;
    m=[5 -5 5 5;-5 -5 5 -5];
    S = s*eye(2);
    [l,c] = size(m);
    x = [];
    % Creating the training set
    for i = 1:c
        x = [x mvnrnd(m(:,i)',S,N)'];
    end
    % target value
    toutput=[ones(1,N) zeros(1,N) ones(1,N) zeros(1,N)];
    for epoch=1:20; %number of epochs
        for kk=1:40; %number of patterns
            %initial weights of hidden layer
            for ii=1 : 2;
                for jj=1 :hiddenn;
                    whidden{ii,jj}=rand(1);
                end
            end
            %initialize the weights of output layer
            for ii=1 : hiddenn;
                woutput{ii,1}=rand(1);
            end
            for ii=1:hiddenn;
                x1=x(1,kk);
                x2=x(2,kk);
                w1=whidden{1,ii};
                w2=whidden{2,ii};
                activation{1,ii}=(x1(1,1)*w1(1,1))+(x2(1,1)*w2(1,1));
            end
            %calculate output of hidden nodes
            for ii=1:hiddenn;
                hidden_to_out{1,ii}=logsig(activation{1,ii});
            end
            activation_O{1,1}=0;
            for jj=1:hiddenn;
                activation_O{1,1} = activation_O{1,1}+(hidden_to_out{1,jj}*woutput{jj,1});
            end
            %calculate output
            out{1,1}=logsig(activation_O{1,1});
            out_for_plot(1,kk)= out{1,ii};
            %calculate error for output node
            delta_out{1,1}=(toutput(1,kk)-out{1,1});
            %update weight of output node
            for ii=1:hiddenn;
                woutput{ii,jj}=woutput{ii,jj}+delta_out{1,jj}*hidden_to_out{1,ii}*dlogsig(activation_O{1,jj},logsig(activation_O{1,jj}))*a;
            end
            %calculate error of hidden nodes
            for ii=1:hiddenn;
                delta_hidden{1,ii}=woutput{ii,1}*delta_out{1,1};
            end
            %update weight of hidden nodes
            for ii=1:hiddenn;
                for jj=1:2;
                    whidden{jj,ii}= whidden{jj,ii}+(delta_hidden{1,ii}*dlogsig(activation{1,ii},logsig(activation{1,ii}))*x(jj,kk)*a);
                end
            end
            a=a/(1.1); %decrease learning rate
        end
        %calculate performance
        e=toutput(1,kk)-out_for_plot(1,1);
        perf(1,epoch)=mse(e);
    end
    plot(perf);

    Thanks a lot.
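    A hedged aside (my own sketch, not the poster's code): the performance line above scores only a single pattern - e is built from toutput(1,kk) and out_for_plot(1,1) - so one way to score the whole epoch, using only variables the code already defines, would be:

        % epoch-wide error: all 40 targets against all 40 collected outputs
        e = toutput - out_for_plot;     % 1x40 vector of per-pattern errors
        perf(1,epoch) = mean(e.^2);     % MSE without the toolbox mse() function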

    Read the article

  • RackSpace Cloud Strips $_SESSION if URL Has Certain File Extensions

    - by macinjosh
    The Situation

    I am creating a video training site for a client on the RackSpace Cloud using the traditional LAMP stack (RackSpace's cloud has both Windows and LAMP stacks). The videos and other media files I'm serving on this site need to be protected, as my client charges money for access to them. There is no DRM or funny business like that; essentially we store the files outside of the web root and use PHP to authenticate users before they are able to access the files, by using mod_rewrite to run the request through PHP. So let's say the user requests a file at this URL:

    http://www.example.com/uploads/preview_image/29.jpg

    I am using mod_rewrite to rewrite that url to:

    http://www.example.com/files.php?path=%2Fuploads%2Fpreview_image%2F29.jpg

    Here is a simplified version of the files.php script:

    <?php
    // Sets up the environment and sets $logged_in.
    // This part requires $_SESSION.
    require_once('../../includes/user_config.php');

    if (!$logged_in) {
        // Redirect non-authenticated users
        header('Location: login.php');
    }

    // This user is authenticated, continue
    $content_type = "image/jpeg";

    // getAbsolutePathForRequestedResource() takes a query parameter
    // called path and uses DB lookups and some string manipulation to
    // get an absolute path. This part doesn't have any bearing on the
    // problem at hand.
    $file_path = getAbsolutePathForRequestedResource($_GET['path']);

    // At this point $file_path looks something like this:
    // "/path/to/a/place/outside/the/webroot"
    if (file_exists($file_path) && !is_dir($file_path)) {
        header("Content-Type: $content_type");
        header('Content-Length: ' . filesize($file_path));
        echo file_get_contents($file_path);
    } else {
        header('HTTP/1.0 404 Not Found');
        header('Status: 404 Not Found');
        echo '404 Not Found';
    }
    exit();
    ?>

    The Problem

    Let me start by saying this works perfectly for me. On local test machines it works like a charm. However, once deployed to the cloud it stops working. After some debugging, it turns out that if a request to the cloud has certain file extensions, like .JPG, .PNG, or .SWF (i.e., extensions of typically static media files), the request is routed to a cache system called Varnish. The end result of this routing is that by the time this whole process makes it to my PHP script, the session is not present. If I change the extension in the URL to .PHP, or if I even add a query parameter, Varnish is bypassed and the PHP script can get the session. No problem, right? I'll just add a meaningless query parameter to my requests! Here is the rub: the media files I am serving through this system are being requested through compiled SWF files that I have zero control over. They are generated by third-party software and I have no hope of adding or changing the URLs that they request. Are there any other options I have on this?

    Update: I should note that I have verified this behavior with RackSpace support and they have said there is nothing they can do about it.
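    The actual rewrite rule is not included in the question; purely as an illustration of the setup described above (the pattern and flags are my assumptions, not the poster's configuration), it might look something like this in an .htaccess file:

        # hypothetical sketch of the rewrite described in the question
        RewriteEngine On
        RewriteRule ^uploads/(.*)$ files.php?path=/uploads/$1 [QSA,L]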

    Read the article

  • Javascript style objects in Objective-C

    - by awolf
    Background: I use a ton of NSDictionary objects in my iPhone and iPad code. I'm sick of the verbose way of getting/setting keys to these state dictionaries. So, a little bit of an experiment: I just created a class I call Remap. Remap will take any arbitrary set[VariableName]:(NSObject *) obj selector and forward that message to a function that will insert obj into an internal NSMutableDictionary under the key [variableName]. Remap will also take any (zero argument) arbitrary [variableName] selector and return the NSObject mapped in the NSMutableDictionary under the key [variableName]. e.g.

    Remap * remap = [[Remap alloc] init];
    NSNumber * testNumber = [NSNumber numberWithInt:46];
    [remap setTestNumber:testNumber];
    testNumber = [remap testNumber];
    [remap setTestString:@"test string"];
    NSString * testString = [remap testString];
    NSMutableDictionary * testDict = [NSMutableDictionary dictionaryWithObject:testNumber forKey:@"testNumber"];
    [remap setTestDict:testDict];
    testDict = [remap testDict];

    ...where none of the properties testNumber, testString, or testDict are actually defined in Remap. The crazy thing? It works... My only question is how can I disable the "may not respond to" warnings for JUST accesses to Remap?

    P.S.: I'll probably end up scrapping this and going with macros, since message forwarding is quite inefficient... but aside from that, does anyone see other problems with Remap?

    Here's Remap's .m for those who are curious:

    #import "Remap.h"

    @interface Remap ()
    @property (nonatomic, retain) NSMutableDictionary * _data;
    @end

    @implementation Remap
    @synthesize _data;

    - (void) dealloc
    {
        relnil(_data);
        [super dealloc];
    }

    - (id) init
    {
        self = [super init];
        if (self != nil) {
            NSMutableDictionary * dict = [[NSMutableDictionary alloc] init];
            [self set_data:dict];
            relnil(dict);
        }
        return self;
    }

    - (void)forwardInvocation:(NSInvocation *)anInvocation
    {
        NSString * selectorName = [NSString stringWithUTF8String: sel_getName([anInvocation selector])];
        NSRange range = [selectorName rangeOfString:@"set"];
        NSInteger numArguments = [[anInvocation methodSignature] numberOfArguments];
        if (range.location == 0 && numArguments == 4) {
            //setter
            [anInvocation setSelector:@selector(setData:withKey:)];
            [anInvocation setArgument:&selectorName atIndex:3];
            [anInvocation invokeWithTarget:self];
        } else if (numArguments == 3) {
            [anInvocation setSelector:@selector(getDataWithKey:)];
            [anInvocation setArgument:&selectorName atIndex:2];
            [anInvocation invokeWithTarget:self];
        }
    }

    - (NSMethodSignature *) methodSignatureForSelector:(SEL) aSelector
    {
        NSString * selectorName = [NSString stringWithUTF8String: sel_getName(aSelector)];
        NSMethodSignature * sig = [super methodSignatureForSelector:aSelector];
        if (sig == nil) {
            NSRange range = [selectorName rangeOfString:@"set"];
            if (range.location == 0) {
                sig = [self methodSignatureForSelector:@selector(setData:withKey:)];
            } else {
                sig = [self methodSignatureForSelector:@selector(getDataWithKey:)];
            }
        }
        return sig;
    }

    - (NSObject *) getDataWithKey: (NSString *) key
    {
        NSObject * returnValue = [[self _data] objectForKey:key];
        return returnValue;
    }

    - (void) setData: (NSObject *) data withKey:(NSString *)key
    {
        if (key && [key length] >= 5 && data) {
            NSRange range;
            range.length = 1;
            range.location = 3;
            NSString * firstChar = [key substringWithRange:range];
            firstChar = [firstChar lowercaseString];
            range.length = [key length] - 5; // the 4 we have processed plus the trailing :
            range.location = 4;
            NSString * adjustedKey = [NSString stringWithFormat:@"%@%@", firstChar, [key substringWithRange:range]];
            [[self _data] setObject:data forKey:adjustedKey];
        } else {
            //assert?
        }
    }
    @end
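    One hedged way to answer the warning question (my suggestion, not part of the post): declare the dynamic selectors in a category on Remap with no matching @implementation. The compiler then sees the declarations and stops warning for Remap call sites only, while forwardInvocation: still handles the calls at runtime. The selector names below are the ones from the example above.

    // hypothetical Remap+DynamicKeys.h -- declarations only, no @implementation
    @interface Remap (DynamicKeys)
    - (void) setTestNumber:(NSNumber *)number;
    - (NSNumber *) testNumber;
    - (void) setTestString:(NSString *)string;
    - (NSString *) testString;
    @end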

    Read the article

  • Haskell Monad bind currying

    - by Chime
    I am currently in need of a bit of brain training and I found this article on Haskell and Monads. I'm having trouble with exercise 7, re. randomised function bind. To make the problem even simpler to experiment with, I replaced the StdGen type with an unspecified type. So instead of...

    bind :: (a -> StdGen -> (b,StdGen)) -> (StdGen -> (a,StdGen)) -> (StdGen -> (b,StdGen))

    I used...

    bind :: (a -> c -> (b,c)) -> (c -> (a,c)) -> (c -> (b,c))

    ...and for the actual function implementation (just straight from the exercise):

    bind f x seed = let (x',seed') = x seed in f x' seed'

    ...and also two randomised functions to trial with:

    rndf1 :: (Num a, Num b) => a -> b -> (a,b)
    rndf1 a s = (a+1,s+1)

    rndf2 :: (Num a, Num b) => a -> b -> (a,b)
    rndf2 a s = (a+8,s+2)

    So with this in the Haskell interactive environment (ghci), I get...

    :t bind rndf2
    bind rndf2 :: (Num a, Num c) => (c -> (a, c)) -> c -> (a, c)

    This matches bind curried with rndf2 as the first parameter. But the thing I don't understand is how...

    :t bind rndf2 . rndf1

    ...suddenly gives

    bind rndf2 . rndf1 :: (Num a, Num c) => a -> c -> (a, c)

    This is the correct type of the composition that we are trying to produce, because bind rndf2 . rndf1 is a function that takes the same parameter type(s) as rndf1, AND takes the return from rndf1 and pipes it as an input of rndf2, to return the same type as rndf2. rndf1 can take 2 parameters a -> c, and rndf2 returns (a, c), so it matches that a composition of these functions should have type:

    bind rndf2 . rndf1 :: (Num a, Num c) => a -> c -> (a, c)

    This does not match the naive type that I initially came up with for bind:

    bind f :: (a -> b -> (c, d)) -> (c, d) -> (e, f)

    Here bind mythically takes a function that takes two parameters, and produces a function that takes a tuple, in order that the output from rndf1 can be fed into rndf2. So my questions are: why does the bind function need to be coded as it is, and why does the bind function not have the naive type?
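    For what it's worth, here is a small annotation of my own (derived only from the definitions quoted above) showing how the composed type falls out:

    -- bind    :: (a -> c -> (b, c)) -> (c -> (a, c)) -> (c -> (b, c))
    -- rndf1   :: (Num a, Num c) => a -> c -> (a, c)
    -- rndf1 a :: (Num a, Num c) => c -> (a, c)    -- one argument applied
    -- so (bind rndf2 . rndf1) a = bind rndf2 (rndf1 a) :: c -> (a, c)
    composed :: (Num a, Num c) => a -> c -> (a, c)
    composed = bind rndf2 . rndf1

    The composition never has to handle a tuple: rndf1 consumes only the first argument, and the remaining c -> (a, c) function is exactly the shape bind rndf2 expects.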

    Read the article

  • Oracle Fusion Applications: Changing the Game

    - by kellsey.ruppel(at)oracle.com
    Originally posted in the Oracle Profit Magazine, November 2010 Edition.

    When the order processing system red-flags a customer's credit status, the IT department doesn't get the customer's call. When a supplier misses a delivery date for a key automotive assembly, it's not the CIO who has to answer for the error. Knowledge workers (known in IT circles as "users") are on the front lines when an exception occurs in an established business process. They're also the ones who study sales trends to decide when to open a new store in an up-and-coming neighborhood, which products are most profitable, how employee skill sets are evolving, and which suppliers are most efficient. In short, knowledge workers are masters of business as unusual.

    Traditional enterprise resource planning (ERP) systems and other familiar enterprise applications excel at automating, managing, and executing standard business processes. These programs shine when everything goes as planned. Life gets even trickier when a traditional application needs to be extended with a new service or an extra step is added to a business process when new products are brought to market, divisions are merged, or companies are acquired. Monolithic applications often need the IT department to step in and make the necessary adjustments--incurring additional costs and delays. Until now.

    When Oracle unveiled the much-anticipated family of Oracle Fusion Applications at Oracle OpenWorld in September 2010, knowledge workers in particular had a lot to cheer about. Business users will soon have ready access to analytical information and collaboration tools in the context of what they are working on, so they can make better decisions when problems or opportunities arise. Additionally, the Oracle Fusion Applications platform will make it easy for business users to tweak processes, create new capabilities, and find information, often without the need for IT department assistance and while still following company guidelines. And IT leaders will be happy to hear about new deployment options, guided implementation and setup tools, and cost-saving management capabilities. Just as important, the underlying technologies in Oracle Fusion Applications will allow organizations to choose among their existing investments and next-generation enterprise applications so they can introduce innovations at a pace that makes the most business and financial sense.

    "Oracle Fusion Applications are architected so you don't have to do rip and replace," says Jim Hayes, managing director of the consulting firm Accenture. "That's very important for creating a business case that will get through the steering committee and be approved by the board. It shows you can drive value and make a difference in the near term." For these and other reasons, analysts and early adopters are calling Oracle Fusion Applications a game changer for enterprise customers. The differences become apparent in three key areas: the way we innovate, work, and adopt technology.

    Game Changer #1: New Standard for Innovation

    Change is a constant challenge for most businesses, whether the catalysts are market dynamics, new competition, or the ever-expanding regulatory environment. And, in an ongoing effort to differentiate, business leaders are constantly looking for new ways to do business, serve constituents, and bring new products and services to market. In addition, companies face significant costs to keep their applications up-to-date.
    For example, when a company adds new suppliers to a procurement system, the IT shop typically has to invest time, effort, and even consulting fees for custom integrations that allow various ERP systems to communicate with each other. Oracle Fusion Applications were built on Web services and a modular SOA foundation to ease customizations and integration activities among all applications--whether from Oracle or another vendor. Interfaces and updates written in ubiquitous Java, rather than a proprietary coding language, allow organizations to tap into existing in-house technical skills rather than seek expensive outside specialists. And with SOA, organizations can extend a feature set or integrate with other SOA environments by combining Web services such as "look up customer" into a new business process managed by the BPEL orchestration engine.

    Flexibility like this has long-term implications. "Because users capture these changes at a higher metadata layer, not in the application's code, changes and additions are protected even as new versions of Oracle Fusion Applications are released," says Steve Miranda, senior vice president of applications development at Oracle. "This is a much more sustainable approach because you don't incur costly customizations that prevent upgrades and other innovations." And changes are easier to make: if one change is made in the metadata, that change is automatically reflected throughout the application interface, business intelligence, business process, and business logic.

    Game Changer #2: New Standard for Work

    Boosting productivity comes down to doing the basics right: running business processes more efficiently and managing exceptions more effectively, so users can accomplish more in the course of a day or spend more quality time with the most profitable customers. The fastest way to improve process efficiency is to reduce the number of steps it takes to execute common tasks, such as ordering office equipment from an internal procurement system. Oracle Fusion Applications will deliver a complete role-based user experience with business intelligence and collaboration capabilities provided in the context of the work at hand.

    "We created every Oracle Fusion Applications screen by asking 'What does the user need to know?' 'What does he or she need to do?' and 'Who do they need to work with to get the job done?'" Miranda explains. So when the sales department heads need new laptops, the self-service procurement screen will not only display a list of approved vendors and configurations, but also a running list of reviews by coworkers who recently purchased the various models. Embedded intelligence may also display prevailing delivery lead times based on actual order histories, not the generic shipping dates vendors may quote. The pervasive business intelligence serves many other business activities across all areas of the enterprise. For example, a manager considering whether to promote a direct report can see the person's employee profile, with a salary history, appraisal summaries, and a rundown of skills and training. This approach to business intelligence also has implications for supply chain management. "One of the challenges at Ingersoll Rand is lack of visibility in our supply chain," says Mike Macrie, global director of enterprise applications for global industrial firm Ingersoll Rand.
"Oracle Fusion Applications are going to provide the embedded intelligence to give us that visibility and give us the ability to analyze those orders at any point in our supply chain." Oracle Fusion Applications will also create a "role-based user experience" that displays a work list of events that need attention, based on user job function. Role awareness guides users with daily lists of action items and exceptions. So a credit manager may see seven invoices with discounts that are about to expire or 12 suppliers that have been put on hold because credit memos are awaiting approval. Individualization extends to the search capabilities of Oracle Fusion Applications. The platform uses Web-style search screens powered by an Oracle enterprise search engine, with a security framework that filters search results so individuals will only see the internal information they're authorized to access. A further aid to productivity is Oracle Fusion Applications' integration with Web 2.0 collaboration and social networking resources for business environments. Hover-over text will reveal relevant contact information whenever the name of a person appears in an Oracle Fusion Application. Users can connect via an online chat, phone call, or instant message without leaving the main application, reducing the time required for an accounts payable staffer to resolve a mismatch between an invoiced charge and the service record, for example. Addresses of suppliers, customers, or partners will also initiate hover-over text to show contact details and Web-based maps. Finally, Oracle Fusion Applications will promote a new way of working with purpose-driven communities that can bring new efficiencies to everything from cultivating sales leads to managing new projects. As soon as a lead or project materializes, the applications will automatically gather relevant participants into an online community that shares member contact information, schedules, discussion forums, and Wiki pages. "Oracle Fusion Applications will allow us to take it to the next level with embedded Web 2.0 tools and the embedded analytics," says Steve Printz, CIO and vice president, supply chain management, at window-and-door manufacturer Pella. "[This] allows those employees today who are processing transactions to really contribute to the success of the company and become decision-makers." Game Changer #3: New Standard for Technology AdoptionAs IT becomes a dominant component of how businesses run and compete, organizations need to lower the cost of implementing applications and introducing new application features. In the past, rolling out new code often required creating a test bed system, moving beta code to a separate system for user feedback, and--once all the revisions were made--moving version one of the software onto production systems, where business users could finally get the needed new features. Oracle Fusion Applications will use a dedicated setup manager application to streamline this process. First, the setup manager will help scope out the project, querying users about their requirements. "From those questions and answers we determine the steps and the order of those steps that will enable that task," Miranda says. Next, system utilities will assign tasks to owners, track completion status, and monitor the overall status of a programming effort. Oracle Fusion Applications can then recommend Web services that allow users to migrate setup choices and steps across all the various deployments of the application. 
    Those setup capabilities automate the migration from test systems to production systems, as well as between different business units that may be using the same application. "The self-service ability of the setup manager helps business users change setups with very little intervention from the IT team," says Ravi Kumar, vice president at IT services company Infosys. "That to me is a big difference from how we've viewed enterprise applications before."

    For additional flexibility, organizations will be able to adopt Oracle Fusion Applications modules in either of two modes: a single-instance alternative uses one database for all Oracle Fusion Applications, while a "pillar mode" creates separate databases to underpin each application. This means IT departments running any one of Oracle's applications or even third-party applications can plug Oracle Fusion Applications modules into their environment and see additional business value created on top of their existing systems. And Oracle Fusion Applications offer a hybrid approach to deployment. The applications are all software-as-a-service-ready, so customers can choose on-premises, public or private cloud, or a combination of these to suit their business needs. It's that combination of flexibility and a roadmap for the future that may be the biggest game changer of all. "The Oracle Fusion Applications architecture allows us to migrate our company at a pace that's consistent with our business strategy, whereas before we might have had to do it with a massive upgrade," says Macrie of Ingersoll Rand. "We're looking forward to that architecture to really give us more flexibility in how we migrate over time."

    For More Information
    User Input Key to the Success of Oracle Fusion Applications
    Transforming Coexistence into Strategic Value
    Under the Hood
    Oracle Fusion Applications
    Oracle Service-Oriented Architecture

    Read the article

  • Revisiting ANTS Performance Profiler 7.4

    - by James Michael Hare
    Last year, I did a small review on the ANTS Performance Profiler 6.3. Now that it’s a year later and a major version number higher, I thought I’d revisit the review and revise my last post. This post will take the same examples as the original post and update them to show what’s new in version 7.4 of the profiler.

    Background

    A performance profiler’s main job is to keep track of how much time is typically spent in each unit of code. This helps when we have a program that is not running at the performance we expect, and we want to know where the program is experiencing issues. There are many profilers out there of varying capabilities. Red Gate’s typically seem to be very easy to “jump in” to and get started with, with very little training required. So let’s dig into the Performance Profiler.

    I’ve constructed a very crude program with some obvious inefficiencies. It’s a simple program that generates random order numbers (or really could be any unique identifier), adds each to a list, sorts the list, then finds the max and min number in the list. Ignore the fact it’s very contrived and obviously inefficient; we just want to use it as an example to show off the tool:

    // our test program
    public static class Program
    {
        // the number of iterations to perform
        private static int _iterations = 1000000;

        // The main method that controls it all
        public static void Main()
        {
            var list = new List<string>();

            for (int i = 0; i < _iterations; i++)
            {
                var x = GetNextId();

                AddToList(list, x);

                var highLow = GetHighLow(list);

                if ((i % 1000) == 0)
                {
                    Console.WriteLine("{0} - High: {1}, Low: {2}", i, highLow.Item1, highLow.Item2);
                    Console.Out.Flush();
                }
            }
        }

        // gets the next order id to process (random for us)
        public static string GetNextId()
        {
            var random = new Random();
            var num = random.Next(1000000, 9999999);
            return num.ToString();
        }

        // add it to our list - very inefficiently!
        public static void AddToList(List<string> list, string item)
        {
            list.Add(item);
            list.Sort();
        }

        // get high and low of order id range - very inefficiently!
        public static Tuple<int,int> GetHighLow(List<string> list)
        {
            return Tuple.Create(list.Max(s => Convert.ToInt32(s)), list.Min(s => Convert.ToInt32(s)));
        }
    }

    So let’s run it through the profiler and see what happens!

    Visual Studio Integration

    First, let’s look at how the ANTS profilers integrate with Visual Studio’s menu system. Once you install the ANTS profilers, you will get an ANTS menu item with several options: Notice that you can either Profile Performance or Launch ANTS Performance Profiler. These sound similar but achieve two slightly different actions:

    Profile Performance: this immediately launches the profiler with all defaults selected to profile the active project in Visual Studio.

    Launch ANTS Performance Profiler: this launches the profiler much the same way as starting it from the Start Menu. The profiler will pre-populate the application and path information, but allow you to change the settings before beginning the profile run.

    So really, the main difference is that Profile Performance immediately begins profiling with the default selections, where Launch ANTS Performance Profiler allows you to change the defaults and attach to an already-running application.

    Let’s Fire it Up!
    So when you fire up ANTS either via the Start Menu or the Launch ANTS Performance Profiler menu in Visual Studio, you are presented with a very simple dialog to get you started: Notice you can choose from many different options for application type. You can profile executables, services, web applications, or just attach to a running process. In fact, in version 7.4 we see two new options added:

    ASP.NET Web Application (IIS Express)
    SharePoint web application (IIS)

    So this gives us an additional way to profile ASP.NET applications and the ability to profile SharePoint applications as well. You can also choose your level of detail in the Profiling Mode drop down. If you choose Line-Level and method-level timings detail, you will get a lot more detail on the method durations, but this will also slow down profiling somewhat. If you really need the profiler to be as unintrusive as possible, you can change it to Sample method-level timings. This is performing very light profiling, where basically the profiler collects timings of a method by examining the call-stack at given intervals. Which method you choose depends a lot on how much detail you need to find the issue and how sensitive your program issues are to timing. So for our example, let’s just go with the line and method timing detail. So, we check that all the options are correct (if you launch from VS2010, the executable and path are filled in already), and fire it up by clicking the [Start Profiling] button.

    Profiling the Application

    Once you start profiling the application, you will see a real-time graph of CPU usage that will indicate how much your application is using the CPU(s) on your system. During this time, you can select segments of the graph and bookmark them, giving them mnemonic names. This can be useful if you want to compare performance in one part of the run to another part of the run. Notice that once you select a block, it will give you the call tree breakdown for that selection only, and the relative performance of those calls. Once you feel you have collected enough information, you can click [Stop Profiling] to stop the application run and information collection and begin a more thorough analysis.

    Analyzing Method Timings

    So now that we’ve halted the run, we can look around the GUI and see what we can see. By default, the times are shown in terms of percentage of time of the total run of the application, though you can change it in the View menu item to milliseconds, ticks, or seconds as well. This won’t affect the percentages of methods; it only affects the units in which the times are shown. Notice also that the major hotspot seems to be in a method without source. ANTS Profiler will filter these out by default, but you can right-click on the line and remove the filter to see more detail. This proves especially handy when a bottleneck is due to a method in the BCL. So now that we’ve removed the filter, we see a bit more detail: In addition, ANTS Performance Profiler gives you the ability to decompile the methods without source so that you can dive even deeper, though typically this isn’t necessary for our purposes. When looking at timings, there are generally two types of timings for each method call:

    Time: This is the time spent ONLY in this method, not including calls this method makes to other methods.

    Time With Children: This is the total of time spent in both this method AND including calls this method makes to other methods.
    In other words, the Time tells you how much work is being done exclusively in this method, and the Time With Children tells you how much work is being done inclusively in this method and everything it calls. You can also choose to display the methods in a tree or in a grid. The tree view is the default and it shows the method calls arranged in terms of the tree representing all method calls and the parent method that called them, etc. This is useful because when you find a hot-spot method, you can see who is calling it to determine if the problem is the method itself, or if it is being called too many times. The grid view represents each method only once with its totals and is useful for quickly seeing which method is the trouble spot. In addition, you can choose to display Methods with source, which are generally the methods you wrote (as opposed to native or BCL code), or Any Method, which shows not only your methods, but also native calls, JIT overhead, synchronization waits, etc. So these are just two ways of viewing the same data, and you’re free to choose the organization that best suits what information you are after.

    Analyzing Method Source

    If we look at the timings above, we see that our AddToList() method (and in particular, its call to the List<T>.Sort() method in the BCL) is the hot-spot in this analysis. If ANTS sees a method that is consuming the most time, it will flag it as a hot-spot to help call out potential areas of concern. This doesn’t mean the other statistics aren’t meaningful, but that the hot-spot is most likely going to be your biggest bang-for-the-buck to concentrate on. So let’s select the AddToList() method, and see what it shows in the source window below: Notice the source breakout in the bottom pane when you select a method (from either tree or grid view). This shows you the timings in this method per line of code. This gives you a major indicator of where the trouble-spot in this method is. So in this case, we see that performing a Sort() on the List<T> after every Add() is killing our performance! Of course, this was a very contrived, duh moment, but you’d be surprised how many performance issues become duh moments. Note that this one line is taking up 86% of the execution time of this application! If we eliminate this bottleneck, we should see a drastic improvement in the performance. So to fix this, if we still wanted to maintain the List<T> we’d have many options, including: delaying Sort() until after all Add() calls, using a SortedSet, SortedList, or SortedDictionary depending on which is most appropriate, or forgoing the sorting altogether and using a Dictionary.

    Rinse, Repeat!

    So let’s just change all instances of List<string> to SortedSet<string> and run this again through the profiler: Now we see the AddToList() method is no longer our hot-spot, but now the Max() and Min() calls are! This is good because we’ve eliminated one hot-spot and now we can try to correct this one as well. As before, we can then optimize this part of the code (possibly by taking advantage of the fact the list is now sorted and returning the first and last elements). We can then rinse and repeat this process until we have eliminated as many bottlenecks as possible.

    Calls by Web Request

    Another feature that was added recently is the ability to view .NET methods grouped by the HTTP requests that caused them to run. This can be helpful in determining which pages, web services, etc. are causing hot spots in your web applications.
    Summary

    If you like the other ANTS tools, you’ll like the ANTS Performance Profiler as well. It is extremely easy to use, with very little product knowledge required to get up and running. There are profilers built into the higher product lines of Visual Studio, of course, which are also powerful and easy to use. But for quickly jumping in and finding hot spots rapidly, Red Gate’s Performance Profiler 7.4 is an excellent choice.

    Technorati Tags: Influencers, ANTS, Performance Profiler, Profiler
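    The final optimization is only described in prose above; a hedged sketch of what it might look like (my assumption, not the article’s code — and valid only because every order id is a fixed-width 7-digit string, so lexical order matches numeric order):

    // replaces GetHighLow() once the program holds a SortedSet<string>
    public static Tuple<int, int> GetHighLow(SortedSet<string> set)
    {
        // Min/Max fall directly out of the sorted structure, no LINQ scan
        return Tuple.Create(Convert.ToInt32(set.Max), Convert.ToInt32(set.Min));
    }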

    Read the article

  • CodePlex Daily Summary for Thursday, April 29, 2010

    CodePlex Daily Summary for Thursday, April 29, 2010

    New Projects
    Design Technology Laboratory: DTL project is prominent.
    Em-Lib: Electronic Music Library is a program developed to serve as an electronic library; the program can simulate musical instruments, along with relate...
    Extensions For ASP.NET: Extensions For ASP.NET
    FIM 2010 PowerShell Cmdlets: These PowerShell cmdlets let you manage objects in the Forefront Identity Manager 2010 web service easily and concisely. The five cmdlets let you e...
    Grunty OS: Grunty OS is a free open source OS made by GruntXproductions
    IrishDevelopers.net Test: IrishDevelopers.net Test
    Jobping Url Shortener: MVC 2 project that shortens urls. Fully functioning 'bare bones' url shortening website. Just 1 page to create and display the short url. 'API' ...
    Made In Hell: Small console game. Just a training in code writing.
    manx: manx is a system for managing catalogs of technical documents. The system was created to handle the needs of the vintage computing community with ...
    ntn: playing around with new features
    Open Bitrac: Bitrac is an ASP.NET Blog System
    OpenNETCF.WindowsMobile.Ink: The Mobile Ink Library is a .NET component that provides support for the new Windows Mobile 6 WISP Lite API in your .NET Compact Framework applicat...
    Proxy Switch: Proxy Switch is a program which allows the user to easily switch between a configured proxy server and a home network. The program is developed in ...
    Samcrypt: .
    Seoftware Evoluion Repository Mining: Project for mining data about development effort from svn repository.
    SharePoint 2010 Bulk Document Importer: Need to import/migrate files and folders from servers and network shares into SharePoint 2010 document libraries? This free commandline application...
    SL2 - A Scala interpreter for L2: A simple L2 Interpreter. Just a project for a class.
    sMAPedit: -
    SocialCare: SocialCare - Team EUSOCIAL
    SqlDiffFramework-A Visual Differencing Engine for Dissimilar Data Sources: Unique in database comparison, SDF lets you compare live data--or saved snapshots--from any combination of SQL Server, Oracle, MySql, or ODBC data ...
    Yasminuroban: Yasminuroban is an open source "Sokoban" (aka "Warehouse Keeper" or "Boxes") alike game totally written in DHTML (JavaScript, CSS and HTML) that us...
    Хрень для ао: Test project to implement Bot.

    New Releases
    AntySpam Microsoft Outlook Plugin: AntySpam Microsoft Outlook 2010 Plugin v.2.0.1: this release is only tested on MS Outlook 2010 version 14.0.4536.1000 (32-bit and 64-bit), part of Microsoft Office Professional Plus 2010. Source pro...
    AppFabric Caching Admin Tool: AppFabric Caching Admin Tool 1.2: System Requirements: .NET 4.0 RC, AppFabric Caching Beta2 Refresh. Tested on: Win 7 (64x). Note: Must run as Administrator !!!
    AutoSPInstaller: Powershell script to pre-download prerequisites: This script will download all files required by Prerequisiteinstaller.exe in advance, in order to support an 'offline' installation of SharePoint 2...
    BeanProxy: BeanProxy 2.5: BeanProxy is a C# (.NET 3.5) library housing classes that facilitate unit testing. Any non-static, public interface/class/or abstract class can be...
    Bluetooth Radar: Verison 2.2.1: Please follow this link for the latest release - http://bluetoothradar.codeplex.com/releases/view/44232
    CSS 360 Planetary Calendar: LCA: Upload your LCA ppt pieces here. Then someone can merge them together.
    DeskNotes: DeskNotes 2.1.2: Version in development which features the following improvements over DeskNotes 2.1.1: Each DeskNote can be made switchable between other windows on Wi...
    DotNetNuke Russian Language packs: Core Russian Language Pack for DNN 05.04.01: Core Russian Resource Language Pack for DNN 05.04.01 (Russian language pack for DNN 05.04.01)
    DotNetNuke® Community Edition: 05.04.01: Major Highlights: Fixed issue with assembly version type. Fixed issue with multiple cancel buttons on Add User screen. Fixed issue with ve...
    Event Scavenger: Version 3.2: Tweaked the import foreign logs functionality. Set version to 3.2 for Admin tool, service and database. Viewer unaffected.
    Event Scavenger: Version 3.2 stable: Updated viewer version to 3.2 for consistency. Minor changes for the rest.
    Event Scavenger: Version 3.2.2: Added setting to enable/disable creation of historic data.
    FIM 2010 PowerShell Cmdlets: Initial Release of FIM 2010 PowerShell Cmdlets: Initial release of FIM 2010 PowerShell cmdlets based on the open source FIM web service client. Install by running installutil on Quest.FIMPowerhellSna...
    Fix.ly: Fix.ly 0.2: Fixed a bug where non-verbose mode would not work
    Jobping Url Shortener: Deployment Code v0.1: Contains only the files necessary to run the Url Shortener website. Just provide a valid connection string in the web.config
    Metabolite Enterprise Libraries for EPiServer CMS using Page Type Builder: Metabolite Enterprise Libraries 1.1: This is the release of the Metabolite Enterprise Libraries 1.1 for use with EPiServer 5 R2 SP2 and Page Type Builder 1.1. If you want to get the s...
    Metabolite Enterprise Libraries for EPiServer CMS using Page Type Builder: Metabolite Enterprise Libraries 1.1 (source): This is the source code release of the Metabolite Enterprise Libraries 1.1 for use with EPiServer 5 R2 SP2 and Page Type Builder 1.1.
    MVC Music Store: MVC Music Store v0.8: This is the initial release of the MVC Music Store Tutorial. Download contains source and a tutorial document in PDF format.
    MyWSAT - ASP.NET Membership Administration Tool: MyWSAT v3.5.2: MyWSAT 3.5.2 Update Notes - April 28th 2010: 1.) Fixed theme engine for administration area. Now I can create theme packages to make them available ...
    N2 CMS: 2.0 rc2: N2 is a lightweight CMS framework for ASP.NET. It helps you build great web sites that anyone can update. Major Changes (1.5 -> 2.0 release candid...
    NodeXL: Network Overview, Discovery and Exploration for Excel: NodeXL Excel 2007 Template, version 1.0.1.122: The NodeXL Excel 2007 template displays a network graph using edge and vertex lists stored in an Excel 2007 workbook. What's New: This version signi...
    nRoute Framework: nRoute.Toolkit Release Version 0.4.5: Release Targets, Silverlight and WPF: - Targets Silverlight 4 for Web - Targets WPF in .NET 4 for Desktop. nRoute Dependencies: - System.Windows...
    NSIS Autorun: NSIS Autorun 0.1.4: NSIS Autorun 0.1.4. This release includes source code, application binary, and example materials.
    OpenNETCF.WindowsMobile.Ink: Initial Release: This is the source as originally hosted in the OpenNETCF SVN server.
    Resharper Settings Manager: RSM v2.1: Warning! The 'Load Settings File...' menu item won't show up in VS 2008. Will release an update soon. Changes: Reworked plug-in options UI: Split RSM...
    RoTwee: RoTwee (11.1.0.0): Rotation is improved from 11.0.0.3.
    SharePoint 2010 Bulk Document Importer: SharePoint 2010 Bulk Document Importer 1.0.0.0: Features in 1.0.0.0: Command line interface; Import folders and files with all subfolders; Can be run from outside the SharePoint Server. Every a...
    sMAPedit: sMAPedit v0.6 (Base): + base version containing the application & html resources
    SocialCare: SocialCare_src: Source code of the application and database
    TFS Code Comment Checking Policy (CCCP): CCCP 4.0 for VS2010: This release installs side-by-side with CCCP 3.0 for VS2008 SP1.
    VCC: Latest build, v2.1.30428.0: Automatic drop of latest build
    VCC: Latest build, v2.1.30428.1: Automatic drop of latest build
    VCC: Latest build, v2.1.30428.2: Automatic drop of latest build
    Хрень для ао: First release: Trying out what this is.

    Most Popular Projects
    Rawr
    WBFS Manager
    AJAX Control Toolkit
    patterns & practices – Enterprise Library
    Silverlight Toolkit
    Microsoft SQL Server Product Samples: Database
    Windows Presentation Foundation (WPF)
    ASP.NET
    iTuner - The iTunes Companion
    Microsoft SQL Server Community & Samples

    Most Active Projects
    patterns & practices – Enterprise Library
    Rawr
    HydroServer - CUAHSI Hydrologic Information System Server
    Ionics Isapi Rewrite Filter
    GMap.NET - Great Maps for Windows Forms & Presentation
    NB_Store - Free DotNetNuke Ecommerce Catalog Module
    Particle Plot Pivot
    SqlDiffFramework-A Visual Differencing Engine for Dissimilar Data Sources
    N2 CMS
    BlogEngine.NET

    Read the article

  • TOTD #166: Using NoSQL database in your Java EE 6 Applications on GlassFish - MongoDB for now!

    - by arungupta
    The Java EE 6 platform includes the Java Persistence API to work with RDBMS. The JPA specification defines a comprehensive API that includes, but is not restricted to, how a database table can be mapped to a POJO and vice versa, and provides mechanisms for how a PersistenceContext can be injected in a @Stateless bean and then be used for performing different operations on the database table and writing typesafe queries. There are several well known advantages of RDBMS, but the NoSQL movement has gained traction over the past couple of years. The NoSQL databases are not intended to be a replacement for the mainstream RDBMS. As Philosophy of NoSQL explains, NoSQL database was designed for casual use where all the features typically provided by an RDBMS are not required. The name "NoSQL" is more of a category of databases that is more known for what it is not rather than what it is. The basic principles of NoSQL databases are:

    No need to have a pre-defined schema, and that makes them schema-less databases. Addition of new properties to existing objects is easy and does not require ALTER TABLE. The unstructured data gives flexibility to change the format of data any time without downtime or reduced service levels. Also there are no joins happening on the server, because there is no structure and thus no relation between them.

    Scalability and performance is more important than the entire set of functionality typically provided by an RDBMS. This set of databases provides eventual consistency and/or transactions restricted to single items, with more focus on CRUD.

    No restriction to SQL to access the information stored in the backing database.

    Designed to scale-out (horizontal) instead of scale-up (vertical). This is important knowing that databases, and everything else as well, are moving into the cloud. RDBMS can scale-out using sharding, but that requires complex management and is not for the faint of heart.

    Unlike RDBMS, which require a separate caching tier, most of the NoSQL databases come with integrated caching.

    Designed for less management; simpler data models lead to lower administration as well.

    There are primarily three types of NoSQL databases: Key-Value stores (e.g. Cassandra and Riak), Document databases (MongoDB or CouchDB), and Graph databases (Neo4J).

    You may think NoSQL is a panacea, but as I mentioned above they are not meant to replace the mainstream databases, and here is why:

    RDBMS have been around for many years, are very stable, and are functionally rich. This is something CIOs and CTOs can bet their money on without much worry. There is a reason 98% of Fortune 100 companies run Oracle :-) NoSQL is cutting edge and brings excitement to developers, but enterprises are cautious about them.

    Commercial databases like Oracle are well supported by the backing enterprises in terms of providing support resources on a global scale. There is a full ecosystem built around these commercial databases providing training, performance tuning, architecture guidance, and everything else. NoSQL is fairly new and typically backed by a single company not able to meet the scale of these big enterprises.

    NoSQL databases are good for CRUDing operations, but business intelligence is extremely important for enterprises to stay competitive. RDBMS provide extensive tooling to generate this data, but that was not the original intention of NoSQL databases and is lacking in that area. Generating any meaningful information other than CRUDing requires extensive programming.
    Not suited for complex transactions such as banking systems or other highly transactional applications requiring 2-phase commit. SQL cannot be used with NoSQL databases, and writing even simple queries can be involved.

    Enough talking, let's take a look at some code. This blog has published multiple posts on how to access an RDBMS using JPA in a Java EE 6 application. This Tip Of The Day (TOTD) will show how you can use MongoDB (a document-oriented database) with a typical 3-tier Java EE 6 application. Let's get started!

    The complete source code of this project can be downloaded here. Download MongoDB for your platform from here (1.8.2 as of this writing) and start the server as:

    arun@ArunUbuntu:~/tools/mongodb-linux-x86_64-1.8.2/bin$ ./mongod
    ./mongod --help for help and startup options
    Sun Jun 26 20:41:11 [initandlisten] MongoDB starting : pid=11210 port=27017 dbpath=/data/db/ 64-bit
    Sun Jun 26 20:41:11 [initandlisten] db version v1.8.2, pdfile version 4.5
    Sun Jun 26 20:41:11 [initandlisten] git version: 433bbaa14aaba6860da15bd4de8edf600f56501b
    Sun Jun 26 20:41:11 [initandlisten] build sys info: Linux bs-linux64.10gen.cc 2.6.21.7-2.ec2.v1.2.fc8xen #1 SMP Fri Nov 20 17:48:28 EST 2009 x86_64 BOOST_LIB_VERSION=1_41
    Sun Jun 26 20:41:11 [initandlisten] waiting for connections on port 27017
    Sun Jun 26 20:41:11 [websvr] web admin interface listening on port 28017

    The default directory for the database is /data/db and needs to be created as:

    sudo mkdir -p /data/db/
    sudo chown `id -u` /data/db

    You can specify a different directory using the "--dbpath" option. Refer to Quickstart for your specific platform. Using NetBeans, create a Java EE 6 project and make sure to enable CDI and add the JavaServer Faces framework. Download the MongoDB Java Driver (2.6.3 as of this writing) and add it to the project library by selecting "Properties", "Libraries", "Add Library...", creating a new library by specifying the location of the JAR file, and adding the library to the created project. Edit the generated "index.xhtml" such that it looks like:

    <h1>Add a new movie</h1>
    <h:form>
      Name: <h:inputText value="#{movie.name}" size="20"/><br/>
      Year: <h:inputText value="#{movie.year}" size="6"/><br/>
      Language: <h:inputText value="#{movie.language}" size="20"/><br/>
      <h:commandButton actionListener="#{movieSessionBean.createMovie}"
                       action="show" title="Add" value="submit"/>
    </h:form>

    This page has a simple HTML form with three text boxes and a submit button. The text boxes take the name, year, and language of a movie, and the submit button invokes the "createMovie" method of "movieSessionBean" and then renders "show.xhtml". Create "show.xhtml" ("New" -> "Other..." -> "Other" -> "XHTML File") such that it looks like:

    <head>
      <title><h1>List of movies</h1></title>
    </head>
    <body>
      <h:form>
        <h:dataTable value="#{movieSessionBean.movies}" var="m">
          <h:column><f:facet name="header">Name</f:facet>#{m.name}</h:column>
          <h:column><f:facet name="header">Year</f:facet>#{m.year}</h:column>
          <h:column><f:facet name="header">Language</f:facet>#{m.language}</h:column>
        </h:dataTable>
      </h:form>

    This page shows the name, year, and language of all movies stored in the database so far. The list of movies is returned by the "movieSessionBean.movies" property.
Now create the "Movie" class such that it looks like: import com.mongodb.BasicDBObject;import com.mongodb.BasicDBObject;import com.mongodb.DBObject;import javax.enterprise.inject.Model;import javax.validation.constraints.Size;/** * @author arun */@Modelpublic class Movie { @Size(min=1, max=20) private String name; @Size(min=1, max=20) private String language; private int year; // getters and setters for "name", "year", "language" public BasicDBObject toDBObject() { BasicDBObject doc = new BasicDBObject(); doc.put("name", name); doc.put("year", year); doc.put("language", language); return doc; } public static Movie fromDBObject(DBObject doc) { Movie m = new Movie(); m.name = (String)doc.get("name"); m.year = (int)doc.get("year"); m.language = (String)doc.get("language"); return m; } @Override public String toString() { return name + ", " + year + ", " + language; }} Other than the usual boilerplate code, the key methods here are "toDBObject" and "fromDBObject". These methods provide a conversion from "Movie" -> "DBObject" and vice versa. The "DBObject" is a MongoDB class that comes as part of the mongo-2.6.3.jar file and which we added to our project earlier.  The complete javadoc for 2.6.3 can be seen here. Notice, this class also uses Bean Validation constraints and will be honored by the JSF layer. Finally, create "MovieSessionBean" stateless EJB with all the business logic such that it looks like: package org.glassfish.samples;import com.mongodb.BasicDBObject;import com.mongodb.DB;import com.mongodb.DBCollection;import com.mongodb.DBCursor;import com.mongodb.DBObject;import com.mongodb.Mongo;import java.net.UnknownHostException;import java.util.ArrayList;import java.util.List;import javax.annotation.PostConstruct;import javax.ejb.Stateless;import javax.inject.Inject;import javax.inject.Named;/** * @author arun */@Stateless@Namedpublic class MovieSessionBean { @Inject Movie movie; DBCollection movieColl; @PostConstruct private void initDB() throws UnknownHostException { Mongo m = new Mongo(); DB db = m.getDB("movieDB"); movieColl = db.getCollection("movies"); if (movieColl == null) { movieColl = db.createCollection("movies", null); } } public void createMovie() { BasicDBObject doc = movie.toDBObject(); movieColl.insert(doc); } public List<Movie> getMovies() { List<Movie> movies = new ArrayList(); DBCursor cur = movieColl.find(); System.out.println("getMovies: Found " + cur.size() + " movie(s)"); for (DBObject dbo : cur.toArray()) { movies.add(Movie.fromDBObject(dbo)); } return movies; }} The database is initialized in @PostConstruct. Instead of a working with a database table, NoSQL databases work with a schema-less document. The "Movie" class is the document in our case and stored in the collection "movies". The collection allows us to perform query functions on all movies. The "getMovies" method invokes "find" method on the collection which is equivalent to the SQL query "select * from movies" and then returns a List<Movie>. Also notice that there is no "persistence.xml" in the project. Right-click and run the project to see the output as: Enter some values in the text box and click on enter to see the result as: If you reached here then you've successfully used MongoDB in your Java EE 6 application, congratulations! Some food for thought and further play ... SQL to MongoDB mapping shows mapping between traditional SQL -> Mongo query language. Tutorial shows fun things you can do with MongoDB. 
    Try the interactive online shell. The cookbook provides common ways of using MongoDB. In terms of this project, here are some tasks that can be tried:

    Encapsulate database management in a JPA persistence provider. Is it even worth it, given that the capabilities are going to be very different?

    MongoDB uses the "BSONObject" class for JSON representation; add @XmlRootElement on a POJO and see how a compatible JSON representation can be generated. This would make the fromXXX and toXXX methods redundant.
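    A small follow-on sketch of my own (not from the article; it only combines the driver calls already shown above with DBCollection.find(DBObject) from the same 2.6.3 driver), showing the SQL-to-MongoDB mapping in code:

    // roughly "select * from movies where year = 2010"
    DBObject query = new BasicDBObject("year", 2010);
    for (DBObject dbo : movieColl.find(query).toArray()) {
        System.out.println(Movie.fromDBObject(dbo));
    }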

    Read the article

  • TOTD #166: Using NoSQL database in your Java EE 6 Applications on GlassFish - MongoDB for now!

    - by arungupta
    The Java EE 6 platform includes the Java Persistence API (JPA) to work with RDBMS. The JPA specification defines a comprehensive API that includes, but is not restricted to, how a database table can be mapped to a POJO and vice versa, and provides mechanisms for how a PersistenceContext can be injected into a @Stateless bean and then used for performing different operations on the database table and writing typesafe queries. There are several well-known advantages of RDBMS, but the NoSQL movement has gained traction over the past couple of years. NoSQL databases are not intended to be a replacement for mainstream RDBMS. As the Philosophy of NoSQL explains, NoSQL databases were designed for casual use where all the features typically provided by an RDBMS are not required. The name "NoSQL" is more a category of databases, one better known for what it is not than for what it is. The basic principles of NoSQL databases are:
- No need for a pre-defined schema, which makes them schema-less databases. Adding new properties to existing objects is easy and does not require ALTER TABLE. The unstructured data gives flexibility to change the format of the data at any time without downtime or reduced service levels. There are also no joins happening on the server, because there is no structure and thus no relations between objects.
- Scalability and performance are more important than the entire set of functionality typically provided by an RDBMS. These databases offer eventual consistency and/or transactions restricted to single items, with more focus on CRUD.
- Not restricted to SQL for accessing the information stored in the backing database.
- Designed to scale out (horizontally) instead of up (vertically). This is important knowing that databases, and everything else as well, are moving into the cloud. RDBMS can scale out using sharding, but that requires complex management and is not for the faint of heart.
- Unlike RDBMS, which require a separate caching tier, most NoSQL databases come with integrated caching.
- Designed for less management; simpler data models lead to lower administration as well.
There are primarily three types of NoSQL databases: key-value stores (e.g. Cassandra and Riak), document databases (MongoDB or CouchDB), and graph databases (Neo4J). You may think NoSQL is a panacea, but as mentioned above they are not meant to replace mainstream databases, and here is why:
- RDBMS have been around for many years, are very stable, and are functionally rich. This is something CIOs and CTOs can bet their money on without much worry. There is a reason 98% of Fortune 100 companies run Oracle :-) NoSQL is cutting edge and brings excitement to developers, but enterprises are cautious about it.
- Commercial databases like Oracle are well supported by the backing enterprises in terms of providing support resources on a global scale. There is a full ecosystem built around these commercial databases providing training, performance tuning, architecture guidance, and everything else. NoSQL is fairly new and typically backed by a single company not able to meet the scale of these big enterprises.
- NoSQL databases are good for CRUD operations, but business intelligence is extremely important for enterprises to stay competitive. RDBMS provide extensive tooling to generate this data, but that was not the original intention of NoSQL databases, which are lacking in that area. Generating any meaningful information beyond CRUD requires extensive programming.
- Not suited for complex transactions such as banking systems or other highly transactional applications requiring 2-phase commit.
- SQL cannot be used with NoSQL databases, and writing even simple queries can be involved.
Enough talking, let's take a look at some code. This blog has published multiple posts on how to access an RDBMS using JPA in a Java EE 6 application. This Tip Of The Day (TOTD) will show how you can use MongoDB (a document-oriented database) with a typical 3-tier Java EE 6 application. Let's get started! The complete source code of this project can be downloaded here. Download MongoDB for your platform from here (1.8.2 as of this writing) and start the server as:

    arun@ArunUbuntu:~/tools/mongodb-linux-x86_64-1.8.2/bin$ ./mongod
    ./mongod --help for help and startup options
    Sun Jun 26 20:41:11 [initandlisten] MongoDB starting : pid=11210 port=27017 dbpath=/data/db/ 64-bit
    Sun Jun 26 20:41:11 [initandlisten] db version v1.8.2, pdfile version 4.5
    Sun Jun 26 20:41:11 [initandlisten] git version: 433bbaa14aaba6860da15bd4de8edf600f56501b
    Sun Jun 26 20:41:11 [initandlisten] build sys info: Linux bs-linux64.10gen.cc 2.6.21.7-2.ec2.v1.2.fc8xen #1 SMP Fri Nov 20 17:48:28 EST 2009 x86_64 BOOST_LIB_VERSION=1_41
    Sun Jun 26 20:41:11 [initandlisten] waiting for connections on port 27017
    Sun Jun 26 20:41:11 [websvr] web admin interface listening on port 28017

The default directory for the database is /data/db and needs to be created as:

    sudo mkdir -p /data/db/
    sudo chown `id -u` /data/db

You can specify a different directory using the "--dbpath" option. Refer to the Quickstart for your specific platform. Using NetBeans, create a Java EE 6 project and make sure to enable CDI and add the JavaServer Faces framework. Download the MongoDB Java Driver (2.6.3 as of this writing) and add it to the project library by selecting "Properties", "Libraries", "Add Library...", creating a new library by specifying the location of the JAR file, and adding the library to the created project. Edit the generated "index.xhtml" such that it looks like:

    <h1>Add a new movie</h1>
    <h:form>
        Name: <h:inputText value="#{movie.name}" size="20"/><br/>
        Year: <h:inputText value="#{movie.year}" size="6"/><br/>
        Language: <h:inputText value="#{movie.language}" size="20"/><br/>
        <h:commandButton actionListener="#{movieSessionBean.createMovie}" action="show" title="Add" value="submit"/>
    </h:form>

This page has a simple HTML form with three text boxes and a submit button. The text boxes take the name, year, and language of a movie, and the submit button invokes the "createMovie" method of "movieSessionBean" and then renders "show.xhtml". Create "show.xhtml" ("New" -> "Other..." -> "Other" -> "XHTML File") such that it looks like:

    <head>
        <title>List of movies</title>
    </head>
    <body>
        <h1>List of movies</h1>
        <h:form>
            <h:dataTable value="#{movieSessionBean.movies}" var="m">
                <h:column><f:facet name="header">Name</f:facet>#{m.name}</h:column>
                <h:column><f:facet name="header">Year</f:facet>#{m.year}</h:column>
                <h:column><f:facet name="header">Language</f:facet>#{m.language}</h:column>
            </h:dataTable>
        </h:form>
    </body>

This page shows the name, year, and language of all movies stored in the database so far. The list of movies is returned by the "movieSessionBean.movies" property.
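Independently of the Java code that follows, the mongod instance started above can be poked at with the interactive shell that ships in the same bin directory. A minimal sketch, assuming the "movieDB" database and "movies" collection names used in the rest of this TOTD (MongoDB creates both lazily on first use):

    $ ./mongo
    > use movieDB
    > db.movies.find()
    > db.movies.count()

"find()" with no arguments lists every document in the collection, which is handy for verifying what the application stores as you work through the steps below.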
Now create the "Movie" class such that it looks like:

    import com.mongodb.BasicDBObject;
    import com.mongodb.DBObject;
    import javax.enterprise.inject.Model;
    import javax.validation.constraints.Size;

    /**
     * @author arun
     */
    @Model
    public class Movie {
        @Size(min=1, max=20) private String name;
        @Size(min=1, max=20) private String language;
        private int year;

        // getters and setters for "name", "year", "language"

        public BasicDBObject toDBObject() {
            BasicDBObject doc = new BasicDBObject();
            doc.put("name", name);
            doc.put("year", year);
            doc.put("language", language);
            return doc;
        }

        public static Movie fromDBObject(DBObject doc) {
            Movie m = new Movie();
            m.name = (String) doc.get("name");
            m.year = (Integer) doc.get("year");
            m.language = (String) doc.get("language");
            return m;
        }

        @Override
        public String toString() {
            return name + ", " + year + ", " + language;
        }
    }

Other than the usual boilerplate code, the key methods here are "toDBObject" and "fromDBObject". These methods provide the conversion from "Movie" to "DBObject" and vice versa. "DBObject" is a MongoDB class that comes as part of the mongo-2.6.3.jar file we added to our project earlier. The complete javadoc for 2.6.3 can be seen here. Notice that this class also uses Bean Validation constraints, which will be honored by the JSF layer. Finally, create the "MovieSessionBean" stateless EJB with all the business logic such that it looks like:

    package org.glassfish.samples;

    import com.mongodb.BasicDBObject;
    import com.mongodb.DB;
    import com.mongodb.DBCollection;
    import com.mongodb.DBCursor;
    import com.mongodb.DBObject;
    import com.mongodb.Mongo;
    import java.net.UnknownHostException;
    import java.util.ArrayList;
    import java.util.List;
    import javax.annotation.PostConstruct;
    import javax.ejb.Stateless;
    import javax.inject.Inject;
    import javax.inject.Named;

    /**
     * @author arun
     */
    @Stateless
    @Named
    public class MovieSessionBean {
        @Inject Movie movie;

        DBCollection movieColl;

        @PostConstruct
        private void initDB() throws UnknownHostException {
            Mongo m = new Mongo();
            DB db = m.getDB("movieDB");
            movieColl = db.getCollection("movies");
            if (movieColl == null) {
                movieColl = db.createCollection("movies", null);
            }
        }

        public void createMovie() {
            BasicDBObject doc = movie.toDBObject();
            movieColl.insert(doc);
        }

        public List<Movie> getMovies() {
            List<Movie> movies = new ArrayList<Movie>();
            DBCursor cur = movieColl.find();
            System.out.println("getMovies: Found " + cur.size() + " movie(s)");
            for (DBObject dbo : cur.toArray()) {
                movies.add(Movie.fromDBObject(dbo));
            }
            return movies;
        }
    }

The database is initialized in @PostConstruct. Instead of working with a database table, NoSQL databases work with schema-less documents. The "Movie" class is the document in our case, stored in the collection "movies". The collection allows us to perform query functions on all movies. The "getMovies" method invokes the "find" method on the collection, which is equivalent to the SQL query "select * from movies", and then returns a List<Movie>. Also notice that there is no "persistence.xml" in the project. Right-click and run the project, then enter some values in the text boxes and click the submit button to see the list of stored movies. If you reached here then you've successfully used MongoDB in your Java EE 6 application, congratulations! Some food for thought and further play ...
- SQL to MongoDB mapping shows the mapping between traditional SQL and the Mongo query language.
- The Tutorial shows fun things you can do with MongoDB.
- Try the interactive online shell.
- The cookbook provides common ways of using MongoDB.
In terms of this project, here are some tasks that can be tried:
- Encapsulate database management in a JPA persistence provider. Is it even worth it, given that the capabilities are going to be very different?
- MongoDB uses the "BSONObject" class for its JSON representation; add @XmlRootElement on a POJO and see how a compatible JSON representation can be generated. This would make the fromXXX and toXXX methods redundant.
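To give that SQL-to-MongoDB mapping a concrete shape, here is a minimal sketch of how the equivalent of "select * from movies where year = 2010" could be written with the same 2.6.3 driver API used above ("MovieQueries" and "printMoviesByYear" are hypothetical names, not part of the original project; the sketch assumes it sits alongside the "Movie" class):

    import com.mongodb.BasicDBObject;
    import com.mongodb.DBCollection;
    import com.mongodb.DBCursor;
    import com.mongodb.DBObject;

    public class MovieQueries {
        // Equivalent of "select * from movies where year = ?":
        // the query document lists the fields a document must match.
        public static void printMoviesByYear(DBCollection movieColl, int year) {
            BasicDBObject query = new BasicDBObject("year", year);
            DBCursor cur = movieColl.find(query);
            for (DBObject dbo : cur.toArray()) {
                System.out.println(Movie.fromDBObject(dbo));
            }
        }
    }

Passing a query document to "find" filters on the server side, so only the matching movies cross the wire.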

    Read the article

  • Node.js Adventure - Node.js on Windows

    - by Shaun
    Two weeks ago I had a talk with Wang Tao, a C# MVP in China who is currently running his startup company and a product named worktile. He asked me to figure out a synchronization solution that would help his product in the future, and he preferred that I implement the service in Node.js, since worktile itself is written in Node.js. Even though I have some experience in ASP.NET MVC, HTML, CSS and JavaScript, I don't consider myself an expert in JavaScript. In fact I'm very new to it, so it scared me a bit when he asked me to use Node.js. But after about one week of investigation I have to say Node.js is very easy to learn, use and deploy, even if you have very limited JavaScript skill. And I think I have come to love Node.js. Hence I decided to start a series named "Node.js Adventure", where I will tell my story of learning and using Node.js on Windows and Windows Azure. This is the first one.
(Brief) Introduction of Node.js
I don't want to give a fully detailed introduction of Node.js. There are many resources we can find on the internet, but the best one is its homepage. Node.js was created by Ryan Dahl and is sponsored by Joyent. It consists of about 80% C/C++ for the core and 20% JavaScript for the API. It utilizes CommonJS as its module system, which we will explain later. The official definition of Node.js is: "Node.js is a platform built on Chrome's JavaScript runtime for easily building fast, scalable network applications. Node.js uses an event-driven, non-blocking I/O model that makes it lightweight and efficient, perfect for data-intensive real-time applications that run across distributed devices." First of all, Node.js utilizes JavaScript as its development language and runs on top of the V8 engine, the same one used by Chrome. It brings JavaScript, a client-side language, into the backend service world. That's why many people say, even if not quite accurately, that "Node.js is server-side JavaScript". Additionally, Node.js uses an event-driven, non-blocking I/O model. This means that in Node.js there's no way to block the currently working thread: every operation in Node.js executes asynchronously. This is a huge benefit, especially if our code needs I/O operations such as reading disks, connecting to a database or consuming a web service. Unlike IIS or Apache, Node.js doesn't utilize a multi-threaded model. In Node.js there's only one working thread serving all user requests and resource responses, shown as the ST star in the figure below. And there is a POSIX async thread pool in Node.js which contains many async threads (AT stars) for I/O operations. When a user makes an I/O request, the ST serves it but does not perform the I/O operation itself. Instead, the ST goes to the POSIX async thread pool, picks up an AT, passes the operation to it, and then goes back to serve other requests. The AT actually performs the I/O operation asynchronously. Suppose another user arrives before the AT completes the I/O operation: the ST serves this new request, picks up another AT from the pool, and goes back again. When the previous AT finishes its I/O operation it brings the result back and waits for the ST; the ST takes the response, returns the AT to the pool, and then responds to the user. And when the second AT finishes its job, the ST responds to the second user in the same way. As you can see, in Node.js there's only one thread serving clients' requests and POSIX results. This thread loops between the users and POSIX, passing data back and forth; the async jobs themselves are handled by POSIX.
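A minimal sketch of what this model looks like in code (my own illustration, assuming a local file named "data.txt"): the single working thread hands the read off to the async pool and keeps going, and the callback fires only when the data is ready.

    var fs = require("fs");

    // The working thread does not wait here: the read is delegated to the
    // async pool and the callback runs once the data comes back.
    fs.readFile("data.txt", "utf8", function (err, data) {
        if (err) throw err;
        console.log("File contents: " + data);
    });

    // This line prints first, because nothing above blocks.
    console.log("Still free to serve other requests...");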
This is the event-driven, non-blocking I/O model. The performance of this model is much better than the multi-threaded blocking model. For example, Apache is built on the multi-threaded blocking model while Nginx uses the event-driven non-blocking mode, and both the performance and the memory usage comparisons between them (charts captured from the video "NodeJS Basics: An Introductory Training", presented by a Cloud Foundry Developer Advocate) clearly favor the latter.
Node.js on Windows
Executing a Node.js application on Windows is very simple. First of all we need to download the latest Node.js platform from its website. Once installed, it registers its folder in the system path variable so that we can execute Node.js from anywhere. To confirm the Node.js installation, just open up a command window and type "node"; it will show the Node.js console. As you can see, this is a JavaScript interactive console where we can type some simple JavaScript code and commands. To run a Node.js JavaScript application, just specify the source code file name as the argument of the "node" command. For example, let's create a Node.js source code file named "helloworld.js" and copy in the sample code from the Node.js website:

    var http = require("http");

    http.createServer(function (req, res) {
        res.writeHead(200, {"Content-Type": "text/plain"});
        res.end("Hello World\n");
    }).listen(1337, "127.0.0.1");

    console.log("Server running at http://127.0.0.1:1337/");

This code creates a web server listening on port 1337 that returns "Hello World" for any request. Run it in the command window, then open a browser and navigate to http://localhost:1337/. As you can see, when using Node.js we are not so much creating a web application as creating a web server: we need to deal with the request, the response, and the related headers, status codes, etc. And this is one of the benefits of using Node.js: it is lightweight and straightforward. But creating a website from scratch again and again is not acceptable. The good news is that Node.js utilizes CommonJS as its module system, so we can leverage existing modules to simplify our job. Furthermore, there are about ten thousand modules available on the internet, covering almost all areas of server-side application development.
NPM and Node.js Modules
Node.js utilizes CommonJS as its module system. A module is a set of JavaScript files. In Node.js, if we have an entry file named "index.js", then all modules it needs will be located in the "node_modules" folder, and in "index.js" we can import modules by specifying the module name. For example, in the code we've just created we imported a module named "http", which is a built-in module installed along with Node.js, so we can use the code in this "http" module. Besides the built-in modules there are many modules available at the NPM website. Thousands of developers are contributing and downloading modules at this website. Hence this is another benefit of using Node.js: there are many modules we can use, the number of modules is increasing very fast, and we can also publish our own modules to the community. When I wrote this post there were 14,608 modules at NPM, with about 10 thousand downloads per day. Installing a module is very simple. Let's go back to our command window and input the command "npm install express". This command installs a module named "express", which is an MVC framework on top of Node.js.
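Before moving on to express, here is a minimal sketch of a home-grown CommonJS module (the "greeter" name and files are hypothetical, purely for illustration): a folder under "node_modules" exports a function, and the entry file imports it by name exactly like the built-in "http" module.

    // node_modules/greeter/index.js -- everything attached to
    // module.exports becomes the module's public API.
    module.exports.hello = function (name) {
        return "Hello, " + name + "!";
    };

    // index.js -- the entry file resolves "greeter" from node_modules.
    var greeter = require("greeter");
    console.log(greeter.hello("Node.js"));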
Now let's create another JavaScript file named "helloweb.js" and copy the code below into it. I imported the "express" module. When the user browses the home page, it responds with a text message. If the incoming URL matches "/Echo/:value", where "value" is whatever the user specified, it passes the value back along with the current date and time in JSON format. And finally the website listens on port 12345.

    var express = require("express");
    var app = express();

    app.get("/", function(req, res) {
        res.send("Hello Node.js and Express.");
    });

    app.get("/Echo/:value", function(req, res) {
        var value = req.params.value;
        res.json({
            "Value" : value,
            "Time" : new Date()
        });
    });

    console.log("Web application opened.");
    app.listen(12345);

For more information about "express" and its API, please have a look here. Start the application from the command window with the command "node helloweb.js", then navigate to the home page to see the response in the browser. And if we go to, for example, http://localhost:12345/Echo/Hello Shaun, we can see the JSON result. The "express" module is very popular on NPM. It makes the job simple when we need to build an MVC website. There are many other very useful modules on NPM:
- underscore: A utility module covering many common functionalities such as forEach, map, reduce, select, etc.
- request: A very simple HTTP request client.
- async: A library for coordinating async operations.
- wind: A library which enables us to control flow with plain JavaScript for asynchronous programming (and more) without additional pre-compiling steps.
Node.js and IIS
I demonstrated how to run a Node.js application from the console. Since we are on Windows, another common requirement would be: "Can I host Node.js in IIS?" The answer is "Yes". Tomasz Janczuk created a project called IISNode in his GitHub space, which we can find here. And Scott Hanselman has published a blog post introducing it.
Summary
In this post I provided a very brief introduction to Node.js, including its official definition, its architecture, and how it implements the event-driven non-blocking model. I then described how to install and run a Node.js application from the Windows console. I also described the Node.js module system and the NPM command. At the end I referred to some links about IISNode, an IIS extension that allows Node.js applications to run on IIS. Node.js has become a very popular server-side application platform, especially this year. By leveraging its non-blocking I/O model and async features, it is very useful for building highly scalable, asynchronous services. I think Node.js will be used widely in cloud application development in the near future. In the next post I will explain how to use SQL Server from Node.js. Hope this helps, Shaun. All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • PASS Summit Feedback

    - by Rob Farley
    PASS Feedback came in last week. I also saw my dentist for some fillings... At the PASS Summit this year, I delivered a couple of regular sessions and a Lightning Talk. People told me they enjoyed it, but when the rankings came out, they showed that I didn’t score particularly well. Brent Ozar was keen to discuss it with me. Brent: PASS speaker feedback is out. You did two sessions and a Lightning Talk. How did you go? Rob: Not so well actually, thanks for asking. Brent: Ha! Sorry. Of course you know that's why I wanted to discuss this with you. I was in one of your sessions at SQLBits in the UK a month before PASS, and I thought you rocked. You've got a really good and distinctive delivery style.  Then I noticed your talks were ranked in the bottom quarter of the Summit ratings and wanted to discuss it. Rob: Yeah, I know. You did ask me if we could do this...  I should explain – my presentation style is not the stereotypical IT conference one. I throw in jokes, and try to engage the audience thoroughly. I find many talks amazingly dry, and I guess I try to buck that trend. I also run training courses, and find that I get a lot of feedback from people thanking me for keeping things interesting. That said, I also get feedback criticising me for my style, and that’s basically what’s happened here. For the rest of this discussion, let’s focus on my talk about the Incredible Shrinking Execution Plan, which I considered to be my main talk. Brent: I thought that session title was the very best one at the entire Summit, and I had it on my recommended sessions list.  In four words, you managed to sum up the topic and your sense of humor.  I read that and immediately thought, "People need to be in this session," and then it didn't score well.  Tell me about your scores. Rob: The questions on the feedback form covered the usefulness of the information, the speaker’s presentation skills, their knowledge of the subject, how well the session was described, the amount of time allocated, and the quality of the presentation materials. Brent: Presentation materials? But you don’t do slides.  Did they rate your thong? Rob: No-one saw my flip-flops in this talk, Brent. I created a script in Management Studio, and published that afterwards, but I think people will have scored that question based on the lack of slides. I wasn’t expecting to do particularly well on that one. That was the only section that didn’t have 5/5 as the most popular score. Brent: See, that sucks, because cookbook-style scripts are often some of my favorites.  Adam Machanic's Service Broker workbench series helped me immensely when I was prepping for the MCM.  As an attendee, I'd rather have a commented script than a slide deck.  So how did you rank so low? Rob: When I look at the scores that you got (based on your blog post), you got very few scores below 3 – people that felt strong enough about your talk to post a negative score. In my scores, between 5% and 10% were below 3 (except on the question about whether I knew my stuff – I guess I came as knowledgeable). Brent: Wow – so quite a few people really didn’t like your talk then? Rob: Yeah. Mind you, based on the comments, some people really loved it. I’d like to think that there would be a certain portion of the room who may have rated the talk as one of the best of the conference. Some of my comments included “amazing!”, “Best presentation so far!”, “Wow, best session yet”, “fantastic” and “Outstanding!”. 
I think lots of talks can be “Great”, but not so many talks can be “Outstanding” without the word losing its meaning. One wrote “Pretty amazing presentation, considering it was completely extemporaneous.” Brent: Extemporaneous, eh? Rob: Yeah. I guess they don’t realise how much preparation goes into coming across as unprepared. In many ways it’s much easier to give a written speech than to deliver a presentation without slides as a prompt. Brent: That delivery style, the really relaxed, casual, college-professor approach was one of the things I really liked about your presentation at SQLbits.  As somebody who presents a lot, I "get" it - I know how hard it is to come off as relaxed and comfortable with your own material.  It's like improv done by jazz players and comedians - if you've never tried it, you don't realize how hard it is.  People also don't realize how hard it is to make a tough subject fun. Rob: Yeah well... There will be people writing comments on this post that say I wasn't trying to make the subject fun, and that I was making it all about me. Sometimes the style works, sometimes it doesn't. Most of the comments mentioned the fact that I tell jokes, some in a nice way, but some not so much (and it wasn't just a PASS thing - that's the mix of feedback I generally get). One comment at PASS was: “great stand up comedian - not what I'm looking for at pass”, and there were certainly a few that said “too many jokes”. I’m not trying to do stand-up – jokes are my way of engaging with the audience while I demonstrate some of the amazing things that the Query Optimizer can do if you write your queries the right way. Some people didn’t think it was technical enough, but I’ve also had some people tell me that the concepts I’m explaining are deep and profound. Brent: To me, that's a hallmark of a great explanation - when someone says, "But of course it has to work that way - how could it work any other way?  It seems so simple and logical."  Well, sure it does when it's explained correctly, but now pick up any number of thick SQL Server books and try to understand the Redundant Joins concept.  I guarantee it'll take more than 45 minutes. Rob: Some people in my audiences realise that, but definitely not everyone. There's only so much you can tell someone that something is profound. Generally it's something that they either have an epiphany on or not. I like to lull my audience into knowing what's going on, and do something that surprises them. Gain their trust, build a rapport, and then show them the deeper truth of what just happened. Brent: So you've learned your lesson about presentation scores, right?  From here on out, you're going to be dry, humorless, and all your presentations will consist of you reading bullet points off the screen. Rob: No Brent, I’m not. I'm also not going to suggest that most presentations at PASS are like that. No-one tries to present like that. There's a big space to occupy between what "dry and humourless" and me. My difference is to focus on the relationship I have with the crowd, rather than focussing on delivering the perfect session. I want to see people smiling and know they're relaxed. I think most presenters focus on the material, which is completely reasonable and safe. I remember once hearing someone talking about product creation. They talked about mediocrity. They said that one of the worst things that people can ever say about your product is that it’s “good”. What you want is for 10% of the world to love it enough to want to buy it. 
If 10% of the world gave me a dollar, I’d have more money than I could ever use (assuming it wasn’t the SAME dollar they were giving me I guess). Brent: It's the Raving Fans theory.  It's better to have a small number of raving customers than a large number of almost-but-not-really customers who don't care that much about your product or service.  I know exactly how you feel - when I got survey feedback from my Quest video presentation when I was dressed up in a Richard Simmons costume, some of the attendees said I was unprofessional and distracting.  Some of the attendees couldn't get enough and Photoshopped all kinds of stuff into the screen captures.  On the whole, I probably didn't score that well, and I'm fine with that.  It sucks to look at the scores though - do those lower scores bother you? Rob: Of course they do. It hurts deeply. I open myself up and give presentations in a very personal way. All presenters do that, and we all feel the pain of negative feedback. I hate coming 146th & 162nd out of 185, but have to acknowledge that many sessions did worse still. Plus, once I feel the wounds have healed, I’ll be able to remember that there are people in the world that rave about my presentation style, and figure that people will hopefully talk about me. One day maybe those people that don’t like my presentation style will stay away and I might be able to score better. You don’t pay to hear country music if you prefer western... Lots of people find chili too spicy, but it’s still a popular food. Brent: But don’t you want to appeal to everyone? Rob: I do, but I don’t want to be lukewarm as in Revelation 3:16. I’d rather disgust and be discussed. Well, maybe not ‘disgust’, but I don’t want to conform. Conformity just isn’t the same any more. I’m not sure I’ve ever been one to do that. I try not to offend, but definitely like to be different. Brent: Count me among your raving fans, sir.  Where can we see you next? Rob: Considering I live in Adelaide in Australia, I’m not about to appear at anyone’s local SQL Saturday. I’m still trying to plan which events I’ll get to in 2011. I’ve submitted abstracts for TechEd North America, but won’t hold my breath. I’m also considering the SQLBits conferences in the UK in April, PASS in October, and I’m sure I’ll do some LiveMeeting presentations for user groups. Online, people download some of my recent SQLBits presentations at http://bit.ly/RFSarg and http://bit.ly/Simplification though. And they can download a 5-minute MP3 of my Lightning Talk at http://www.lobsterpot.com.au/files/Collation.mp3, in which I try to explain the idea behind collation, using thongs as an example. Brent: I was in the audience for http://bit.ly/RFSarg. That was a great presentation. Rob: Thanks, Brent. Now where’s my dollar?

    Read the article

  • NRF Online Merchandising Workshop: Where Online Retailers Are Focusing for Holiday and Beyond

    - by Rose Spicer-Oracle
    Last month we attended the NRF Online Merchandising Workshop in LA, and it was a great opportunity to catch up with our customers, meet new retailers, and hear some great presentations from VF Corporation, Zazzle, Julep Beauty, Backcountry, eBags and more. The one-on-one conversations with merchants and the keynote presentations carry the same themes across companies of all sizes and across verticals. With only 125 days left (and counting) until Black Friday, these conversations provided some great insight into what's top of mind for retailers during the most stressful time of their year, and a sneak peek into what they will deliver this holiday season. Some of the most popular topics were:
- When to start promoting for holiday: it seems like a funny conversation to have in July, but a number of retailers said they already had their holiday shopping gift guides live on their site, and these were attracting a significant portion of their onsite traffic. When it comes to timing, most retailers were questioning when to begin their holiday promotions, carefully balancing when to release pricing and specials, knowing that customers are holding out for last-minute deals and price drops. Many retailers noted the frustrations around transparent pricing by Amazon and a few other mega-retailers last year, who published their "lowest prices of the season" as early as October, assuring shoppers that those prices were the best they could get all season long. Many retailers felt their hands were forced to drop prices. Others kept their set pricing, faced negative customer reaction, and in some cases missed their holiday goals. The pressure is on, and most retailers identified November 1 as their target start date for the holiday promotions blitz. Some are even waiting for the big players to release their "lowest prices of the season" guides and will then follow suit.
- Attribution is tough, and a huge focus: understanding the path to conversion is a tough nut to crack, especially in the new omnichannel world where consumers use multiple touchpoints to make a single purchase and internal management wants hard data. This has led many retailers to invest in attribution, carefully tracking their online marketing efforts to determine what gets "credit" for the sale, instead of giving credit to the "last click". Retailers noted that it is very difficult to determine the numbers when the online and offline worlds collide, such as when a shopper uses digital channels for research and then makes a purchase in a store. As one of the presenters from The North Face mentioned in her keynote, a key to enabling better customer service and satisfaction when it comes to converged online and offline sales is training the in-store staff, and creating a culture where it eventually "doesn't matter what group gets the credit" if they all add to the sale. No doubt, attribution will be a big area of retail investment in the coming years.
- How to plan for the converged world: planning to ensure inventory gets where it needs to be was another concern. In conversations with retailers, we advised them to analyze customer patterns: where shoppers purchase items, where the items were sourced from, and even where items are returned. This analysis is very valuable in determining inventory plans. From there, retailers can more accurately plan and allocate inventory to support both online and offline customer behavior. As we head into the holiday season, the need for accurate enterprise-wide inventory visibility, and providing that information to associates, is even more critical to the brand-wide customer experience.
- Improving the search / navigation / usability of the site(s): aside from some of the big ideas and standard holiday pricing pressure, most conversations we had centered around continuing to improve the basics of the site. Reinvesting in search and navigation came up time and time again (FitForCommerce blogged about what a big topic it was at the event as well). Obviously getting shoppers onto their path quickly and allowing them to find what they need fast is critical, but it was definitely interesting to hear just how much effort is still going into honing the search and navigation experience. Adding new elements to search and navigation like typeahead, inventive navigation refinements, and new navigation categories like gift guides, specialized boutiques and flash sales were top of mind, in addition to searchandising and making search-driven product recommendations. (Oracle can help!)
- Reducing cart abandonment: always a hot topic that is top of mind for every online retailer. Getting shoppers to the cart is often less than half the battle; getting them to click "buy" and complete the transaction is much more difficult. While retailers carefully study the checkout process and where shoppers tend to bounce, they know that how they design their checkout page is critical. We're all online shoppers in our personal lives and we know how frustrating it can be when total prices are not transparent (i.e. shipping, processing, and taxes are not included until the very last possible screen before clicking that buy button). Online retailers are struggling with where in the checkout process to surface the total price to reduce cart abandonment, while not showing the total figure so early in the process that it keeps shoppers from getting to checkout altogether. Recent research shows that providing total pricing prior to the checkout process dramatically reduces cart abandonment, as it serves as a filter for those shopping within a specific price band. Much of the cart abandonment discussion leads us to...
- The free shipping / free returns question: it's no secret that because of Amazon and programs like Prime, consumers expect free shipping, much to the chagrin of the smaller retailer. The reality is that if you're not a mega-retailer, shipping is an expensive part of doing business that doesn't allow most retailers to keep their prices low and offer free shipping. This has many retailers venturing out on the "free returns" path, especially in apparel. A number of retailers we spoke with are testing a flat-rate shipping fee with free returns, to see if they can crack the price threshold where shoppers are willing to pay for shipping with an added service. But free shipping remains king.
- Social ads and retargeting: they are working, but do they turn off consumers? That's the big question.
    Every retailer we spoke with during a roundtable on the topic said that social ads and retargeting (where that pair of boots you've been eyeing on a site magically follows you around the internet) work and are meeting campaign goals. The larger question many retailers are asking is whether this type of tactic is turning off a large number of shoppers, even if these campaigns are meeting their early goals. Retailers also mentioned that Facebook ads are working very well for them, especially when it comes to new customer acquisition, serving as a channel complementary to SEO for engaging new customers. While there are always new things to experiment with in retail, standard challenges are top of mind as retailers scramble to get ready for holiday. It will undoubtedly be another record-breaking online shopping season, but as retailers get more and more advanced with each Black Friday, expect some exciting things. This excitement needs to be backed by sound solutions and optimized operations. Then again, consumers are expecting more than ever, so I don't doubt that retailers are already thinking about the possibilities of holiday 2015... and beyond. Customers who read this article also found value in the following stories:
- Personalization for Retail: http://blogs.oracle.com/retail/entry/personalization_for_retail
- Shop Direct User Experience Focus Drives Sales: https://blogs.oracle.com/retail/entry/shop_direct_user_experience_focus
- Making Waves: Australian Online Retailer SurfStitch: https://blogs.oracle.com/oracleretail/entry/surf_stitch
- What's new in Oracle Commerce v11.1 for Retail
- What the Content+Commerce Equation is Missing

    Read the article

  • CodePlex Daily Summary for Friday, June 29, 2012

    CodePlex Daily Summary for Friday, June 29, 2012Popular ReleasesSupporting Guidance and Whitepapers: v1 - Supporting Media: Welcome to the Release Candidate (RC) release of the ALM Rangers Readiness supporting edia As this is a RC release and the quality bar for the final Release has not been achieved, we value your candid feedback and recommend that you do not use or deploy these RC artifacts in a production environment. Quality-Bar Details All critical bugs have been resolved Known Issues / Bugs Practical Ruck training workshop not yet includedDesigning Windows 8 Applications with C# and XAML: Chapters 1 - 7 Release Preview: Source code for all examples from Chapters 1 - 7 for the Release PreviewDataBooster - Extension to ADO.NET Data Provider: DataBooster Library for Oracle + SQL Server Beta2: This is a derivative library of dbParallel project http://dbparallel.codeplex.com. All above binaries releases require .NET Framework 4.0 or later. SQL Server support is always build-in (can't be unplugged). The first download (DLL) also requires ODP.NET to connect Oracle; The second download (DLL) also requires DataDirect(3.5) to connect Oracle; The third download (DLL) doesn't support Oracle. Please download the source code if the provider need to be replaced by others. For example ODP.NE...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.57: Fix for issue #18284: evaluating literal expressions in the pattern c1 * (x / c2) where c1/c2 is an integer value (as opposed to c2/c1 being the integer) caused the expression to be destroyed.Visual Studio ALM Quick Reference Guidance: v2 - Visual Studio 2010 (Japanese): Rex Tang (?? ??) http://blogs.msdn.com/b/willy-peter_schaub/archive/2011/12/08/introducing-the-visual-studio-alm-rangers-rex-tang.aspx, Takaho Yamaguchi (?? ??), Masashi Fujiwara (?? ??), localized and reviewed the Quick Reference Guidance for the Japanese communities, based on http://vsarquickguide.codeplex.com/releases/view/52402. The Japanese guidance is available in AllGuides and Everything packages. The AllGuides package contains guidances in PDF file format, while the Everything packag...Visual Studio Team Foundation Server Branching and Merging Guide: v1 - Visual Studio 2010 (Japanese): Rex Tang (?? ??) http://blogs.msdn.com/b/willy-peter_schaub/archive/2011/12/08/introducing-the-visual-studio-alm-rangers-rex-tang.aspx, Takaho Yamaguchi (?? ??), Hirokazu Higashino (?? ??), localized and reviewed the Branching Guidance for the Japanese communities, based on http://vsarbranchingguide.codeplex.com/releases/view/38849. The Japanese guidance is available in AllGuides and Everything packages. The AllGuides package contains guidances in PDF file format, while the Everything packag...SQL Server FineBuild: Version 3.1.0: Top SQL Server FineBuild Version 3.1.0This is the stable version of FineBuild for SQL Server 2012, 2008 R2, 2008 and 2005 Documentation FineBuild Wiki containing details of the FineBuild process Known Issues Limitations with this release FineBuild V3.1.0 Release Contents List of changes included in this release Please DonateFineBuild is free, but please donate what you think FineBuild is worth as everything goes to charity. 
Tearfund is one of the UK's leading relief and de...EasySL: RapidSL V2: Rewrite RapidSL UI Framework, Using Silverlight 5.0 EF4.1 Code First Ria Service SP2 + Lastest Silverlight Toolkit.NETDeob0: NETDeob 0.1.1: http://i.imgur.com/54C78.pngSOLID by example: All examples: All solid examplesSiteMap Editor for Microsoft Dynamics CRM 2011: SiteMap Editor (1.1.1726.406): Use of new version of connection controls for a full support of OSDP authentication mechanism for CRM Online.Umbraco CMS: Umbraco CMS 5.2: Development on Umbraco v5 discontinued After much discussion and consultation with leaders from the Umbraco community it was decided that work on the v5 branch would be discontinued with efforts being refocused on the stable and feature rich v4 branch. For full details as to why this decision was made please watch the CodeGarden 12 Keynote. What about all that hard work?!?? We are not binning everything and it does not mean that all work done on 5 is lost! we are taking all of the best and m...CodeGenerate: CodeGenerate Alpha: The Project can auto generate C# code. Include BLL Layer、Domain Layer、IDAL Layer、DAL Layer. Support SqlServer And Oracle This is a alpha program,but which can run and generate code. Generate database table info into MS WordXDA ROM HUB: XDA ROM HUB v0.9: Kernel listing added -- Thanks to iONEx Added scripts installer button. Added "Nandroid On The Go" -- Perform a Nandroid backup without a PC! Added official Android app!ExtAspNet: ExtAspNet v3.1.8.2: +2012-06-24 v3.1.8 +????Grid???????(???????ExpandUnusedSpace????????)(??)。 -????MinColumnWidth(??????)。 -????AutoExpandColumn,???????????????(ColumnID)(?????ForceFitFirstTime??ForceFitAllTime,??????)。 -????AutoExpandColumnMax?AutoExpandColumnMin。 -????ForceFitFirstTime,????????????,??????????(????????????)。 -????ForceFitAllTime,????????????,??????????(??????????????????)。 -????VerticalScrollWidth,????????(??????????,0?????????????)。 -????grid/grid_forcefit.aspx。 -???????????En...AJAX Control Toolkit: June 2012 Release: AJAX Control Toolkit Release Notes - June 2012 Release Version 60623June 2012 release of the AJAX Control Toolkit. AJAX Control Toolkit .NET 4 – AJAX Control Toolkit for .NET 4 and sample site (Recommended). AJAX Control Toolkit .NET 3.5 – AJAX Control Toolkit for .NET 3.5 and sample site (Recommended). Notes: - The current version of the AJAX Control Toolkit is not compatible with ASP.NET 2.0. The latest version that is compatible with ASP.NET 2.0 can be found here: 11121. - Pages using ...WPF Application Framework (WAF): WPF Application Framework (WAF) 2.5.0.5: Version: 2.5.0.5 (Milestone 5): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Changelog Legend: [B] Breaking change; [O] Marked member as obsolete WAF: Add IsInDesignMode property to the WafConfiguration class. WAF: Introduce the IModuleController interface. WAF: Add ...Windows 8 Metro RSS Reader: Metro RSS Reader.v7: Updated for Windows 8 Release Preview Changed background and foreground colors Used VariableSizeGrid layout to wrap blog posts with images Sort items with Images first, text-only last Enabled Caching to improve navigation between framesConfuser: Confuser 1.9: Change log: * Stable output (i.e. 
given the same seed & input assemblies, you'll get the same output assemblies) + Generate debug symbols, now it is possible to debug the output under a debugger! (Of course without enabling anti debug) + Generating obfuscation database, most of the obfuscation data in stored in it. + Two tools utilizing the obfuscation database (Database viewer & Stack trace decoder) * Change the protection scheme -----Please read Bug Report before you report a bug-----...XDesigner.Development: First release: First releaseNew ProjectsArcGIS Server Rest Catalog: jquery widget that display all MapServer service of ESRI ArcGIS Server in accordion control. Require jQuery and JsRender. BugsBox.Lib: BugsBoxlibCodeStudy: The project includes all my code written for .NET studyCommunity Tfs Team Tools: Community TFS Team Tools is a community project based on the example code from ALM Rangers - Quick Response Sample Command line utility to manage TFS TeamsDiveBaseManager: DiveBaseManagerEclipse App: Aplicacion para android para el envio de coordenaads a un servidorJQMdotNET: JQMdotNet is an early attempt to make a series of MVC HTML helpers to quickly render JQuery Mobile pages.MathBuilderFramework: Math Builder FX is a framework for math problems. The idea of this framework is to create real exercises of mathematics throught base classes.MDS MODELING WORKBOOK: MDS Modeling Workbook is a modeling tool and a solution accelerator for Microsoft Master Data Services. Minesweeper: a clean old minesweeperMultithreaded Port Scanner Utility: Mulithreaded Port Scanner Utility is a very simple port scanner that take advantage of the new System.Threading.Task namespace in .Net 4.0Nonocast.Data: Nonocast.Data is a free, open source developer focused object persistence for small and medium software.On{X} Scripts: This project is to share scripts I create or modify from available free/opensource scripts (off-course with due mentions & links to original developer).PCS MAP: ladPowerShell Module for Mayhem: A module for Mayhem with a reaction which executes a PowerShell script.PunkBuster™ Screenshot Viewer: PunkBuster™ Screenshot Viewer shows screenshots of all games protected with PunkBuster™.Random reminder: Objective: To create a simple Windows application that can schedule and automatically reschedule a random timer that falls within a defined interval.Remembrall: Some useful stuff (at least for me) ! Javascript plugins jQuery plugins C# utilities ASP.NET MVC helper extensionsrepotfs: Repositório de exemplosSharepoint Designer: Sharepoint ExplororSPFluid: Simple modularity and data access framework.SPManager: Helper classSQL Server : xp_fixeddrives2: C++ Extended Stored Procedure Get volumes InfoSWORD Extensions for Sharepoint: Push documents in and out of Sharepoint using the SWORD protocol. ***incomplete***Test projects: My testing projecttestdd06282012git01: xzc testdd06282012git1: sadtestdd06282012tfs01: xzctestddhg06282012: xctestddtfs062820121: ,lm;.testhg06282012dd1: zxTlrk: NATotal_Mayhem: Add-on for Mayhem ( http://mayhem.codeplex.com/ ) that includes networking events, power reactions, and more.WallSwitch: An application to cycle your desktop wallpaper.??????: ????????

    Read the article

  • T-SQL Tuesday #33: Trick Shots: Undocumented, Underdocumented, and Unknown Conspiracies!

    - by Most Valuable Yak (Rob Volk)
    Mike Fal (b | t) is hosting this month's T-SQL Tuesday on Trick Shots.  I love this choice because I've been preoccupied with sneaky/tricky/evil SQL Server stuff for a long time and have been presenting on it for the past year.  Mike's directives were "Show us a cool trick or process you developed…It doesn’t have to be useful", which most of my blogging definitely fits, and "Tell us what you learned from this trick…tell us how it gave you insight in to how SQL Server works", which is definitely a new concept.  I've done a lot of reading and watching on SQL Server Internals and even attended training, but sometimes I need to go explore on my own, using my own tools and techniques.  It's an itch I get every few months, and, well, it sure beats workin'. I've found some people to be intimidated by SQL Server's internals, and I'll admit there are A LOT of internals to keep track of, but there are tons of excellent resources that clearly document most of them, and show how knowing even the basics of internals can dramatically improve your database's performance.  It may seem like rocket science, or even brain surgery, but you don't have to be a genius to understand it. Although being an "evil genius" can help you learn some things they haven't told you about. ;) This blog post isn't a traditional "deep dive" into internals, it's more of an approach to find out how a program works.  It utilizes an extremely handy tool from an even more extremely handy suite of tools, Sysinternals.  I'm not the only one who finds Sysinternals useful for SQL Server: Argenis Fernandez (b | t), Microsoft employee and former T-SQL Tuesday host, has an excellent presentation on how to troubleshoot SQL Server using Sysinternals, and I highly recommend it.  Argenis didn't cover the Strings.exe utility, but I'll be using it to "hack" the SQL Server executable (DLL and EXE) files. Please note that I'm not promoting software piracy or applying these techniques to attack SQL Server via internal knowledge. This is strictly educational and doesn't reveal any proprietary Microsoft information.  And since Argenis works for Microsoft and demonstrated Sysinternals with SQL Server, I'll just let him take the blame for it. :P (The truth is I've used Strings.exe on SQL Server before I ever met Argenis.) Once you download and install Strings.exe you can run it from the command line.  For our purposes we'll want to run this in the Binn folder of your SQL Server instance (I'm referencing SQL Server 2012 RTM):

    cd "C:\Program Files\Microsoft SQL Server\MSSQL11\MSSQL\Binn"
    C:\Program Files\Microsoft SQL Server\MSSQL11\MSSQL\Binn> strings *sql*.dll > sqldll.txt
    C:\Program Files\Microsoft SQL Server\MSSQL11\MSSQL\Binn> strings *sql*.exe > sqlexe.txt

I've limited myself to DLLs and EXEs that have "sql" in their names.  There are quite a few more but I haven't examined them in any detail. (Homework assignment for you!) If you run this yourself you'll get 2 text files, one with all the extracted strings from every SQL DLL file, and the other with the SQL EXE strings.  You can open these in Notepad, but you're better off using Notepad++, EditPad, Emacs, Vim or another more powerful text editor, as these will be several megabytes in size. And when you do open it…you'll find…a TON of gibberish.  (If you think that's bad, just try opening the raw DLL or EXE file in Notepad.  And by the way, don't do this in production, or even on a running instance of SQL Server.)
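Since each extract runs to several megabytes, filtering usually beats scrolling. As a quick illustration (findstr is the built-in Windows rough equivalent of the grep mentioned later in this post; the keyword is just an example), the extracts can be searched straight from the same prompt:

    C:\Program Files\Microsoft SQL Server\MSSQL11\MSSQL\Binn> findstr /i /n "SELECT" sqldll.txt

The /i switch makes the match case-insensitive and /n prefixes each hit with its line number in the extract.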
Even if you don't clean up the file, you can still use your editor's search function to find a keyword like "SELECT" or some other item you expect to be there.  As dumb as this sounds, I sometimes spend my lunch break just scanning the raw text for anything interesting.  I'm boring like that. Sometimes though, having these files available can lead to some incredible learning experiences.  For me the most recent time was after reading Joe Sack's post on non-parallel plan reasons.  He mentions a new SQL Server 2012 execution plan element called NonParallelPlanReason, and demonstrates a query that generates "MaxDOPSetToOne".  Joe (formerly on the Microsoft SQL Server product team, so he knows this stuff) mentioned that this new element was not currently documented and tried a few more examples to see what other reasons could be generated. Since I'd already run Strings.exe on the SQL Server DLLs and EXE files, it was easy to run grep/find/findstr for MaxDOPSetToOne on those extracts.  Once I found which files it belonged to (sqlmin.dll) I opened the text to see if the other reasons were listed.  As you can see in my comment on Joe's blog, there were about 20 additional non-parallel reasons.  And while it's not "documentation" of this underdocumented feature, the names are pretty self-explanatory about what can prevent parallel processing. I especially like the ones about cursors – more ammo! - and am curious about the PDW compilation and Cloud DB replication reasons. One reason completely stumped me: NoParallelHekatonPlan.  What the heck is a hekaton?  Google and Wikipedia were vague, and the top results were not in English.  I found one reference to Greek, stating "hekaton" can be translated as "hundredfold"; with a little more Wikipedia-ing this leads to hecto, the prefix for "one hundred" as a unit of measure.  I'm not sure why Microsoft chose hekaton for such a plan name, but having already learned some Greek I figured I might as well dig some more in the DLL text for hekaton.  Here's what I found:

    hekaton_slow_param_passing
    Occurs when a Hekaton procedure call dispatch goes to slow parameter passing code path
    The reason why Hekaton parameter passing code took the slow code path
    hekaton_slow_param_pass_reason
    sp_deploy_hekaton_database
    sp_undeploy_hekaton_database
    sp_drop_hekaton_database
    sp_checkpoint_hekaton_database
    sp_restore_hekaton_database
    e:\sql11_main_t\sql\ntdbms\hekaton\sqlhost\sqllang\hkproc.cpp
    e:\sql11_main_t\sql\ntdbms\hekaton\sqlhost\sqllang\matgen.cpp
    e:\sql11_main_t\sql\ntdbms\hekaton\sqlhost\sqllang\matquery.cpp
    e:\sql11_main_t\sql\ntdbms\hekaton\sqlhost\sqllang\sqlmeta.cpp
    e:\sql11_main_t\sql\ntdbms\hekaton\sqlhost\sqllang\resultset.cpp

Interesting!  The first 4 entries (in red) mention parameters and "slow code".  Could this be the foundation of the mythical DBCC RUNFASTER command?  Have I been passing my parameters the slow way all this time? And what about those sp_xxxx_hekaton_database procedures (in blue)? Could THEY be the secret to a faster SQL Server? Could they promise a "hundredfold" improvement in performance?  Are these special, super-undocumented DIB (databases in black)?
I decided to look in the SQL Server system views for any objects with hekaton in the name, or references to them, in hopes of discovering some new code that would answer all my questions:

    SELECT name FROM sys.all_objects WHERE name LIKE '%hekaton%'
    SELECT name FROM sys.all_objects WHERE object_definition(OBJECT_ID) LIKE '%hekaton%'

Which revealed:

    name
    ------------------------
    (0 row(s) affected)

    name
    ------------------------
    sp_createstats
    sp_recompile
    sp_updatestats
    (3 row(s) affected)

Hmm.  Well that didn't find much.  Looks like these procedures are seriously undocumented, unknown, perhaps forbidden knowledge. Maybe a part of some unspeakable evil? (No, I'm not paranoid, I just like mysteries and thought that punching this up with that kind of thing might keep you reading.  I know I'd fall asleep without it.) OK, so let's check out those 3 procedures and see what they reveal when I search for "Hekaton". sp_createstats:

    -- filter out local temp tables, Hekaton tables, and tables for which current user has no permissions
    -- Note that OBJECTPROPERTY returns NULL on type="IT" tables, thus we only call it on type='U' tables

OK, that's interesting, let's go looking down a little further:

    ((@table_type<>'U') or (0 = OBJECTPROPERTY(@table_id, 'TableIsInMemory'))) and -- Hekaton table

Wellllll, that tells us a few new things:
- There's such a thing as Hekaton tables (UPDATE: I'm not the only one to have found them!)
- They are not standard user tables and probably not in memory (UPDATE: I misinterpreted this because I didn't read all the code when I wrote this blog post.)
- The OBJECTPROPERTY function has an undocumented TableIsInMemory option
Let's check out sp_recompile:

    -- (3) Must not be a Hekaton procedure.

And once again go a little further:

    if (ObjectProperty(@objid, 'IsExecuted') <> 0 AND
        ObjectProperty(@objid, 'IsInlineFunction') = 0 AND
        ObjectProperty(@objid, 'IsView') = 0 AND
        -- Hekaton procedure cannot be recompiled
        -- Make them go through schema version bumping branch, which will fail
        ObjectProperty(@objid, 'ExecIsCompiledProc') = 0)

And now we learn that hekaton procedures also exist, they can't be recompiled, there's a "schema version bumping branch" somewhere, and OBJECTPROPERTY has another undocumented option, ExecIsCompiledProc.  (If you experiment with this you'll find this option returns null, I think it only works when called from a system object.) This is neat! Sadly sp_updatestats doesn't reveal anything new, the comments about hekaton are the same as sp_createstats.  But we've ALSO discovered undocumented features for the OBJECTPROPERTY function, which we can now search for:

    SELECT name, object_definition(OBJECT_ID) FROM sys.all_objects
    WHERE object_definition(OBJECT_ID) LIKE '%OBJECTPROPERTY(%'

I'll leave that to you as more homework.  I should add that searching the system procedures was recommended long ago by the late, great Ken Henderson, in his Guru's Guide books, as a great way to find undocumented features.  That seems to be really good advice! Now if you're a programmer/hacker, you've probably been drooling over the last 5 entries for hekaton (in green), because these are the names of source code files for SQL Server!  Does this mean we can access the source code for SQL Server?  As The Oracle suggested to Neo, can we return to The Source??? Actually, no. Well, maybe a little bit.
Now if you're a programmer/hacker, you've probably been drooling over the last five hekaton entries from earlier, the .cpp paths, because those are the names of source code files for SQL Server!  Does this mean we can access the source code for SQL Server?  As The Oracle suggested to Neo, can we return to The Source???

Actually, no.  Well, maybe a little bit.

While you won't get the actual source code from the compiled DLL and EXE files, you will get references to source files, debugging symbols, variable and module names, error messages, and even the startup flags for SQL Server.  And if you search for "DBCC" or "CHECKDB" you'll find a really nice section listing all the DBCC commands, including the undocumented ones.  Granted, those are pretty easy to find online, but you may be surprised what those web sites DIDN'T tell you! (And neither will I, go look for yourself!  Though there is one shortcut in the postscript below.)  And as we saw earlier, you'll also find execution plan elements, query processing rules, and who knows what else.  It's also instructive to see how Microsoft organizes its source directories, and how the various components (storage engine, query processor, Full-Text, AlwaysOn/HADR) are split into smaller modules.  There are over 2,000 source file references, so go do some exploring!

So what did we learn?  We can pull strings out of executable files, search them for known items, browse them for unknown items, and use the results to examine internal code to learn even more things about SQL Server.  We've even learned how to use command-line utilities!  We are now 1337 h4X0rz!  (Not really.  I hate that leetspeak crap.)

Although, I must confess I might've gone too far with the "conspiracy" part of this post.  I apologize for that; it's just my overactive imagination.  There's really no hidden agenda or conspiracy regarding SQL Server internals.  It's not The Matrix.  It's not like you'd find anything like that in there:

    Attach Matrix Database
    DM_MATRIX_COMM_PIPELINES
    MATRIXXACTPARTICIPANTS
    dm_matrix_agents

Alright, enough of this paranoid ranting!  Microsoft isn't really evil!  It's not like they're The Borg from Star Trek:

    ALTER FEDERATION DROP
    ALTER FEDERATION SPLIT
    DROP FEDERATION

#tsql2sday
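P.S. If you'd rather not comb a strings dump for that DBCC list, there's a commonly cited (and equally unsupported) shortcut from inside SQL Server itself.  A sketch, assuming trace flag 2588 still behaves this way on your build; it's undocumented, so no promises, and a sandbox instance is the right place to try it:

    -- Undocumented: with trace flag 2588 enabled, DBCC HELP('?') reportedly
    -- lists the hidden DBCC commands alongside the documented ones.
    -- (Much older versions used trace flag 2520 instead.)
    DBCC TRACEON(2588) WITH NO_INFOMSGS;
    DBCC HELP('?');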

    Read the article

  • CodePlex Daily Summary for Wednesday, May 05, 2010

CodePlex Daily Summary for Wednesday, May 05, 2010

New Projects
- 2010微软精英大挑战 Heritage of Dragon 项目: We are from Tongji University in Shanghai, brought together by shared interests to build an open web platform. It uses cloud-hosted, map-based services, with text, pictures, video, and interactive animation, to showcase traditional handicrafts from all over China. Taking full advantage of the web, anyone can add to, revise, and improve the content through an open, collaborative wiki platform. The goal is to record, display, explore, and pass on China's ancient...
- AutoArchive: Auto-archive your "my documents" to a remote machine. I'm writing this so my wife can put things in "my documents" and it'll automatically archive i...
- BigDoor .NET Client: A .NET client for the BigDoor Media API. The API enables secured virtual transactions with support for any number of currencies, transactions, awar...
- bubujie: Dreamweaver Library
- GeckoGit: GeckoGit is a combination of TortoiseSVN and AnkhSVN, but for Git repositories, and built on the GitSharp library.
- Global: global, config, mail, http, rest, xml, serialization, helper, path, io
- Industrial Dashboard Connected Grid webpart: This SharePoint 2007/10 webpart provides a simple way to display grid-based reports populated with data that comes from a SQL Server stored procedu...
- IpControls: "IpControls" contains IPv4 and IPv6 text boxes, both as Windows Forms and WPF versions. The IPv6 control automatically detects the older hybrid for...
- LiteME: LiteME is short for LiteMapleStoryEmulator. It is v75, open-source, and still going through its alpha stages. It is still in development!
- Meditel PHP Class: A PHP class that lets you send SMS messages to any Meditel number using the free SMS service on the Meditel.ma site.
- MoneySafe: Help people.
- Mouse Zoom - Visual Studio Extension: Mouse Zoom is a Visual Studio 2010 extension that will cause the mouse zoom functionality to zoom at the mouse's cursor instead of at the top of th...
- Multi-Language Words Memorizer: This .NET application is designed for learning words and helps foreign-language learners with lots of automatic features. After you select a list of ...
- Navigation for ASP.NET Web Forms: Navigation for ASP.NET Web Forms manages movement and data passing between aspx pages in a unit-testable manner. There is no client-side logic, so ...
- NazTek.Extension.Clr4: CLR 4.0 extensions and utility API
- Opalis Community Releases: Sample workflows, objects, code and other items related to System Center's Opalis Integration Server, published by the Opalis team.
- Power Video Player: Power Video Player is a slim, feature-rich video/DVD player that meets everyday needs in video playback on a PC, with a bunch of advanced features on b...
- SchemeEditor: <WPF> <.NET> <Editor> <Silverlight> <Scheme> <Graphics> <simulink> <schematic>
- StyleCop+: StyleCop+ is a plug-in that extends the original StyleCop features.
- timemanager2010: Just another work-time manager
- TweetTunes: Updates Twitter with the current song playing in iTunes; if your Twitter account is linked to Facebook, it will update that too. The twittervb2 down...
- WCF Discovery Library: WCF Discovery Library is a small collection of utilities that makes it easy to add WCF 4.0 Discovery features into your projects.

New Releases
- AjaxControlToolkit additional extenders: ControlToolkitExtended: This build contains a web example with BreadCrumbs.
- AnyCAD: AnyCAD Free Beta1: AnyCAD Free Beta1
- Baccarat: Single player practice baccarat: This is a simple baccarat game for Windows Mobile. It is single-player and is only a practice version, which will help users familiarize themselve...
- BigDoor .NET Client: BigDoor .NET 2.0 Client (Alpha): Our first iteration of the .NET client. Please fork and/or ask to be added if you want to make any contributions.
- CBM-Command: 2010-05-04: Release notes, new features: panel navigation now complete; scroll up and down through directories using the up and down cursor keys; switch between...
- Directory Linker: Directory Linker 2.1: This release introduces XP support; more information about all features can be found at http://www.humblecoder.co.uk/?p=141
- Extend SmallBasic: Teaching Extensions v.015: Added high/low quiz
- Google AJAX Search Services for jQuery: jquery.gss-0.1.3.js: First official release - use at your own discretion. Thanks, Andrew
- Industrial Dashboard Connected Grid webpart: Filtered Industrial Grid: Filtered Industrial Grid web part for SharePoint 2007/2010, first release.
- jQuery Library for SharePoint Web Services: SPServices 0.5.5: IMPORTANT NOTE: This release is in an alpha state. You should only download it if you know what you are getting and are interested in testing it f...
- Meditel PHP Class: Meditel PHP Class: Zipped file. Example: exemplemeditel.php; PHP class: meditel.class.php
- Multi-Language Words Memorizer: Memorizer 1.0: First release.
- mwNSPECT: mwNSPECT Plugin DLL: mwNSPECT MapWindow plugin DLL. Place in your MapWindow or BASINS plugins directory. Presently only for testing form functionality (not including...
- mwNSPECT: mwNSPECT Simple Installer: Simplistic mwNSPECT MapWindow plugin installer using Inno Setup. Installs all the files you'll need for NSPECT into the C:\NSPECT folder and insta...
- MyWSAT - ASP.NET Membership Administration Tool: MyWSAT v3.5.3: MyWSAT 3.5.3 update notes, May 4th 2010: 1) Added the user search box and a-z navigation menu to all relevant user gridviews. 2) Added a membersh...
- Object/Relational Mapper & Code Generator in Net 2.0 for Relational & XML Schema: 2.7: Upgraded UI-generation templates for the special case of associative tables (2-column primary keys). Minor bugfix in the template editor.
- Open NFSe: Open NFSe 2.0: Version for Belo Horizonte using Windows Services.
- Power Video Player: PVP 1.1.3776: v1.1.3776 is mainly a rebuild of version 1.1 under the Ms-PL license and is the first version available at CodePlex.
- PROGRAMMABLE SOFTWARE DEVELOPMENT ENVIRONMENT: PROGRAMMABLE SOFTWARE DEVELOPMENT ENVIRONMENT-3.1: The following errors have been corrected: PCG ERROR: srcproj -- 3933; PCG ERROR: srcproj -- 2943; PCG ERROR: devproj -- 1474; PCG ERROR: mainprj -- 128...
- Rehost Image: 1.3.9: Fixed location saving for Mac and Linux platforms.
- Robot Shootans: Robot Shootans 0.5.1 (Windows): This is the first public release of this game. Instructions on how to play are included in the game itself. Known issues: changing control style wh...
- SchemeEditor: SchemeEditor Beta: First release. Wait for documentation and an update with some new functions.
- SharePoint Rsync List: SharePoint Rsync 0.9.0.0: Initial release of sprsync. Comments, questions, feedback, and code enhancements are welcome!
- Software Is Hardwork: Sw. Is Hw. Lib. 3.0.0.x+01: UNSUPPORTED, UNTESTED ALPHA RELEASE. Code may disappear. This is just a preview of code that was in progress. Code is s...
- Software Localization Tool: SharpSLT 1.0.1: Minor release: bug fixes, slight changes in the UI
- StyleCop+: StyleCop+ 0.6: Several important improvements made to Advanced Naming Rules: added new entities for fields and constants; added new entities for methods (incl...
- turing machine simulator: First version of turing machine: Overview: first version of the Turing simulator with an example script (transition function). Files: SimulatorGui.exe - main GUI of simulator; TuringMach...
- VCC: Latest build, v2.1.30504.0: Automatic drop of latest build
- Vocabulary Training Center: Basic Edition 1.1: A release with medium-large changes. New functionality: multiple-choice questions added; grammatical questions added; evaluation changed accordin...
- Web Service Software Factory: Web Service Software Factory 2010 RC: To use the Web Service Software Factory 2010, you need the following software installed on your computer: Microsoft Visual Studio 2010 (Ultima...
- Web Service Software Factory: WSSF2010 Guide: This is the help and guidance for Web Service Software Factory 2010
- Windows Phone 7 Panorama control: panorama control v0.6 + samples: IMPORTANT NOTE: Please read the following bug + suggested workaround. I'll fix this in a new release shortly. Panorama control source code + sampl...
- WPF Behavior Library: WPF Behavior Library 0.2 Release: Drag & Drop: took away the ItemType and DataTemplate requirements; added functions for inheritors to be able to provide custom logic to handle movi...

Most Popular Projects
- Rawr
- WBFS Manager
- AJAX Control Toolkit
- Microsoft SQL Server Product Samples: Database
- Silverlight Toolkit
- patterns & practices – Enterprise Library
- Windows Presentation Foundation (WPF)
- iTuner - The iTunes Companion
- DotNetNuke® Community Edition
- ASP.NET

Most Active Projects
- patterns & practices – Enterprise Library
- AJAX Control Framework
- HydroServer - CUAHSI Hydrologic Information System Server
- Ionics Isapi Rewrite Filter
- patterns & practices: Azure Security Guidance
- Rawr
- BlogEngine.NET
- TinyProject
- NB_Store - Free DotNetNuke Ecommerce Catalog Module
- All-In-One Code Framework

    Read the article

  • CodePlex Daily Summary for Friday, April 09, 2010

CodePlex Daily Summary for Friday, April 09, 2010

New Projects
- (SocketCoder) Free Silverlight Voice/Video Conferencing Modules: The goal of this project is to provide complete open-source voice/video chatting client/server modules using Silverlight techniques; this project i...
- AJAX Control Framework: Do PageMethods and the UpdatePanel make you feel dirty? Think making AJAX-enabled custom ASP.NET controls should be WAY easier than it is? Wish ASP.NE...
- Bluetooth Radar: WPF 4.0 application working with the final release of 32feet.net (v2.2) to discover Bluetooth devices, send files and more cool stuff for Bluetooth...
- Bomberman: Bomberman C++ project
- Code Library: This is just a personal storage place for a utility library containing extension methods, new classes, and/or improvements to existing classes.
- DianPing.com MogileFS Client: MogileFS client for .NET 2.0
- Dirty City Hearts Website: Dirty City Hearts website
- DocGen - SharePoint 2010 Bulk Document Loader: DocGen is a SharePoint 2010 multithreaded console application for bulk-loading sample documents into SharePoint. This program generates Microsoft ...
- dou24: Website for DOU
- Explora: Explora is a file browser that is not intended to be a substitute for Windows Explorer, but rather a coding experiment to share w...
- HobbyBrew Mobile: This project is basic beer-brewing software for Windows Mobile, able to read HobbyBrew XML files. Developed in C# and Windows Forms
- jLight: Interop between Silverlight and JavaScript based on jQuery. The syntax used in Silverlight is as close as possible to the jQuery syntax.
- johandekoning.nl samples: Sample code projects which are discussed on johandekoning.nl / johandekoning.com. Most examples are / will be developed in C#
- Kanban: This is an agile project management tool
- METAR.NET Decoder: Project libraries used to decode airport METAR weather information into adequate data types, change them and back, and create resulting METAR informati...
- Micro Framework: MFDeploy with Set/Get mote SKU ID: This is a modification to the Micro Framework's MFDeploy utility that lets the user set and get the mote's ID (aka SKU). It can be done via the GUI...
- MobySharp: MobySharp is an implementation of the Mobypicture.com API written in C#
- NGilead: NGilead permits you to use your NHibernate POCOs (and especially the partially loaded ones) outside the .NET virtual machine (to Silverlight for exa...
- OpenIdPortableArea: OpenIdPortableArea is an MvcContrib-powered Portable Area that encapsulates logic for implementing OpenId encapsulation (using DotNetOpenAuth).
- OrderToList Extension for IEnumerable: An extension method for IEnumerable<T> that will sort the IEnumerable based on a list of keys. Suppose you have a list of IDs {10, 5, 12} and wa...
- project3140.org: Code repository for project3140.org.
- Prometheus Backup Solution: The Prometheus Backup Solution is a free and small backup utility for personal use and for small businesses.
- Roids: an asteroids clone for Silverlight and XNA: An example of a simple game cross-compiling for both Silverlight and XNA using SilverSprite.
- SemanticAnalyzer: Third phase of a compiler design project
- SSRS SDK for PHP: SQL Server Reporting Services SDK for PHP
- Working Memory Workout: Working Memory Workout is a working-memory training game based on the N-back, a task researchers say may improve fluid intelligence. It greatly ex...
- Wouters Code Samples: This project will host some of my sample projects I created. I'm a professional SharePoint/BizTalk developer, so most of the provided samples will ...

New Releases
- (SocketCoder) Free Silverlight Voice/Video Conferencing Modules: Silverlight Voice Video Chat Modules: Client/server Silverlight voice/video chat modules
- AccessibilityChecker: Accessibility Checker V0.2: Direct URL input functionality added. XHTML, WAI validation modules, easy to extend (W3C and Achecker modules incl...)
- AStar.net: AStar.net 1.1 downloads: AStar.net 1.1 version details: greatly improved pathfinding speed and memory usage from version 1.0. Available downloads: AStar.net 1.1 dll - runtim...
- AutoPoco: AutoPoco 0.2: This release will bring some non-generic alternatives to configuration + some more automatic configuration options such as assembly scanning
- Bluetooth Radar: Version 1: Basic version, only with the ability to discover Bluetooth devices around you.
- Convert-Media PowerShell Module for Expression Encoder: Release 1.0.0.2: This is a build that incorporates the latest change sets, including perform publish. No other changes
- DevTreks - social budgeting that improves lives and livelihoods: Social Budgeting Web Software, DevTreks alpha 3e: Alpha 3e is a general debug. It also upgrades the software's family budgeting capabilities, including the addition of a new 'Food Nutrition Input'...
- dV2t Enterprise Library: dV2tEntLib 1.0.0.3: dV2tEntLib 1.0.0.3
- EnhSim: Release v1.9.8.3: Changed armour penetration calcs to apply the "Rouncer fix" (current version displays debug info to assist users in testing that th...
- HouseFly controls: HouseFly controls alpha 0.9: HouseFly controls 0.9 alpha binaries (includes HouseFly.Classes and HouseFly.Controls).
- Jitbit WYSWYG BBCode Editor: Release: Release
- Micro Framework: MFDeploy with Set/Get mote SKU ID: MFDeploy with get, set mote ID: The Micro Framework 4.0 MFDeploy, modified to let the user get and set the mote ID
- MobySharp: MobySharp 1.0: Initial release
- OpenIdPortableArea: OpenIdPortableArea: OpenIdPortableArea.Release: DotNetOpenAuth.dll, DotNetOpenAuth.xml, MvcContrib.dll, MvcContrib.xml, OpenIdPortableArea.dll, OpenIdPortableAre...
- OrderToList Extension for IEnumerable: Release 0.9b: I'm calling this 0.9 because I came up with it yesterday and there's little real-world use, so there's probably something that needs fixing or improv...
- Prometheus Backup Solution: Prometheus BETA: Actual BETA release. Restore functions are not available...
- Reusable Library: V1.0.6: A collection of reusable abstractions for enterprise application developers.
- Reusable Library Demo: V1.0.4: A demonstration of reusable abstractions for enterprise application developers
- SharePoint Labs: SPLab4005A-FRA-Level100: This SharePoint lab will teach you the 5th best practice you should apply when writing code with the SharePoint API. Lab La...
- SharePoint Labs: SPLab6001A-FRA-Level200: This SharePoint lab will teach you how to create a generic Feature Receiver within Visual Studio. Creating a Feature Receiv...
- SharePoint LogViewer: SharePoint LogViewer 2.0: Supports live farm monitoring. Many bug fixes.
- Simple Savant: Simple Savant v0.5: Added support for custom constraint/validation logic (see Versioning and Consistency). Added support for reliable cross-domain writes (see Version...
- SQL Server Extended Properties Quick Editor: Release 1.6.1: What's new in 1.6.1: added an edit form to support long text editing (double-click to open the editor); added an ORM extended properties initializer to creat...
- SSRS SDK for PHP: SSRS SDK for PHP: The current release includes the SSRSReport library to connect to SQL Server Reporting Services and a sample application to show the basic steps needed...
- Table Storage Backup & Restore for Windows Azure: Table Storage Backup 1.0.3751: Bug fix: crash when creating a table if the existing table had not finished deleting. Bug fix: incorrect batch URI if the storage account ended in ...
- VCC: Latest build, v2.1.30408.0: Automatic drop of latest build
- Visual Studio DSite: Audio Player (Visual C++ 2008): An audio player that can play wav files.
- Working Memory Workout: Working Memory Workout 1.0: Working Memory Workout is a working-memory trainer based on the N-back memory task.
- Wouters Code Samples: XMLReceiveCBR: This is a custom pipeline component. It will help you create a content-based routing solution in combination with a WCF request/response service. Gene...
- Xen: Graphics API for XNA: Xen 1.8: Version 1.8 (XNA 3.1). This update fixes a number of bugs in several areas of the API and introduces a large new tutorial. [Added] L2 Spherical Ha...

Most Popular Projects
- WBFS Manager
- Rawr
- Microsoft SQL Server Product Samples: Database
- ASP.NET Ajax Library
- Silverlight Toolkit
- AJAX Control Toolkit
- Windows Presentation Foundation (WPF)
- ASP.NET
- Microsoft SQL Server Community & Samples
- Facebook Developer Toolkit

Most Active Projects
- nopCommerce. Open Source online shop e-commerce solution.
- Shweet: SharePoint 2010 Team Messaging built with Pex
- Rawr
- AutoPoco
- patterns & practices – Enterprise Library
- Ionics Isapi Rewrite Filter
- NB_Store - Free DotNetNuke Ecommerce Catalog Module
- Facebook Developer Toolkit
- Farseer Physics Engine
- Ncqrs Framework - The CQRS framework for .NET

    Read the article
