Search Results

Search found 555 results on 23 pages for 'promotional pricing'.

Page 21 of 23

  • Oracle ATG Web Commerce 10 Implementation Developer Boot Camp - Reading (UK) - October 1-12, 2012

    - by Richard Lefebvre
    REGISTER NOW: Oracle ATG Web Commerce 10 Implementation Developer Boot Camp, Reading, UK, October 1-12, 2012! OPN invites you to join us for a 10-day implementation boot camp on Oracle ATG Web Commerce in Reading, UK from October 1-12, 2012. This 10-day boot camp is designed to provide partners with hands-on experience and technical training to successfully build and deploy Oracle ATG Web Commerce 10 applications. This particular boot camp is focused on helping partners develop the essential skills needed to implement every aspect of an ATG Commerce application from scratch (not CRS-based), with the specific goal of giving experienced Java/J2EE developers a path towards becoming functional, effective, and contributing members of an ATG implementation team. Built for new and experienced ATG developers alike, the collaborative nature of this program and its exercises has proven to be highly effective and extremely valuable in learning the best practices for implementing ATG solutions. Though not required, this boot camp provides a structured path to earning a Certified Oracle ATG Web Commerce 10 Specialization!
    What Is Covered: This boot camp is for Application Developers and Software Architects wanting to gain valuable insight into ATG application development best practices, as well as relevant and applicable implementation experience on projects modeled after four of the most common types of applications built on the ATG platform. The following learning objectives are all critical and of equal priority. This boot camp will help with: building a basic, functional, transaction-ready ATG Web Commerce 10 application; utilizing ATG's platform features such as scenarios, slots, targeters, user profiles and segments to create a personalized user experience; building Nucleus components to support and/or extend application functionality; understanding the intricacies of ATG order checkout and fulfillment; specifying, designing and implementing new commerce features in ATG 10; and building a functional commerce application modeled after four of the most common types of applications built on the ATG platform, within an agile-based project team environment and under simulated real-world project conditions.
    Duration: The Oracle ATG Web Commerce 10 Implementation Developer Boot Camp is an instructor-led workshop spanning 10 days.
    Audience: Application Developers, Software Architects.
    Prerequisite Training and Environment Requirements: programming and markup experience with Java/J2EE, JavaScript, XML, HTML and CSS; completion of the Oracle ATG Web Commerce 10 Implementation Specialist Development Guided Learning Path modules. Participants will be required to bring their own laptop that meets the minimum specifications: 64-bit PC and OS (e.g. Windows 7 64-bit), 4GB RAM or more, 40GB hard disk space. Laptops will require access to the Internet through Remote Desktop via Windows.
    Agenda Topics: Week 1 – Days 1 through 5: Build a Basic Commerce Application. In week one of the boot camp training, we will apply knowledge learned from the ATG Web Commerce 10 Implementation Developer Guided Learning Path modules towards building a basic, transaction-ready commerce application. There will be few to no lectures delivered in this boot camp, as developers will be fully engaged in ATG application development activities and best practices.
    Developers will work independently on the following lab assignments from days 1 through 5:
    Lab Assignments: 1. Environment Setup; 2. Build a Dynamic Home Page; 3. Site Authentication; 4. Build Customer Registration; 5. Display Top-Level Categories; 6. Display Product Sub-Categories; 7. Display Product List Page; 8. Display Product Detail Page; 9. ATG Inventory; 10. Build “Add to Cart” Functionality; 11. Build Shopping Cart; 12. Build Checkout Page; 13. Build Checkout Review Page; 14. Create an Order and Build Order Confirmation Page; 15. Implement Slots and Targeters for Personalization; 16. Implement Pricing and Promotions; 17. Order Fulfillment.
    Week 2 – Days 6 through 10: Team-based Case Project. In the second week of the boot camp training, participants will be asked to join a project team that will select a case project for the team to implement. Teams will be able to choose from four of the most common application types developed and deployed on the ATG platform: hard goods with physical fulfillment, soft goods with electronic fulfillment, a service or subscription case example, and a course/event registration case example. Team projects will have approximately 160 hours of use cases/stories for each team to build (40 hours per developer). Each day's use cases/stories will build upon the prior day's work, and therefore must be fully completed at the end of each day. Please note that this boot camp intends to simulate real-world project conditions, and as such will likely require project teams to work beyond normal business hours. To promote further collaboration and group learning, each team will be asked to present their work and share the methodologies and solutions that they've applied to their cases at the end of each day.
    Location: Oracle Reading CVC, TPC510 Room: Wraysbury, Reading, UK, 9:00 AM – 5:00 PM. Registration Fee (10 Days): US $3,375. Please click on the following link to REGISTER or visit the Oracle ATG Web Commerce 10 Implementation Developer Boot Camp page for more information. Questions: Patrick Ty, Partner Enablement, Oracle Commerce. Phone: 310.343.7687, Mobile: 310.633.1013, Email: [email protected]

    Read the article

  • The Oracle Retail Week Awards - in review

    - by user801960
    The Oracle Retail Week Awards 2012 were another great success, building on the legacy of previous award ceremonies. Over 1,600 of the UK's top retailers gathered at the Grosvenor House Hotel and many of Europe's top retail leaders attended the prestigious Oracle Retail VIP Reception in the Grosvenor House Hotel's Red Bar. Over the years the Oracle Retail Week Awards have become a rallying point for the morale of the retail industry, and each nominated retailer served as a demonstration that the industry is fighting fit. It was an honour to speak to so many figureheads of UK - and global - retail. All of us at Oracle Retail would like to congratulate both the winners and the nominees for the awards. Retail is a cornerstone of the economy and it was inspiring to see so many outstanding demonstrations of innovation and dedication in the entries.
    Winners 2012
    - The Market Force Customer Service Initiative of the Year - Winner: Dixons Retail: Knowhow; Highly Commended: Hughes Electrical: Digital Switchover
    - The Deloitte Employer of the Year - Winner: Morrisons
    - Growing Retailer of the Year - Winner: Hallett Retail - The Concessions People; Highly Commended: Blue Inc
    - The TCC Marketing/Advertising Campaign of the Year - Winner: Sainsbury's: Feed your Family for £50
    - The Brandbank Multichannel Retailer of the Year - Winner: Debenhams; Highly Commended: Halfords
    - The Ashton Partnership Product Innovation of the Year - Winner: Argos: Chad Valley; Highly Commended: Halfords: Private label bikes
    - The RR Donnelley Pure-play Online Retailer of the Year - Winner: Wiggle
    - The Hitachi Consulting Responsible Retailer of the Year - Winner: B&Q: One Planet Home
    - The CA Technologies Retail Technology Initiative of the Year - Winner: Oasis: Argyll Street flagship launch with iPad PoS
    - The Premier Tax Free Speciality Retailer of the Year - Winner: Holland & Barrett
    - Store Design of the Year - Winner: Next Home and Garden, Shoreham, Sussex; Highly Commended: Dixons Retail, Black concept store, Birmingham Bullring
    - Store Manager of the Year - Winner: Ian Allcock, Homebase, Aylesford; Highly Commended: Darren Parfitt, Boots UK, Melton Mowbray Health Centre
    - The Wates Retail Destination of the Year - Winner: Westfield, Stratford
    - The AlixPartners Emerging Retail Leader of the Year - Winner: Catriona Marshall, HobbyCraft, Chief Executive
    - The Wipro Retail International Retailer of the Year - Winner: Apple
    - The Clarity Search Retail Leader of the Year - Winner: Ian Cheshire, Chief Executive, Kingfisher
    - The Oracle Retailer of the Year - Winner: Burberry
    - Outstanding Contribution to Retail - Winner: Lord Harris of Peckham
    Oracle Retail and "Your Experience Platform"
    Technology is the key to providing that differentiated retail experience. More specifically, it is what we at Oracle call ‘the experience platform’ - a set of integrated, cross-channel business technology solutions, selected and operated by a retail business and IT team, and deployed in accordance with that organisation’s individual strategy and processes. This business systems architecture simultaneously:
    - Connects customer interactions across all channels and touchpoints, and every customer lifecycle phase, to provide a differentiated customer experience that meets consumers’ needs and expectations;
    - Delivers actionable insight that enables smarter decisions in planning, forecasting, merchandising, supply chain management, marketing, etc.;
    - Optimises operations to align every aspect of the retail business to gain efficiencies and economies, to align KPIs to eliminate strategic conflicts, and at the same time work in support of customer priorities.
    Working in unison, these three goals not only help retailers to successfully navigate the challenges of today but also to focus on delivering that personalised customer experience based on differentiated products, pricing, services and interactions that will help them to gain market share and grow sales.

    Read the article

  • ISACA Webcast follow up: Managing High Risk Access and Compliance with a Platform Approach to Privileged Account Management

    - by Darin Pendergraft
    Last week we presented how Oracle Privileged Account Manager (OPAM) could be used to manage high risk, privileged accounts. If you missed the webcast, here is a link to the replay: ISACA replay archive (NOTE: you will need to use Internet Explorer to view the archive). For those of you that did join us on the call, you will know that I only had a little bit of time for Q&A, and was only able to answer a few of the questions that came in. So I wanted to devote this blog to answering the outstanding questions. Here they are.
    1. Can OPAM track admin or DBA activity details during a password check-out session? Oracle Audit Vault is monitoring these activities, which can be correlated to check-out events.
    2. How would OPAM handle simultaneous requests? OPAM can be configured to allow for shared passwords. By default, sharing is turned off.
    3. How long are the passwords valid? Are the admins required to manually check them in? Password expiration can be configured and set in the password policy according to your corporate standards. You can specify whether you want forced check-in or not.
    4. Can 2-factor authentication be used with OPAM? Yes - 2-factor integration with OPAM is provided by integration with Oracle Access Manager and Oracle Adaptive Access Manager.
    5. How do you control access to OPAM to ensure that OPAM admins don't override the functionality to access privileged accounts? OPAM provides separation of duties by using admin roles to manage access to targets and privileged accounts and to control which operations admins can perform.
    6. How and where are the passwords stored in OPAM? OPAM uses the Oracle Platform Security Services (OPSS) Credential Store Framework (CSF) to securely store passwords. This is the same system used by Oracle Applications.
    7. Does OPAM support hierarchical/level-based privileges? Is the log maintained for independent review/audit? Yes. OPAM uses the Fusion Middleware (FMW) Audit Framework to store all OPAM-related events in a dedicated audit database.
    8. Does OPAM support emergency access in the case where approvers are not available until later? Yes. OPAM can be configured to release a password under a "break-glass" emergency scenario.
    9. Does OPAM work with AIX? Yes, supported UNIX versions are listed in the "certified component section" of the UNIX connector guide at: http://docs.oracle.com/cd/E22999_01/doc.111/e17694/intro.htm#autoId0
    10. Does OPAM integrate with Sun Identity Manager? Yes. OPAM can be integrated with SIM using the REST APIs. OPAM has direct integration with Oracle Identity Manager 11gR2.
    11. Is OPAM available today and what does it cost? Yes. OPAM is available now. Ask your Oracle Account Manager for pricing.
    12. Can OPAM be used in SAP environments? Yes, supported SAP versions are listed in the "certified component section" of the SAP connector guide here: http://docs.oracle.com/cd/E22999_01/doc.111/e25327/intro.htm#autoId0
    13. How would this product integrate, if at all, with access to a particular field in the DB that needs additional security, such as SSNs? OPAM can work with DB Vault and DB Firewall to provide fine-grained access control for databases.
    14. Is VM supported? As a deployment platform, Oracle VM is supported. For further details about supported virtualization technologies, see Oracle Fusion Middleware Supported System Configurations here: http://www.oracle.com/technetwork/middleware/ias/downloads/fusion-certification-100350.html
    15. Where did this (OPAM) technology come from? OPAM was built by Oracle Engineering.
    16. Are all Linux flavors supported? How about BSD? BSD is not supported. For supported UNIX versions, see the "certified component section" of the UNIX connector guide: http://docs.oracle.com/cd/E22999_01/doc.111/e17694/intro.htm#autoId0
    17. What happens if users don't check passwords in at the end of a work task? In OPAM, a time frame can be defined for how long a password can be checked out. The security admin can force a check-in at any given time.
    18. Is MySQL supported? Yes, supported DB versions are listed in the "certified component section" of the DB connector guide here: http://docs.oracle.com/cd/E22999_01/doc.111/e28315/intro.htm#BABGJJHA
    19. What happens when OPAM crashes and you need to use the password? OPAM can be configured for high availability, but if required, OPAM data can be backed up and recovered. See the OPAM admin guide.
    20. Is OPAM a standalone product or does it leverage other components from IDM? OPAM can be run stand-alone, but will also leverage other IDM components.

    Read the article

  • OK - What now? How do we become a Social Business?

    - by Michael Snow
    We hope that those of you that attended yesterday's Webcast with Brian Solis enjoyed Brian's discussion with Christian Finn for our last Webcast of the season for the Oracle Social Business Thought Leaders Series. For those of you that may have missed the webcast or were stuck at a company holiday party - you'll be glad to hear that the webcast will be available On-Demand starting later today (12/14/12). And any of you who'd like to listen to a quick but informative podcast with Brian can listen to that here. Some of you may still be left with questions about how to get from point A to point B, and even more confused than when you started thinking about this new world of Digital Darwinism. The post below, grabbed from an abundance of great thought leadership prose on Brian's blog, may help you frame the path you need to start walking sooner versus later to stay off of the endangered species list. As you explore your path forward, please keep Oracle in mind - we do offer a wide range of solutions to help your organization optimize the engagement for your customers, employees and partners.
    The Path from a Social Brand to a Social Business
    Brian Solis, originally posted May 2, 2012
    I’ve been a long-time supporter of MediaTemple’s (MT) Residence program along with Gary Vaynerchuk, Neil Patel, and many others whom I respect. I wanted to share my “7 questions to answer to become a social business” with you here. Social media is pervasive and is becoming the new normal in corporate marketing. Brands who get this right are starting to build their own media networks rich with customer connections numbering in the millions. Right now, Coca-Cola has over 34 million fans on Facebook, but they’re hardly alone. Disney follows just behind with 29 million fans, Starbucks boasts 25 million, and Oreo, Red Bull, and Converse play host to over 20 million fans. If we were to look at other networks such as Twitter and YouTube, we would see a recurring theme. People are connecting en masse with the businesses they support, and new media represents the ability to cultivate consumer relationships in ways not possible with traditional earned or paid media. Sounds great, right? This might sound abrupt, but the truth is that we’re hardly realizing the potential of what lies before us. Everything begins with understanding not just how other brands are marketing themselves in social media, but also seeing what they’re not doing and envisioning what’s possible. We’re already approaching the first of many crossroads that new media will present. Do we take the path of a social brand or that of a social business? What’s the difference? A social brand is just that, a business that is remodeling or retrofitting its existing marketing practices to new media.
A social business is something altogether different as it embraces introspection and extrospection to reevaluate internal and external processes, systems, and opportunities to transform into a living, breathing entity that adapts to market conditions and opportunities. It’s a tough decision to make right now especially at a time when all we read about is how much success many businesses are finding without having to answer this very question. With all of the newfound success in social networks, the truth is that we’re only just beginning to learn what’s possible and that’s where you come in. When compared to the investment in time and resources across the board, social media represents only a small part of the mix. But with your help, that’s all about to change. The CMO Survey, an organization that disseminates the opinions of top marketers in order to predict the future of markets, recently published a report that gave credence to the fact that social media is taking off. One of the most profound takeaways from the report was this gem; “The “like button” [in Facebook] packs more customer-acquisition punch than other demand-generating activities.” With insights like this, it’s easy to see why the race to social is becoming heated. The report also highlighted exactly where social fits in the marketing mix today and as you can see, despite all of the hype, it’s not a dominant focus yet. As of August 2011, the percentage of overall marketing budgets dedicated to social media hovered at around 7%. However, in 2012 the investment in social media will climb to 10%. And, in five years, social media is expected to represent almost 18% of the total marketing budget. Think about that for a moment. In 2016, social media will only represent 18%? Queue the sound of a record scratching here. With businesses finding success in social networks, why are businesses failing to realize the true opportunity brought forth by the ability to listen to, connect with, and engage with customers? While there’s value in earning views, driving traffic, and building connections through the 3F’s (friends, fans and followers), success isn’t just defined simply by what really amounts to low-hanging fruit. The truth is that businesses cannot measure what it is they don’t know to value. As a result, innovation in new engagement initiatives is stifled because we’re applying dated or inflexible frameworks to new paradigms. Social media isn’t owned by marketing, but instead the entire organization. This changes everything and makes your role so much more important. It’s up to you to learn how to think outside of the proverbial social media box to see what others don’t, the ability to improve customers experiences through the evolution of a social brand into a social business. Doing so will translate customer insights from what they do and don’t share in social networks into better products, services, and processes. See, customers want something more from their favorite businesses than creative campaigns, viral content, and everyday dialogue in social networks. Customers want to be heard and they want to know that you’re listening. How businesses use social media must remind them that they’re more than just an audience, consumer, or a conduit to “trigger” a desired social effect. Herein lies both the challenge and opportunity of social media. It’s bigger than marketing. It’s also bigger than customer service. 
It’s about building relationships with customers that improve experiences and more importantly, teaches businesses how to re-imagine products and internal processes to better adapt to potential crises and seize new opportunities. When it comes down to it, Twitter, Facebook, Youtube, Foursquare, are all channels for listening, learning, and engaging. It’s what you do within each channel that builds a community around your brand. And, at the end of the day, the value of the community you build counts for everything. It’s important to understand that we cannot assume that these networks simply exist for people to lineup for our marketing messages or promotional campaigns. Nor can we assume that they’re reeling in anticipation for simple dialogue. They want value. They want recognition. They want access to exclusive information and offers. They need direction, answers and resolution. What we’re talking about here is the multidimensional makeup of consumers and how a one-sided approach to social media forces the needs for social media to expand beyond traditional marketing to socialize the various departments, lines of business, and functions to engage based on the nature of the situation or opportunity. In the same CMO study, it was revealed that marketers believe that social media has a long way to go toward integrating into the overall company strategy. On a scale of 1-7, with one being “not integrated at all” and seven being “very integrated,” 22% chose “one.” Critical functions such as service, HR, sales, R&D, product marketing and development, IR, CSR, etc. are either not engaged or are operating social media within a silo disconnected from other efforts or possibilities. The problem is that customers don’t view a company by silo, instead they see one company, one brand, and their experience in social media forms an impression that eventually contributes to their view of your brand. The first step here is to understand business priorities and objectives to assess how social media can be additive in achieving these goals. Additionally, surveying the landscape to determine other areas of interest as its specifically related to your business. • Are customers seeking help or direction? • Who are your most valuable customers and what are they sharing? • How can you use social media to acquire and retain customers? - What ideas are circulating and how can you harness user generated activity and content to innovate or adapt to better meet the needs of customers? - How can you broaden a single customer view to recognize the varying needs of customers and how your organization can organize around each circumstance? - What insights exist based on how consumers are interacting with one another? How can this intelligence inform marketing, service, products and other important business initiatives? - How can your business extend their current efforts to deliver better customer experiences and in turn more effectively unit internal collaboration and communication? Customer demands far exceed the capabilities of the marketing department. While creating a social brand is a necessary endeavor, building a social business is an investment in customer relevance now and over time. Beyond relevance, a social business fosters a culture of change that unites employees and customers and sets a foundation for meaningful and beneficial relationships. Innovation, communication, and creativity are the natural byproducts of engagement and transformation. As a social brand, we are competing for the moment. 
    As a social business, we are competing for the future in all that we do today.

    Read the article

  • Part 1 - Load Testing In The Cloud

    - by Tarun Arora
    Azure is fascinating, but even more fascinating is the marriage of Azure and TFS!
    Introduction: Recently a client I worked for had 2 major business-critical applications being delivered, with very little time budgeted for performance testing. We immediately hit a bottleneck when the performance testing phase started: the in-house infrastructure team could not support the hardware requirements at such short notice. It was suggested that the performance testing be performed on one of the QA environments, which was a fraction of the production environment. This didn’t seem right, so the team decided to turn to the cloud. The team took advantage of the elasticity offered by Azure: starting with a single test agent, which was provisioned and ready for use within 30 minutes, the team scaled up to 17 test agents to perform a very comprehensive performance testing cycle. Issues were identified and resolved, but the highlight was that the cost of running the ‘test rig’ proved to be less than if it had been hosted on premise by the infrastructure team. Thank you for taking the time out to read this blog post. In this series of posts, I’ll try and cover, start to end, everything you need to know to use Azure to build your test rig in the cloud.
    But Why Azure? I have my own Data Centre… If the environment is provisioned in your own datacentre:
    - No matter what level of service agreement you may have with your infrastructure team, there will be downtime when the environment is patched.
    - How fast can you scale the environments up or down (keeping the enterprise processes in mind)?
    Administration, cost, flexibility and scalability are the areas you would want to think around when taking the decision between your own data centre and Azure!
    How is Microsoft's Public Cloud Offering different from Amazon’s Public Cloud Offering? Microsoft's cloud offering is a hybrid of Platform as a Service (PaaS) and Infrastructure as a Service (IaaS), which distinguishes Microsoft's offering from other providers such as Amazon (Amazon only offers IaaS).
    - PaaS – Platform as a Service: Fills the needs of those who want to build and run custom applications as services. A service provider offers a pre-configured, virtualized application server environment to which applications can be deployed by the development staff. Since the service providers manage the hardware (patching, upgrades and so forth), as well as application server uptime, the involvement of IT pros is minimized. On-demand scalability combined with hardware and application server management relieves developers from infrastructure concerns and allows them to focus on building applications.
    - IaaS – Infrastructure as a Service: Similar to traditional hosting, where a business will use the hosted environment as a logical extension of the on-premises datacentre. The servers (physical and virtual) are rented on an as-needed basis, and the IT professionals who manage the infrastructure have full control of the software configuration. This kind of flexibility increases the complexity of the IT environment, as customer IT professionals need to maintain the servers as though they are on-premises. The maintenance activities may include patching and upgrades of the OS and the application server, load balancing, failover clustering of database servers, backup and restoration, and any other activities that mitigate the risks of hardware and software failures.
    The biggest advantage with PaaS is that you do not have to worry about maintaining the environment; you can focus all your time on solving the business problems with your solution rather than worrying about maintaining the environment. If you decide to use a VM Role on Azure, you are asking for IaaS; more on this later. A nice blog post here on the difference between SaaS, PaaS and IaaS. Now that we are convinced why we should be turning to the cloud, and to Azure specifically, let’s discuss the test rig.
    The Load Test Rig – Topology: Now the moment of truth. Of course, a big part of getting value from cloud computing is identifying the most adequate workloads to take to the cloud, so I’ve decided to try to make a load testing rig where the agents are running on Windows Azure. I’ll talk you through the topology above:
    - User: The user kick-starts the load test run from the developer workstation on premise. This passes the request to the Test Controller.
    - Test Controller: The Test Controller is on premise, connected to the same domain as the developer workstation. As soon as the Test Controller receives the request, it makes use of the Windows Azure Connect service to orchestrate the test responsibilities to all the Test Agents. The Windows Azure Connect endpoint software must be active on all Azure instances and on the Controller machine as well. This allows IP connectivity between them and, given that the firewall is properly configured, allows the Controller to send workloads to the agents. In parallel, the Controller will collect the performance data from the agents, using the traditional WMI mechanisms.
    - Test Agents: The Test Agents are on the Windows Azure public cloud. As soon as the Test Controller issues instructions to the test agents, the test agents start executing the load tests. The HTTP requests are issued against the web server on premise and the results are captured by the test agents. Finally, the results are passed over to the controller.
    - Servers: The Web Server and DB Server are hosted on premise in the datacentre. This is usually the case with business-critical applications; you probably want to manage them yourself.
    Recap and What’s next? So, in this introduction to the series of blog posts on load testing in the cloud, I highlighted why creating a test rig in the cloud is a good idea, what advantages Windows Azure offers, and the test rig topology that I will be using. I would also like to mention that I stumbled upon this [Video] on Azure in a nutshell, a great watch if you are new to Windows Azure. In the next post I intend to start setting up the load test environment and discuss pricing with respect to the test agent machine types that will be used in the test rig. Hope you enjoyed this post. If you have any recommendations on things that I should consider, or any questions or feedback, feel free to add to this blog post. Remember to subscribe to http://feeds.feedburner.com/TarunArora. See you in Part II.
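    As a concrete illustration of how the on-premise pieces point at each other, here is a hedged sketch of a Visual Studio test settings file that sends the run to the on-premise Test Controller, which then farms the load out to the Azure-hosted agents. The machine name and id are placeholders, and the element names follow the standard VS 2010 .testsettings format; verify them against your own Visual Studio documentation before relying on this:
    <?xml version="1.0" encoding="utf-8"?>
    <!-- Minimal sketch: run load tests remotely via the on-premise controller. -->
    <TestSettings name="CloudRig" id="00000000-0000-0000-0000-000000000001"
                  xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
      <Description>Send load test runs to the controller that drives the Azure test agents.</Description>
      <Execution location="Remote">
        <TestTypeSpecific />
        <RemoteController name="onprem-testcontroller" />
      </Execution>
    </TestSettings>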

    Read the article

  • My Feelings About Microsoft Surface

    - by Valter Minute
    Advice: read the title carefully, I’m talking about “feelings” and not about advanced technical points proved in a scientific and objective way I still haven’t had a chance to play with a MS Surface tablet (I would love to, of course) and so my ideas just came from reading different articles on the net and MS official statements. Remember also that the MVP motto begins with “Independent” (“Independent Experts. Real World Answers.”) and this is just my humble opinion about a product and a technology. I know that, being an MS MVP you can be called an “MS-fanboy”, I don’t care, I hope that people can appreciate my opinion, even if it doesn’t match theirs. The “Surface” brand can be confusing for techies that knew the “original” surface concept but I think that will be a fresh new brand name for most of the people out there. But marketing department are here to confuse people… so I can understand this “recycle” of an existing name. So Microsoft is entering the hardware arena… for me this is good news. Microsoft developed some nice hardware in the past: the xbox, zune (even if the commercial success was quite limited) and, last but not least, the two arc mices (old and new model) that I use and appreciate. In the past Microsoft worked with OEMs and that model lead to good and bad things. Good thing (for microsoft, at least) is market domination by windows-based PCs that only in the last years has been reduced by the return of the Mac and tablets. Google is also moving in the hardware business with its acquisition of Motorola, and Apple leveraged his control of both the hardware and software sides to develop innovative products. Microsoft can scare OEMs and make them fly away from windows (but where?) or just lead the pack, showing how devices should be designed to compete in the market and bring back some of the innovation that disappeared from recent PC products (look at the shelves of your favorite electronics store and try to distinguish a laptop between the huge mass of anonymous PCs on displays… only Macs shine out there…). Having to compete with MS “official” hardware will force OEMs to develop better product and bring back some real competition in a market that was ruled only by prices (the lower the better even when that means low quality) and no innovative features at all (when it was the last time that a new PC surprised you?). Moving into a new market is a big and risky move, but with Windows 8 Microsoft is playing a crucial move for its future, trying to be back in the innovation run against apple and google. MS can’t afford to fail this time. I saw the new devices (the WinRT and Pro) and the specifications are scarce, misleading and confusing. The first impression is that the device looks like an iPad with a nice keyboard cover… Using “HD” and “full HD” to define display resolution instead of using the real figures and reviving the “ClearType” brand (now dead on Win8 as reported here and missed by people who hate to read text on displays, like myself) without providing clear figures (couldn’t you count those damned pixels?) seems to imply that MS was caught by surprise by apple recent “retina” displays that brought very high definition screens on tablets.Also there are no specifications about the processors used (even if some sources report NVidia Tegra for the ARM tablet and i5 for the x86 one) and expected battery life (a critical point for tablets and the point that killed Windows7 x86 based tablets). 
Also nothing about the price, and this will be another critical point because other platform out there already provide lots of applications and have a good user base, if MS want to enter this market tablets pricing must be competitive. There are some expansion ports (SD and USB), so no fixed storage model (even if the specs talks about 32-64GB for RT and 128-256GB for pro). I like this and don’t like the apple model where flash memory (that it’s dirt cheap used in thumdrives or SD cards) is as expensive as gold (or cocaine to have a more accurate per gram measurement) when mounted inside a tablet/phone. For big files you’ll be able to use external media and an SD card could be used to store files that don’t require super-fast SSD-like access times, I hope. To be honest I really don’t like the marketplace model and the limitation of Windows RT APIs (no local database? from a company that based a good share of its success on VB6+Access!) and lack of desktop support on the ARM (even if the support is here and has been used to port office). It’s a step toward the consumer market (where competitors are making big money), but may impact enterprise (and embedded) users that may not appreciate Windows 8 new UI or the limitations of the new app model (if you aren’t connected you are dead ). Not having compatibility with the desktop will require brand new applications and honestly made all the CPU cycles spent to convert .NET IL into real machine code in the past like a huge waste of time… as soon as a new processor architecture is supported by Windows you still have to rewrite part of your application (and MS is pushing HTML5+JS and native code more than .NET in my perception). On the other side I believe that the development experience provided by Visual Studio is still miles (or kilometres) ahead of the competition and even the all-uppercase menu of VS2012 hasn’t changed this situation. The new metro UI got mixed reviews. On my side I should say that is very pleasant to use on a touch screen, I like the minimalist design (even if sometimes is too minimal and hides stuff that, in my opinion, should be visible) but I should also say that using it with mouse and keyboard is like trying to pick your nose with boxing gloves… Metro is also very interesting for embedded devices where touch screen usage is quite common and where having an application taking all the screen is the norm. For devices like kiosks, vending machines etc. this kind of UI can be a great selling point. I don’t need a new tablet (to be honest I’m pretty happy with my wife’s iPad and with my PC), but I may change my opinion after having a chance to play a little bit with those new devices and understand what’s hidden under all this mysterious and generic announcements and specifications!

    Read the article

  • Which third party website thumbnailing services do you use?

    - by Ben Delarre
    I've got a requirement for showing thumbnails of arbitrary websites. I need to be able to show small thumbnails (120px by 90px), and larger thumbnails of around 480px wide. I'll need to specify the queue and invalid placeholder images and preferably have a pingback when the queued images are processed so I can respond appropriately. I'd also need a simple API I can use either directly embedded in my HTML, or from a simple web request to queue the images. I've been looking at various services ranging from low-fi services, to large scale ones - here's some examples: www.bitpixels.com Uses Google AppEngine, seems like a prototype or a toy. Free! www.websnapr.com Tried using this, made a free account and requested a thumbnail. Waited a few minutes and refreshed a couple of times, and ended up having the account banned. Free is tricky yes, but if I can't try it out successfully I'm disinclined to pay. www.shrinktheweb.com Free account seems to be very quick. Lots of documentation on the site, and even covers local caching of the images to your own server (documentation mostly in PHP). Quality of thumbnails look good, and there appear to be sufficient options for setting thumbnail placeholder images and parameters for altering how the thumbnailing is done. Also supports large 'screenshots' of URLs - very useful for me. Discovered the PRO pricing is an à la carte menu, allowing me to select just the features I want and keep the monthly cost low. Excellent stuff, have decided to use this service. www.thumbalizr.com Good coverage of thumbnail sizes and control options - even allowing specification for browser width when thumbnailing. No ping-back, but I can live without that. Supports local caching of images with PHP API, would prefer .NET, but can port it if necessary. Looks like a fairly professional service but seems fairly expensive for the number of thumbnails you get to generate. apologies for lack of proper linking - spam protection! I'm not entirely convinced by any of them, and since this will be a long term service I'd like some stability and support. I'm willing to pay for the service, but I'd want something that fulfills most if not all of my requirements for that. I should also mention that we're hosted on Windows under IIS, so local solutions involving Xvfb and the like sadly can't be used for this project. So my question is: what services do you use? How have they panned out, are you happy with them?

    Read the article

  • Menu floating to the right on IE and to the left in FF

    - by the_drow
    I am working on a website that has a menu which behaves correctly in FF but not in IE (as usual). In IE it floats to the right while it should float to the left; however, if float is set to none it behaves almost correctly, attaching itself to the top of the container. Here's the CSS:
    #navigation_wrap { background: url(../images/ltr/nav_bg.png); height: 34px; width: 954px; }
    .btn_login { float: right; margin: 4px 4px 0 0; }
    .navigation { float: left; }
    .navigation ul { list-style: none; margin: 8px 0 0 15px; }
    .navigation ul li { border-right: 1px solid white; float: left; padding: 0 12px 0 12px; }
    .navigation ul li.last { border: none; }
    .navigation ul li a { color: white; font-size: 14px; text-decoration: none; }
    .navigation ul li a:hover { text-decoration: underline; }
    .navigation ul li a.active { font-weight: bold; }
    And here's the HTML:
    <div id="navigation_wrap">
    <div class="navigation">
    <ul>
    <li><a class="active" href="default.asp">Home Page</a></li>
    <li><a class="" href="faq.asp">FAQ</a></li><li><a class="" href="articles.asp">Articles</a></li>
    <li><a class="" href="products.asp">Packages &amp; Pricing</a></li>
    <li><a class="" href="gp.asp?gpid=15">test1</a></li>
    <li><a class=" last" href="gp.asp?gpid=17">test asher</a></li>
    </ul>
    </div>
    <div class="btn_login"> ... </div>
    </div>
    I hope someone has an idea. Thanks, Omer.

    Read the article

  • Something like a manual refresh is needed angularjs, and a $digest() iterations error

    - by Tony Ennis
    (post edited again, new comments follow this line) I'm changing the title of this posting since it was misleading - I was trying to fix a symptom. I was unable to figure out why the code was breaking with a $digest() iterations error. A plunk of my code worked fine. I was totally stuck, so I decided to make my code a little more Angular-like. One anti-pattern I had implemented was to hide my model behind my controller by adding getters/setters to the controller. I tore all that out and instead put the model into the $scope, since I had read that was proper Angular. To my surprise, the $digest() iterations error went away. I do not exactly know why, and I do not have the intestinal fortitude to put the old code back and figure it out. I surmise that by involving the controller in the get/put of the data I added a dependency under the hood. I do not understand it. Edit #2 ends here.
    (post edited, see EDIT below) I was working through my first "Error: 10 $digest() iterations reached. Aborting!" error today. I solved it this way:
    <div ng-init="lineItems = ctrl.getLineItems()">
    <tr ng-repeat="r in lineItems">
    <td>{{r.text}}</td>
    <td>...</td>
    <td>{{r.price | currency}}</td>
    </tr>
    </div>
    Now a new issue has arisen - the line items I'm producing can be modified by another control on the page. It's a text box for a promo code. The promo code adds a discount to the lineItem array. It would show up if I could ng-repeat over ctrl.getLineItems(). Since the ng-repeat is looking at a static variable, not the actual model, it doesn't see that the real line items have changed, and thus the promotional discount doesn't get displayed until I refresh the browser. Here's the HTML for the promo code:
    <input type="text" name="promo" ng-model="ctrl.promoCode"/>
    <button ng-click="ctrl.applyPromoCode()">apply promo code</button>
    The input tag is writing the value to the model. The ng-click in the button is invoking a function that will apply the code. This could change the data behind the lineItems. I have been advised to use $scope.$apply(...). However, since this is applied as a matter of course by ng-click, it isn't going to do anything. Indeed, if I add it to ctrl.applyPromoCode(), I get an error since a $apply() is already in progress. I'm at a loss.
    EDIT: The issue above is probably the result of me fixing a symptom, not a problem. Here is the original HTML that was dying with the 10 $digest() iterations error:
    <table>
    <tr ng-repeat="r in ctrl.getLineItems()">
    <td>{{r.text}}</td>
    <td>...</td>
    <td>{{r.price | currency}}</td>
    </tr>
    </table>
    The ctrl.getLineItems() function doesn't do much but invoke a model. I decided to keep the model out of the HTML as much as I could.
this.getLineItems = function() { var total = 0; this.lineItems = []; this.lineItems.push({text:"Your quilt will be "+sizes[this.size].block_size+" squares", price:sizes[this.size].price}); total = sizes[this.size].price; this.lineItems.push({text: threads[this.thread].narrative, price:threads[this.thread].price}); total = total + threads[this.thread].price; if (this.sashing) { this.lineItems.push({text:"Add sashing", price: this.getSashingPrice()}); total = total + sizes[this.size].sashing; } else { this.lineItems.push({text:"No sashing", price:0}); } if(isNaN(this.promo)) { this.lineItems.push({text:"No promo code", price:0}); } else { this.lineItems.push({text:"Promo code", price: promos[this.promo].price}); total = total + promos[this.promo].price; } this.lineItems.push({text:"Shipping", price:this.shipping}); total = total + this.shipping; this.lineItems.push({text:"Order Total", price:total}); return this.lineItems; }; And the model code assembled an array of objects based upon the items selected. I'll abbreviate the class as it croaks as long as the array has a row. function OrderModel() { this.lineItems = []; // Result of the lineItems call ... this.getLineItems = function() { var total = 0; this.lineItems = []; ... this.lineItems.push({text:"Order Total", price:total}); return this.lineItems; }; }
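    For readers hitting the same error: the pattern above, an ng-repeat bound to a function that builds and returns a brand-new array on every call, is a classic trigger for the "10 $digest() iterations" message, because the watched expression never stabilises between digest cycles. A minimal sketch of the $scope-based approach the poster switched to might look like the following (module, controller and property names are illustrative, not taken from the original code):
    angular.module('quiltApp', []).controller('OrderCtrl', function ($scope) {
        var model = new OrderModel();   // the existing model object from the post

        // Build the line items once and keep the resulting array on $scope, so that
        // ng-repeat="r in lineItems" watches a stable reference between digests.
        function refreshLineItems() {
            $scope.lineItems = model.getLineItems();
        }

        $scope.applyPromoCode = function () {
            model.promo = parseInt($scope.promoCode, 10);   // illustrative: however the model accepts the code
            refreshLineItems();                             // recompute only when an input actually changes
        };

        refreshLineItems();   // initial population
    });
    With this shape the view binds to lineItems directly and no manual $scope.$apply() is needed, since ng-click already runs inside a digest.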

    Read the article

  • Unlimited SMS API/Gateway (Sending and Receiving)

    - by Naif
    I am creating a chat application which requires that users be able to send and receive sms messages through a web interface. It would be somewhat similar to the text messaging service available in yahoo mail or in aol instant messenger. The situation is this: Given the high quantity of messages that would be sent and received, paying on a per message basis is not economically feasible. How is it that sites such as yahoo, aim, and twitter can send and receive unlimited sms messages? Essentially, I am looking for a way to send and receive unlimited sms from my computer. Below is a list of some approaches I've come up with but have run into problems with as well. If just one of the approaches can be utilized effectively, then I am fine. As a note on the nature of my application: I will only be sending messages to users that explicitly sign up for the service and permit the receiving of messages. They can unsubscribe at any time. This is to prevent spam. I am aware of software such as Kannel which allows one to connect to a providers smsc gateway. However, this adds the risk of not being approved by the provider which would be unacceptable. Is there any way to significantly mitigate this risk? Utilizing a gateway provider eliminates this risk, but adds the issue of per message pricing. I am also aware of email to sms. However, I have done some testing with that and it appears that this method results in many messages being undelivered or delivered VERY late. If it weren't for that, this approach would have been ideal. Is there any way to negotiate with carriers to remove me from their filters (considering the nature of my service as stated before)? I could use a gsm modem, but even with an "unilimited" plan on a sim card, there are still limits (around 100,000 messages or so). Furthermore, from my understanding, gsm modems are capable of only sending out around a dozen messages per minute. I need to be able to send out as much as several hundred messages per second. During the first 2 months, around 10 messages per second would suffice. There are ways to send out ads with messages to cover the per message costs. However, this is a deal-breaker since it has a high chance of tarnishing the quality of service. Furthermore, I know it is possible to do it without such ads since yahoo, aim, and twitter do not send ads with their messages.

    Read the article

  • Recommended ASP.NET Shared Hosting

    - by coffeeaddict
    Ok, I have to admit I'm getting fed up with www.discountasp.net's pricing model and this annoyance has built up over the past 8 years or so. I've been with them for years and absolutely love them on the technical side, however it's getting ridiculously expensive for so little that you get. I mean here's my scenario: 1) I am running 2 SQL Server databases which costs me $10/ea per month so that's $20/month for 2 and I only get 500 mb disk space which is horrible 2) I am paying $10/mo just for the hosting itself which I only get 1 gig of disk space! I mean common! 3) I am simply running 2 small apps (Screwturn Wiki & Subtext Blog)...so I don't really care if it's up 99% or not, it's not worth paying a total of $300 just to keep these 2 apps running over discountasp.net Anyone else feel the same? Yes, I know they have great support, probably have great servers running behind this but in the end I really don't care as long as my site is up 95% or better. Yes, the hosting toolset rocks. But you know I bet you I can find a similar set somewhere else. I like how I can totally control IIS 7 at discountasp and I can control my own app pool etc. That's very powerful and essential. But anyone have any good alternatives to discountasp that gives me close to the same at a much more reasonable cost point? I mean http://www.m6.net/prices.aspx gives you 10 SQL Databases for $7 and 200 gigs disk space! I don't know about their tools or support but just looking at those numbers and some other hosts I've seen, I feel that discountasp.net is way out of line. They don't even offer any purchasing discounts such as it would be nice if my 2nd SQL Server is only $5/month not $10...stuff like this, to make it much more realistic and fair. Opinions (people who do have discountasp.net, people who have left them, or people who have another host they like)??? But geez $300 just to host a couple DBs and lightweight open source apps? Not worth the price they are charging. I'm almost at a price point that enables me to get a decent dedicated server! I really don't care about beta support. Not a big deal to me.

    Read the article

  • In R, how do you get the best fitting equation to a set of data?

    - by Matherion
    I'm not sure whether R can do this (I assume it can, but maybe that's just because I tend to assume that R can do anything :-)). What I need is to find the best fitting equation to describe a dataset. For example, if you have these points: df = data.frame(x = c(1, 5, 10, 25, 50, 100), y = c(100, 75, 50, 40, 30, 25)) How do you get the best fitting equation? I know that you can get the best fitting curve with: plot(loess(df$y ~ df$x)) But as I understood you can't extract the equation, see Loess Fit and Resulting Equation. When I try to build it myself (note, I'm not a mathematician, so this is probably not the ideal approach :-)), I end up with something like: y.predicted = 12.71 + ( 95 / (( (1 + df$x) ^ .5 ) / 1.3)) Which kind of seems to approximate it - but I can't help thinking that something more elegant probably exists :-) I have the feeling that fitting a linear or polynomial model also wouldn't work, because the formula seems different from what those models generally use (i.e. this one seems to need divisions, powers, etc). For example, the approach in Fitting polynomial model to data in R gives pretty bad approximations. I remember from a long time ago that there exist languages (Matlab may be one of them?) that do this kind of stuff. Can R do this as well, or am I just in the wrong place? (Background info: basically, what we need to do is find an equation for determining numbers in the second column based on the numbers in the first column; but we decide the numbers ourselves. We have an idea of how we want the curve to look, but we can adjust these numbers to an equation if we get a better fit. It's about the pricing for a product (a cheaper alternative to current expensive software for qualitative data analysis); the more 'project credits' you buy, the cheaper it should become. Rather than forcing people to buy a given number (i.e. 5 or 10 or 25), it would be nicer to have a formula so people can buy exactly what they need - but of course this requires a formula. We have an idea for some prices we think are ok, but now we need to translate this into an equation.)
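    For what it's worth, one way to have R choose the constants for exactly this kind of curve is nls(); the sketch below (not from the original post) fits the same functional form as the hand-built formula, y = a + b / (1 + x)^c, using that manual attempt as starting values:
    # Fit y = a + b / (1 + x)^c by nonlinear least squares.
    df <- data.frame(x = c(1, 5, 10, 25, 50, 100),
                     y = c(100, 75, 50, 40, 30, 25))

    fit <- nls(y ~ a + b / (1 + x)^c,
               data  = df,
               start = list(a = 12.7, b = 123.5, c = 0.5))  # roughly the hand-built formula

    summary(fit)   # fitted constants and their standard errors
    coef(fit)      # the parameters of the pricing equation
    predict(fit, newdata = data.frame(x = 1:100))   # predicted price for any credit count
    Other functional forms (for example a + b * exp(-c * x)) can be tried the same way; whichever converges with the smallest residual error gives the formula to publish.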

    Read the article

  • Include weather information in ASP.Net site from weather.com services

    - by sreejukg
    In this article, I am going to demonstrate how you can use the XMLOAP services (referred to as XOAP from here onwards) provided by weather.com to display weather information in your website. The XOAP services are available for use free of charge, provided you comply with the requirements from weather.com. I am writing this article from a technical point of view. If you are planning to use weather.com XOAP services in your application, please refer to the terms and conditions on the weather.com website.
    In order to start using the XOAP services, you need to sign up for the XOAP data feed. The sign-up process is simple: you simply browse the URL http://www.weather.com/services/xmloap.html. The URL looks similar to the following. Click on the sign up button and you will reach the registration page. Here you need to specify the site name you need to use this feed for. The form looks similar to the following. Once you fill in all the mandatory information, click on the save and continue button. That’s it. The registration is over. You will receive an email that contains your partner id, license key and SDK. The SDK, available in a zipped format, contains the terms of use and documentation about the services available. Other than this, the SDK includes the logos and icons required to display the weather information. As per the SDK, there are currently 2 types of information available through XOAP:
    - Current Conditions for over 30,000 U.S. and over 7,900 international Location IDs, updated at least hourly
    - Five-Day Forecast (today + 4 additional forecast days in consecutive order beginning with tomorrow) for over 30,000 U.S. and over 7,900 international Location IDs, updated at least three times daily
    The SDK provides detailed information about the fields included in the response of each service. Additionally, there is a refresh rate that you need to comply with. As per the SDK, the refresh rate means the following: “Refresh Rate” shall mean the maximum frequency with which you may call the XML Feed for a given LocID requesting a data set for that LocID. During the time period in between refresh periods the data must be cached by you either in the memory on your servers or in Your Desktop Application.
    About the Services
    Weather.com will provide you with access to the XML Feed over the Internet through the hostname xoap.weather.com. The weather data from the XML feed must be requested for a specific location, so you need a location ID (LOC ID). The XML feed works with 2 types of location IDs: the first uses city identifiers and the second uses 5-digit US postal codes. If you do not know your location ID, don’t worry; there is a location ID search service available for you to retrieve the location ID from a city name. Since I am a resident of the Kingdom of Bahrain, I am going to retrieve the weather information for Manama (the capital of Bahrain). In order to get the location ID for Manama, type the following URL in your address bar: http://xoap.weather.com/search/search?where=manama
    I got the following XML output.
    <?xml version="1.0" encoding="UTF-8"?>
    <!-- This document is intended only for use by authorized licensees of The -->
    <!-- Weather Channel. Unauthorized use is prohibited. Copyright 1995-2011, -->
    <!-- The Weather Channel Interactive, Inc. All Rights Reserved. -->
    <search ver="3.0">
          <loc id="BAXX0001" type="1">Al Manama, Bahrain</loc>
    </search>
    You can try this with any city name; if the city is available, it will return the location ID, otherwise it will return nothing.
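    If you prefer to resolve location IDs from code rather than from the browser's address bar, a small LINQ to XML sketch in the same style as the code used later in this article could look like this (the city name is just an example):
    using System;
    using System.Linq;
    using System.Xml.Linq;

    class LocationLookup
    {
        static void Main()
        {
            // XDocument.Load can read the search endpoint directly from the URL.
            XDocument result = XDocument.Load("http://xoap.weather.com/search/search?where=manama");

            // Each <loc> element carries the location ID in its "id" attribute.
            var locations = from loc in result.Descendants("loc")
                            select new { Id = (string)loc.Attribute("id"), Name = loc.Value };

            foreach (var l in locations)
                Console.WriteLine("{0} : {1}", l.Id, l.Name);   // e.g. BAXX0001 : Al Manama, Bahrain
        }
    }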
In order to get the weather information from XOAP, you need to pass certain parameters to the XOAP service. A brief description of the parameters follows; please refer to the SDK for more details.
cc - Optional. If you include this, the current conditions will be returned. The value can be anything, as it will be ignored, e.g. cc=*
dayf - Optional. If you want the 5-day forecast, specify dayf=5
link - The value should be xoap
par - Your partner ID. You can find this in your registration email from weather.com
prod - The value should be xoap
key - The license key assigned to you. This is also available in the registration email
unit - Optional. s or m (standard or metric, i.e. Fahrenheit or Celsius). If not specified, the unit will be standard (s)
The URL host for the XOAP service is http://xoap.weather.com. So for my purpose, I need to make the following request to access the XOAP service. http://xoap.weather.com/weather/local/BAXX0001?cc=*&link=xoap&prod=xoap&par=*********&key=************** (The ***** are to be replaced with your corresponding values.) The response XML has a root element "weather". Under the root element, it has the following sections:
<head> - the metadata about the weather results returned
<loc> - the location data block that provides information about the location for which the weather data is retrieved
<lnks> - the 4 promotional links you need to place along with the weather display. In addition to these 4 links, there should be another link with the Weather Channel logo pointing to the home page of weather.com
<cc> - the current conditions data. This element is present only if you specify the cc parameter in the request
<dayf> - the forecast data as you specified. This element is present only if you specify dayf in the request
In this walkthrough, I am going to capture the weather information for Manama (Location ID: BAXX0001). You need 2 applications to display weather information on your website: a console application that retrieves data from the XOAP feed and stores it in a SQL Server database (or any data store you prefer) - this application will be scheduled to execute every 25 minutes using Windows Task Scheduler, so that we comply with the refresh rate - and a web application that displays the data from the SQL Server database. Retrieve the weather from XOAP: I have created a console application named Weather Service. I created a SQL Server table with the following columns; I named the table tblweather, but you are free to choose any name.
lastUpdated - Datetime; the last time the weather data was updated, i.e. the time the service ran
TemparatureDateTime - The date and time returned by the XML feed
Temparature - The temperature returned by the XML feed
TemparatureUnit - The unit of the temperature returned by the XML feed
iconId - The ID of the icon to be used. Currently 48 icons, numbered 0 to 47, are available
WeatherDescription - The weather description phrase returned by the feed
Link1url / Link1Text - The URL and text for the first promo link
Link2url / Link2Text - The URL and text for the second promo link
Link3url / Link3Text - The URL and text for the third promo link
Link4url / Link4Text - The URL and text for the fourth promo link
Every time the service runs, the application will update these columns from the XOAP data feed.
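When you write the console application in the next step, you may find it convenient to build the request URL from the individual parameters rather than hard-coding the full string. Below is a minimal C# sketch of such a helper; the partner ID and license key shown are placeholders that you must replace with the values from your registration email.

// build the XOAP request URL from the parameters described above
using System;

class XoapUrlBuilder
{
    static string BuildWeatherUrl(string locationId, string partnerId, string licenseKey)
    {
        // cc=* requests the current conditions; unit=s requests standard (Fahrenheit) units
        return string.Format(
            "http://xoap.weather.com/weather/local/{0}?cc=*&link=xoap&prod=xoap&par={1}&key={2}&unit=s",
            locationId, partnerId, licenseKey);
    }

    static void Main()
    {
        // placeholder credentials - use the partner ID and key from your registration email
        Console.WriteLine(BuildWeatherUrl("BAXX0001", "YOUR_PARTNER_ID", "YOUR_LICENSE_KEY"));
    }
}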
When the application starts, it gets the data as XML from the URL. This demonstration uses LINQ to extract the necessary data from the fetched XML. The following is the code segment for extracting data from the weather XML using LINQ. // first, create an instance of the XDocument class with the XOAP URL. replace **** with the corresponding values. XDocument weather = XDocument.Load("http://xoap.weather.com/weather/local/BAXX0001?cc=*&link=xoap&prod=xoap&par=***********&key=c*********"); // construct a query using LINQ var feedResult = from item in weather.Descendants() select new { unit = item.Element("head").Element("ut").Value, temp = item.Element("cc").Element("tmp").Value, tempDate = item.Element("cc").Element("lsup").Value, iconId = item.Element("cc").Element("icon").Value, description = item.Element("cc").Element("t").Value, links = from link in item.Elements("lnks").Elements("link") select new { url = link.Element("l").Value, text = link.Element("t").Value } }; // Load the root node to a variable, you may use a foreach construct instead. var item1 = feedResult.First(); If you want to learn more about LINQ and XML, read this nice blog post from Scott Gu: http://weblogs.asp.net/scottgu/archive/2007/08/07/using-linq-to-xml-and-how-to-build-a-custom-rss-feed-reader-with-it.aspx Now you have all the required values in item1. For example, if you want to get the temperature, use item1.temp. Now I just need to execute a SQL query against the database. See the connection and update code below. using (SqlConnection conn = new SqlConnection(@"Data Source=sreeju\sqlexpress;Initial Catalog=Sample;Integrated Security=True")) { string strSql = @"update tblweather set lastupdated=getdate(), temparatureDateTime = @temparatureDateTime, temparature=@temparature, temparatureUnit=@temparatureUnit, iconId = @iconId, description=@description, link1url=@link1url, link1text=@link1text, link2url=@link2url, link2text=@link2text,link3url=@link3url, link3text=@link3text,link4url=@link4url, link4text=@link4text"; SqlCommand comm = new SqlCommand(strSql, conn); comm.Parameters.AddWithValue("temparatureDateTime", item1.tempDate); comm.Parameters.AddWithValue("temparature", item1.temp); comm.Parameters.AddWithValue("temparatureUnit", item1.unit); comm.Parameters.AddWithValue("description", item1.description); comm.Parameters.AddWithValue("iconId", item1.iconId); var lstLinks = item1.links; comm.Parameters.AddWithValue("link1url", lstLinks.ElementAt(0).url); comm.Parameters.AddWithValue("link1text", lstLinks.ElementAt(0).text); comm.Parameters.AddWithValue("link2url", lstLinks.ElementAt(1).url); comm.Parameters.AddWithValue("link2text", lstLinks.ElementAt(1).text); comm.Parameters.AddWithValue("link3url", lstLinks.ElementAt(2).url); comm.Parameters.AddWithValue("link3text", lstLinks.ElementAt(2).text); comm.Parameters.AddWithValue("link4url", lstLinks.ElementAt(3).url); comm.Parameters.AddWithValue("link4text", lstLinks.ElementAt(3).text); conn.Open(); comm.ExecuteNonQuery(); conn.Close(); Console.WriteLine("database updated"); } Now press Ctrl + F5 to run the service. The console prints "database updated". Check your database and make sure the data is updated with the latest information from the service. (Make sure you insert one row into the table with some placeholder values before executing the service for the first time; otherwise you need to modify your application code to count the rows and conditionally perform an insert or update, as sketched below.)
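For completeness, here is a minimal sketch of that conditional insert/update check, assuming the same tblweather table and connection used in this walkthrough. Only the row-count logic and one parameter are shown; the remaining columns and parameters are handled exactly as in the update code above.

// decide between INSERT and UPDATE so the service also works against an empty table
using System.Data.SqlClient;

class WeatherUpsert
{
    static void Upsert(SqlConnection conn, string temparature)
    {
        conn.Open();

        // check whether the single weather row already exists
        SqlCommand countCommand = new SqlCommand("select count(*) from tblweather", conn);
        int rowCount = (int)countCommand.ExecuteScalar();

        // first run: seed the row; later runs: refresh it
        string strSql = rowCount == 0
            ? "insert into tblweather (lastupdated, temparature) values (getdate(), @temparature)"
            : "update tblweather set lastupdated = getdate(), temparature = @temparature";

        SqlCommand comm = new SqlCommand(strSql, conn);
        // add the remaining parameters exactly as in the code above
        comm.Parameters.AddWithValue("temparature", temparature);
        comm.ExecuteNonQuery();

        conn.Close();
    }
}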
Display the weather information in an ASP.NET page: Now you have all the data in the database. You just need to create a web application and display the data from the database. I created a new ASP.NET web application with a default.aspx page. In order to comply with the terms of weather.com, you need to use the weather.com logo along with the weather display. You can find the necessary logos under the folder "logos" in the SDK. Additionally, copy one of the icon sets from the folder "icons" to your web application; I used the 93x93 icon set, but you are free to use any of the other available sizes. In the design view of the page in VS2010, the page contains a heading, an image control (for displaying the weather icon), 2 label controls (for displaying the temperature and the weather description), 4 hyperlinks (for displaying the 4 promo links returned by the XOAP service) and the weather.com logo with a hyperlink to the weather.com home page. I am going to write code that updates the values of these controls from the values stored in the database by the service application described in the previous step. Go to the code-behind file for the web page and enter the following code in the Page_Load event handler. using (SqlConnection conn = new SqlConnection(@"Data Source=sreeju\sqlexpress;Initial Catalog=Sample;Integrated Security=True")) { SqlCommand comm = new SqlCommand("select top 1 * from tblweather", conn); conn.Open(); SqlDataReader reader = comm.ExecuteReader(); if (reader.Read()) { lblTemparature.Text = reader["temparature"].ToString() + "&deg;" + reader["temparatureUnit"].ToString(); lblWeatherDescription.Text = reader["description"].ToString(); imgWeather.ImageUrl = "icons/" + reader["iconId"].ToString() + ".png"; lnk1.Text = reader["link1text"].ToString(); lnk1.NavigateUrl = reader["link1url"].ToString(); lnk2.Text = reader["link2text"].ToString(); lnk2.NavigateUrl = reader["link2url"].ToString(); lnk3.Text = reader["link3text"].ToString(); lnk3.NavigateUrl = reader["link3url"].ToString(); lnk4.Text = reader["link4text"].ToString(); lnk4.NavigateUrl = reader["link4url"].ToString(); } conn.Close(); } Press Ctrl + F5 to run the page and you will see the weather displayed. That's it. You need to configure the console application to run every 25 minutes so that the database stays updated. You can also fetch the forecast information, store it in the database, and retrieve it later in your web page. Since the data resides in your database, you have full control over the display. You need to make sure your website complies with the weather.com license requirements. If you want the source code of this walkthrough, post your email address below. Hope you enjoyed reading.

    Read the article

  • CodePlex Daily Summary for Tuesday, October 23, 2012

    CodePlex Daily Summary for Tuesday, October 23, 2012Popular ReleasesDNN Module Creator: 01.01.00: Updated templates for DNN7 ( ie. DAL2, Web Service API ). Numerous bug fixes and enhancements.WPF Application Framework (WAF): WPF Application Framework (WAF) 2.5.0.390: Version 2.5.0.390 (Release Candidate): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Changelog Legend: [B] Breaking change; [O] Marked member as obsolete WAF: Fix recent file list remove issue. WAF: Minor code improvements. BookLibrary: Fix Blend design time support o...ltxml.js - LINQ to XML for JavaScript: 1.0 - Beta 1: First release!EvoGame: EvoGame PreAlpha v0.0.2_c InDev: Yup, a new update, go the sprites working.Ficharts.Net: 1.0 Alpha: ??Ficharts????、???????????,?? ??/???、??/???、???、??/???ZXMAK2: Version 2.6.6.0: + fix refresh debugger after open RZX file + add NoFlic video filterSQLLib: Alpha release 17: Added CLR UDFs: * clr.fn_regex_instr - similar to Oracle REGEX_INSTR * clr.fn_regex_substr - similar to Oracle REGEX_SUBSTR To deploy CLR objects copy ClrAgg.dll and ClrRegEx.dll to a folder of you choice (currently deployment script points to C:\Program Files\Microsoft SQL Server\100\CLR\ClrAgg.dll) and execute deployment scripts InstallCLRAggregates.sql and InstallCLRRegEx.sql Thank you for rating the download and/or your feedback.EPiServer CMS ElencySolutions.MultipleProperty: ElencySolutions.MultipleProperty v1.6.3: The ElencySolutions.MulitpleProperty property controls have been developed by Lee Crowe a technical developer at Fortune Cookie (London). Installation notes The property copy page can be locked down by adding the following location element, the path of this will be different depending on whether you use the embedded or non embedded resource version. When installing the nuget package these will be added automatically, examples below: Embedded: <location path="util/ElencySolutionsMultipleP...Fiskalizacija za developere: FiskalizacijaDev 1.1: Ovo je prva nadogradnja ovog projekta nakon inicijalnog predstavljanja - dodali smo nekoliko feature-a, bilo zato što smo sami primijetili da bi ih bilo dobro dodati, bilo na osnovu vaših sugestija - hvala svima koji su se ukljucili :) Ovo su stvari riješene u v1.1.: 1. Bilo bi dobro da se XML dokument koji se šalje u CIS može snimiti u datoteku (http://fiskalizacija.codeplex.com/workitem/612) 2. Podrška za COM DLL (VB6) (http://fiskalizacija.codeplex.com/workitem/613) 3. Podrška za DOS (unu...MCEBuddy 2.x: MCEBuddy 2.3.4: Changelog for 2.3.4 (32bit and 64bit) 1. Fixed a bug introduced in 2.3.3 that would cause HD recordings and recordings with multiple audio channels to fail. 2. Updated <encoder-unsupported> option to compare with all Audio tracks for videos with multiple audio tracks. 3. Fixed a bug with SRT and EDL files, when input and output directory are the same the files are not preserved.BlogEngine.NET: BlogEngine.NET 2.7 RC: Cheap ASP.NET Hosting - $4.95/Month - Click Here!! Click Here for More Info Cheap ASP.NET Hosting - $4.95/Month - Click Here! dot This is a Release Candidate version for BlogEngine.NET 2.7. The most current, stable version of BlogEngine.NET is version 2.6. Find out more about the BlogEngine.NET 2.7 RC here. To get started, be sure to check out our installation documentation. 
If you are upgrading from a previous version, please take a look at the Upgrading to BlogEngine.NET 2.7 instructions...Pulse: Pulse 0.6.3.0: Fixed a number of bugs that showed up since my update yesterday. Fixes included are for: - Weird issue where the initial "Nature" wallbase.cc search would duplicate itself - After changing a providers settings it wouldn't take affect until you restarted Pulse (removing or adding a provider entirely did take effect though) - Another small issue with the regex for the wallbase.cc wallpapers that I tweaked yesterday, seems good now though.Liberty: v3.4.0.0 Release 20th October 2012: Change Log -Added -Halo 4 support (invincibility, ammo editing) -Reach A warning dialog now shows up when you first attempt to swap a weapon -Fixed -A few minor bugsClosedXML - The easy way to OpenXML: ClosedXML 0.68.1: ClosedXML now resolves formulas! Yes it finally happened. If you call cell.Value and it has a formula the library will try to evaluate the formula and give you the result. For example: var wb = new XLWorkbook(); var ws = wb.AddWorksheet("Sheet1"); ws.Cell("A1").SetValue(1).CellBelow().SetValue(1); ws.Cell("B1").SetValue(1).CellBelow().SetValue(1); ws.Cell("C1").FormulaA1 = "\"The total value is: \" & SUM(A1:B2)"; var...Orchard Project: Orchard 1.6 RC: RELEASE NOTES This is the Release Candidate version of Orchard 1.6. You should use this version to prepare your current developments to the upcoming final release, and report problems. Please read our release notes for Orchard 1.6 RC: http://docs.orchardproject.net/Documentation/Orchard-1-6-Release-Notes Please do not post questions as reviews. Questions should be posted in the Discussions tab, where they will usually get promptly responded to. If you post a question as a review, you wil...Rawr: Rawr 5.0.1: This is the Downloadable WPF version of Rawr!For web-based version see http://elitistjerks.com/rawr.php You can find the version notes at: http://rawr.codeplex.com/wikipage?title=VersionNotes Rawr Addon (NOT UPDATED YET FOR MOP)We now have a Rawr Official Addon for in-game exporting and importing of character data hosted on Curse. The Addon does not perform calculations like Rawr, it simply shows your exported Rawr data in wow tooltips and lets you export your character to Rawr (including ba...Yahoo! UI Library: YUI Compressor for .Net: Version 2.1.1.0 - Sartha (BugFix): - Revered back the embedding of the 2x assemblies.Visual Studio Team Foundation Server Branching and Merging Guide: v2.1 - Visual Studio 2012: Welcome to the Branching and Merging Guide What is new? The Version Control specific discussions have been moved from the Branching and Merging Guide to the new Advanced Version Control Guide. The Branching and Merging Guide and the Advanced Version Control Guide have been ported to the new document style. See http://blogs.msdn.com/b/willy-peter_schaub/archive/2012/10/17/alm-rangers-raising-the-quality-bar-for-documentation-part-2.aspx for more information. Quality-Bar Details Documentatio...D3 Loot Tracker: 1.5.5: Compatible with 1.05.Write Once, Play Everywhere: MonoGame 3.0 (BETA): This is a beta release of the up coming MonoGame 3.0. It contains an Installer which will install a binary release of MonoGame on windows boxes with the following platforms. Windows, Linux, Android and Windows 8. If you need to build for iOS or Mac you will need to get the source code at this time as the installers for those platforms are not available yet. 
The installer will also install a bunch of Project templates for Visual Studio 2010 , 2012 and MonoDevleop. For those of you wish...New ProjectsAddition of two numbers: Addition of two integer numbersAddTwoNumbers: Add two numbersASP_BANMAYTINH: Xây d?ng web bán máy tính b?ng ASPAvalon MVC: Do not use, still in alphaCaio Proiete's HG Playground: Simple test project to leverage Mercurial features using CodePlexCaio Proiete's TFS Playground: Simple test project to leverage TFS features using CodePlexcodeplexaddproject: Task 1 adding two numbers.Compresor markov orden 1 shannon: Compresor de fuentes basado en el algoritmo de shannon con markov orden 1Cricket Mania: addd39 grid system: A web-based combat grid system for use in play-by-post DnD (or similar) role playing games.DarkSky Tagit: An Orchard module that exposes the jQuery Tagit plugin written by Hailwood as a script resource.DnnExpert: ??? ????? ?? ???? ???? ????? ? ???? ??? ?????? ??? ?? ???? ?? ???? ?? ???????? ???? ?? ???? ??? ???? ?? ?? ? ?????? ????? ???? ? ?????? ? ???? ???? ?? ????.Expandable Text/HTML for DotNetNuke by IowaComputerGurus Inc.: The DNN Expandable Text/HTML module allows you to display multiple text items with the ability to expand and collapse individual items.FarajsWeb2Project: This project is intended to design a Web2.0 website for 7COM0203 ModuleGeminorum Software Contacts for DotNetNuke: A simple contact manager for DotNetNuke.GitText: Test olyGoDarting by Harsh Maurya: Darts Game developed in WPF. requires .Net Framework 4.0Info Gempa BMKG: Aplikasi pembaca informasi gempa BMKGMassive encryption of files: "Massive Encrypt" allows you to encrypt or rename many files at once. Of course you can decrypt later encrypted files!Metal Player: Simple and easy to use, Metal Player has basic multimedia player functions, and some new functions that will enjoy you.NAntDefineTasks: NAnt Define Tasks allows you to define NAnt tasks in terms of other NAnt tasks, instead of having to write any C# code. Nebulosa: Nebulosa is a complete engine to create a complete websitesNigeria Single Mothers: This website project helps single mothers in Nigeria share ideas on how to raise children given the socio-economic and cultural challenges they face.PCV_Clinic_Pro: PCVClinicPro is a software proRendering.NET: Rendering.NET is an abstraction for any visualization device and over several APIs like OpenGL, DirectX, XNA, WebGL, WPF, Silverlight, Mobile DirectX, etc.RLA: A template for illustrate a MVC2 websiteSecure Password Recovery for DotNetNuke by IowaComputerGurus Inc.: IowaComputerGurus's Secure Password Recovery module is the next step in preventing user passwords from being sent via e-mail!SimpleSum: This calculates a simple sum using Visual Basic.SiteCetic: Sitio de CETIC Social Learning: Social Learning Project for BC 2012Towards a generic DSL for modeling page types in WCMSs: An exploration of creating DSLs to facilitate the creation of page models in WCMSs using VMSDK. The concepts of PIM, PSM, DSL, M2M, and M2T will be explored.TrafficArchives: TrafficArchives is a two people group of TrafficArchives team, in this project ,we will use asp.net do Traffic Archives information manager system.UnivDevs: university test developmentUppityUp: UppityUp is a simple and light-weight tray application which monitors a remote server and shows a notification when it comes online. 
This is useful when you need to connect to a server that is currently down and you want to be notified the moment it becomes available.uurrooster: Hier wordt nog aan gewerktUWE Computer Science: A collection of all work submitted and completed during my course at UWE - Bristol.ViewMyDeals: This Site is all about sharing dramatic deals and offers of several products using Promotional codes and vouchers .WebUntis4Win8: WebUntis4Win8X.MetaWeblog.Model: This is a model for MetaWeblog API. Detail info at: http://xmlrpc.scripting.com/metaWeblogApi.html http://en.wikipedia.org/wiki/MetaWeblogXPath execution utility: CommonXPath is a utility to execute an XPath expression on some XML and see the result.

    Read the article

  • Issue 15: The Benefits of Oracle Exastack

    - by rituchhibber
    SOLUTIONS FOCUS: The Benefits of Oracle Exastack. Paul Thompson, Director, Alliances and Solutions Partner Programs, Oracle EMEA Alliances & Channels. Resources: Oracle PartnerNetwork (OPN), Oracle Exastack Program, Oracle Exastack Ready, Oracle Exastack Optimized, Oracle Exastack Labs and Enablement Resources, Oracle Exastack Labs Video Tour. Exastack is a revolutionary programme supporting Oracle independent software vendor partners across the entire Oracle technology stack. Oracle's core strategy is to engineer software and hardware together, and our ISV strategy is the same. At Oracle we design engineered systems that are pre-integrated to reduce the cost and complexity of IT infrastructures while increasing productivity and performance. Oracle innovates and optimises performance at every layer of the stack to simplify business operations, drive down costs and accelerate business innovation. Our engineered systems are optimised to achieve enterprise performance levels that are unmatched in the industry. Faster time to production is achieved by implementing pre-engineered and pre-assembled hardware and software bundles. Our strategy of delivering a single-vendor stack simplifies and reduces costs associated with purchasing, deploying, and supporting IT environments for our customers and partners. In parallel to this core engineered systems strategy, the Oracle Exastack Program enables our Oracle ISV partners to leverage a scalable, integrated infrastructure that delivers their applications tuned, tested and optimised for high performance. Specifically, the Oracle Exastack Program helps ISVs run their solutions on the Oracle Exadata Database Machine, Oracle Exalogic Elastic Cloud, and Oracle SPARC SuperCluster T4-4 - integrated systems products in which the software and hardware are engineered to work together. These products provide OPN members with a lower cost and high performance infrastructure for database and application workloads across on-premise and cloud based environments. Ready and Optimized: Oracle Partners can now leverage our new Oracle Exastack Program to become Oracle Exastack Ready and Oracle Exastack Optimized. Partners can achieve Oracle Exastack Ready status through their support for Oracle Solaris, Oracle Linux, Oracle VM, Oracle Database, Oracle WebLogic Server, Oracle Exadata Database Machine, Oracle Exalogic Elastic Cloud, and Oracle SPARC SuperCluster T4-4. By doing this, partners can demonstrate to their customers that their applications are available on the latest major releases of these products. The Oracle Exastack Ready programme helps customers readily differentiate Oracle partners from lesser software developers, and identify applications that support Oracle engineered systems. Achieving Oracle Exastack Optimized status demonstrates that an OPN member has proven itself against goals for performance and scalability on Oracle integrated systems. This status enables end customers to readily identify Oracle partners that have tested and tuned their solutions for optimum performance on an Oracle Exadata Database Machine, Oracle Exalogic Elastic Cloud, and Oracle SPARC SuperCluster T4-4. These ISVs can display the Oracle Exadata Optimized, Oracle Exalogic Optimized or Oracle SPARC SuperCluster Optimized logos on websites and on all their collateral to show that they have tested and tuned their application for optimum performance.
Deliver higher value to customers Oracle's investment in engineered systems enables ISV partners to deliver higher value to customer business processes. New innovations are enabled through extreme performance unachievable through traditional best-of-breed multi-vendor server/software approaches. Core product requirements can be launched faster, enabling ISVs to focus research and development investment on core competencies in order to bring value to market as quickly as possible. Through Exastack, partners no longer have to worry about the underlying product stack, which allows greater focus on the development of intellectual property above the stack. Partners are not burdened by platform issues and can concentrate simply on furthering their applications. The advantage to end customers is that partners can focus all efforts on business functionality, rather than bullet-proofing underlying technologies, and so will inevitably deliver application updates faster. Exastack provides ISVs with a number of flexible deployment options, such as on-premise or Cloud, while maintaining one single code base for applications regardless of customer deployment preference. Customers buying their solutions from Exastack ISVs can therefore be confident in deploying on their own networks, on private clouds or into a public cloud. The underlying platform will support all conceivable deployments, enabling a focus on the ISV's application itself that wouldn't be possible with other vendor partners. It stands to reason that Exastack accelerates time to value as well as lowering implementation costs all round. There is a big competitive advantage in partners being able to offer customers an optimised, pre-configured solution rather than an assortment of components and a suggested fit. Once a customer has decided to buy an Oracle Exastack Ready or Optimized partner solution, it will be up and running without any need for the customer to conduct testing of its own. Operational costs and complexity are also reduced, thanks to streamlined customer support through standardised configurations and pro-active monitoring. 'Engineered to Work Together' is a significant statement of Oracle strategy. It guarantees smoother deployment of a single vendor solution, clear ownership with no finger-pointing and the peace of mind of the Oracle Support Centre underpinning the entire product stack. Next steps Every OPN member with packaged applications must seriously consider taking steps to become Exastack Ready, or Exastack Optimized at the first opportunity. That first step down the track is to talk to an expert on the OPN Portal, at the Oracle Partner Business Center or to discuss the next steps with the closest Oracle account manager. Oracle Exastack lab environments and other technical enablement resources are available for OPN members wishing to further their knowledge of Oracle Exastack and qualify their applications for Oracle Exastack Optimized. New Boot Camps and Guided Learning Paths (GLPs), tailored specifically for ISVs, are available for Oracle Exadata Database Machine, Oracle Exalogic Elastic Cloud, Oracle Linux, Oracle Solaris, Oracle Database, and Oracle WebLogic Server. More information about these GLPs and Boot Camps (including delivery dates and locations) are posted on the OPN Competency Center and corresponding OPN Knowledge Zones. Learn more about Oracle Exastack labs and ISV specific enablement resources. 
"Oracle Specialized partners are of course front-and-centre, with potential customers clearly directed to those partners and to Exadata Ready partners as a matter of priority." --More OpenWorld 2011 highlights for Oracle partners and customers Oracle Application Testing Suite 9.3 application testing solution for Web, SOA and Oracle Applications Oracle Application Express Release 4.1 improving the development of database-centric Web 2.0 applications and reports Oracle Unified Directory 11g helping customers manage the critical identity information that drives their business applications Oracle SOA Suite for healthcare integration Oracle Enterprise Pack for Eclipse 11g demonstrating continued commitment to the developer and open source communities Oracle Coherence 3.7.1, the latest release of the industry's leading distributed in-memory data grid Oracle Process Accelerators helping to simplify and accelerate time-to-value for customers' business process management initiatives Oracle's JD Edwards EnterpriseOne on the iPad meeting the increasingly mobile demands of today's workforces Oracle CRM On Demand Release 19 Innovation Pack introducing industry-leading hosted call centre and enterprise-marketing capabilities designed to drive further revenue and productivity while reducing costs and improving the customer experience Oracle's Primavera Portfolio Management 9 for businesses delivering on project portfolio goals with increased versatility, transparency and accuracy Oracle's PeopleSoft Human Capital Management (HCM) 9.1 On Demand Standard Edition helping customers manage their long-term investment in enterprise-wide business applications New versions of Oracle FLEXCUBE Universal Banking and Oracle FLEXCUBE Investor Servicing for Financial Institutions, as well as Oracle Financial Services Enterprise Case Management, Oracle Financial Services Pricing Management, Oracle Financial Management Analytics and Oracle Tax Analytics Oracle Utilities Network Management System 1.11 offering new modelling and analysis features to improve distribution-grid management for electric utilities Oracle Communications Network Charging and Control 4.4 helping communications service providers (CSPs) offer their customers more flexible charging options Plus many, many more technology announcements, enhancements, momentum news and community updates -- Oracle OpenWorld 2012 A date has already been set for Oracle OpenWorld 2012. Held once again in San Francisco, exhibitors, partners, customers and Oracle people will gather from 30 September until 4 November to meet, network and learn together with the rest of the global Oracle community. Register now for Oracle OpenWorld 2012 and save $$$! We'll reward your early planning for Oracle OpenWorld 2012 with reduced rates. Super Saver deals are now available! -- Back to the welcome page

    Read the article

  • Cloud Based Load Testing Using TF Service &amp; VS 2013

    - by Tarun Arora [Microsoft MVP]
    Originally posted on: http://geekswithblogs.net/TarunArora/archive/2013/06/30/cloud-based-load-testing-using-tf-service-amp-vs-2013.aspx One of the new features announced as part of the Visual Studio 2013 Ultimate Preview is 'Cloud Based Load Testing'. In this blog post I'll walk you through: What is Cloud Based Load Testing? How have I been using this feature? – Success story! Where can you find more resources on this feature? What is Cloud Based Load Testing? It goes without saying that performance testing your application not only gives you the confidence that the application will work under heavy levels of stress, but also gives you the ability to test how scalable the architecture of your application is. It is important to know how much is too much for your application! Working with various clients in the industry I have realized that the biggest barriers to Load Testing & Performance Testing adoption are: the high infrastructure and administration cost that comes with this phase of testing; the time taken to procure & set up the test infrastructure; and finding a use for this infrastructure investment after completion of testing. Is cloud the answer? It is 100% Visual Studio compatible, scalable and realistic, you can start testing in under 2 minutes, it is intuitive, you pay only for what you need, and you can use existing on-premise tests on the cloud. There are a lot of vendors out there offering Cloud Based Load Testing; to name a few: Load Storm, Soasta, Blaze Meter, Blitz and others… The question you may want to ask is, why should you go with Microsoft's Cloud based Load Test offering? If you are a Microsoft shop or already have investments in Microsoft technologies, you'll see great benefit in the natural integration this offers with existing Microsoft products such as Visual Studio and Windows Azure. For example, your existing Web tests authored in Visual Studio 2010 or Visual Studio 2012 will run on the cloud without requiring any modifications whatsoever. Microsoft's cloud test rig also supports API-based testing; for example, if you are building a WPF application which consumes WCF services, you can write unit tests to invoke the WCF service, and these tests can be run on the cloud test rig and loaded with 'N' concurrent users for performance testing. If you already have your assets hosted in Azure, and possibly in the same data centre as the cloud test rig, your Azure app will not incur a usage cost because of the generated traffic, since the traffic is coming from the same data centre. The licensing or pricing information on Microsoft's cloud based Load Test service is yet to be announced, but I would expect this to be priced attractively to match the market competition. The only additional configuration required for running load tests on the Microsoft Cloud based Load Test service is to select the Test run location as "Run tests using Visual Studio Team Foundation Service". How have I been using Microsoft's Cloud based Load Test Service? I have been part of the Microsoft Cloud Based Load Test Service advisory council for the last 7 months. This gave me the opportunity to see the product shape up from concept to working solution. I was also the first person outside of Microsoft to try this offering out. This gave me the opportunity to test real-world applications at various clients using the Microsoft Load Test Service and provide real-world feedback to the Microsoft product team. One of the most recent systems I tested using the Load Test Service has been an insurance quote generation engine.
This insurance quote generation engine is hosted in Windows Azure, expected to get quote requests from across the globe, and expected to handle 5 million quote requests in a day (it is not clear how this load will be distributed across the day). There was no way I could simulate that kind of load from on premise without standing up additional hardware. But Microsoft's Cloud based Load Test service allowed me to test my key performance testing scenarios, i.e. simulating expected load, endurance testing, threshold testing and testing for latency. Simulating expected load: approach to devising a load pattern. My approach to devising a load test pattern has been to run the test scenario with 1 user to figure out the response time, then work out how many users are required to reach the target load. So, for example, invoking 1 quote from the quote engine software takes 0.5 seconds. Now if you do the math: 1 quote request by 1 user = 0.5 seconds; quotes generated by 1 user in 24 hours = 2 * 60 * 60 * 24 = 172,800; quotes generated by 30 users in 24 hours = 172,800 * 30 = 5,184,000 (a small code sketch of this calculation follows at the end of this post). This was a very simple example; if your application requires more concurrent users to test scenarios such as caching, etc., then you can devise your own load pattern. Some examples of load test patterns can be found here. Endurance Testing: To test for endurance, I loaded the quote generation engine with an expected fixed user load, ran the test for a very long duration such as over 48 hours, and observed the effect of the long-running test on the Azure infrastructure. Currently the Microsoft Load Test service does not support metrics from the machine under test. I used Azure diagnostics to begin with, but later started using Cerebrata Azure Diagnostics Manager to capture the metrics of the machine under test. Threshold Testing: To figure out how much user load the application could cope with before falling on its belly, I opted to step load the quote generation engine by incrementing user load, with different variations of incremental user load per minute, till the application crashed out and forced an IIS reset. Testing for Latency: Currently the Microsoft Load Test service does not support generating geographically distributed load. I, however, deployed the insurance quote generation engine in different Azure data centres and ran the same set of performance tests to measure latency. Because I could compare load test results from different runs by exporting the results to Excel (this feature is provided out of the box right from Visual Studio 2010), I could see the difference in response times. More resources on Microsoft Cloud based Load Test Service: A few important links to get you started: Download Visual Studio Ultimate 2013 Preview; Getting started guide for load testing using Team Foundation Service; Troubleshooting guide for FAQs and known issues; Team Foundation Service forum for questions and support; Detailed demo and presentation (link to Tech-Ed session recording); Detailed demo and presentation (link to Build session recording). There are a few limits on the usage of the Microsoft Cloud based Load Test service that you can read about here. If you have any feedback on the Microsoft Cloud based Load Test service, feel free to share it with the product team via the Visual Studio User Voice forum. I hope you found this useful. Thank you for taking the time out to read this blog post. If you enjoyed the post, remember to subscribe to http://feeds.feedburner.com/TarunArora. Stay tuned!
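As a footnote to the user-load arithmetic above, here is a small C# sketch of the same calculation. The 0.5-second response time and the 5 million daily target are the example figures from this post; the helper itself is hypothetical and only illustrates the back-of-the-envelope math.

// back-of-the-envelope estimate of the concurrent users needed to hit a daily request target
using System;

class LoadEstimator
{
    static void Main()
    {
        double secondsPerRequest = 0.5;         // measured response time for a single user
        double requestsPerUserPerDay = (24 * 60 * 60) / secondsPerRequest;  // 172,800 requests
        double dailyTarget = 5000000;           // expected quote requests per day

        // round up so the simulated load meets or exceeds the target
        int requiredUsers = (int)Math.Ceiling(dailyTarget / requestsPerUserPerDay);
        Console.WriteLine("Concurrent users required: {0}", requiredUsers);  // about 29-30 users
    }
}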

    Read the article

  • B2B and B2C Commerce are alike… but a little different – Oracle Commerce named Leader in Forrester B2B Commerce Wave

    - by Katrina Gosek
    We weren't surprised to see Oracle Commerce positioned as a Leader in Forrester's first Commerce Wave focused on B2B, released earlier this month. The report validates much of what we've heard from our largest customers – the world's largest distribution, manufacturing and high-tech customers who sell billions of dollars of goods and services to other businesses through their Web channels. More importantly, the report confirms something very important: B2B and B2C Commerce are alike… but a little different. B2B and B2C Commerce are alike… Clearly, B2C experiences have set expectations for B2B. Every B2B buyer is a consumer at home and brings the same expectations to a website selling electronic components, aftermarket parts, or MRO products. Forrester calls these rich consumer-based capabilities that help B2B customers do their jobs "table stakes": search & navigation, promotions, cross-channel commerce and mobile: "Whether they are just beginning to sell online or are in the late stages of launching a next-generation site, B2B eCommerce operations today must: offer a customer experience standard comparable to what leading b2c sites now offer; address the growing influence that mobile devices are having in the workplace; make a qualitative and quantitative business case that drives sustained investment." Just five years ago, many of our B2B customers' online business comprised only 5-10% of their total revenue. Today, when we speak to those same brands, we hear about double and triple digit growth in their online channels. Many have seen the percentage of the business they perform in their web channels cross the 30-50% threshold. You can hear first-hand from several Oracle Commerce B2B customers about the success they are seeing, and what they're trying to accomplish (Carolina Biological, Premier Farnell, DeliXL, Elsevier). This momentum is likely the reason Forrester broke out the separate B2B Commerce Wave from the B2C Wave. In fact, B2B is becoming the larger force in commerce, expected to collect twice the online dollars of B2C this year ($559 billion). But a little different… Despite the similarities, there is a key and very important difference between B2C and B2B. Unlike a consumer shopping for shoes, a business shopper buying from a distributor or manufacturer is coming to the Web channel as a part of their job. So in addition to the rich, consumer-like experience this shopper expects, these B2B buyers need quoting tools and complex pricing capabilities, like eProcurement, bulk order entry, and other self-service tools such as account, contract and organization management. Forrester also is emphasizing three additional "back-end" tools and capabilities their clients say they need to drive growth in their B2B online channels: i) product information management (PIM), which provides a single system of record for large part lists and product catalogs; ii) web content management (WCM), needed to manage large volumes of unstructured marketing information, and iii) order management systems (OMS), which manage and orchestrate the complex B2B order life cycle from quote through approval, submission to manufacturing, distribution and delivery. We would like to expand on each of these 3 areas: As Forrester highlights, back-end PIM is definitely needed by B2B Commerce providers.
Most B2B companies have made significant investments in enterprise-grade PIMs, given the importance of product data management for aggregation and syndication of content, product attribution, analytics, and handling of complex workflows. While in principle it may sound appealing to have a PIM as part of a commerce offering (especially for SMBs who have to do more with less), our customers have typically found that PIM in a commerce platform is largely redundant with what they already have in place, and is not fully featured or robust enough to handle the complexity of the product data sets that B2B distributors and manufacturers usually handle. To meet the PIM needs for commerce, Oracle offers enterprise PIM (Product Hub/Fusion PIM) and a robust enterprise data quality product (EDQP) integrated with the Oracle Commerce solution. These are key differentiators of our offering, and these capabilities are becoming even more tightly integrated with Oracle Commerce over time. For Commerce, what customers really need is a robust product catalog and content management system for enabling business users to further enrich and ready catalog and content data to be presented and sold online. This has been a significant area of investment in the Oracle Commerce platform, which continues to get stronger. We see this combination of capabilities as best meeting the needs of our customers for a commerce platform without adding a largely redundant, less functional PIM in the commerce front-end. On the topic of web content management, we were pleased to see Forrester recognize Oracle's unique functional capabilities in this area and the "unique opportunity in the market to lead the convergence of commerce and content management with the amalgamation of Oracle Commerce with WebCenter Sites (formerly FatWire)." Strong content management capabilities are critical for distributors and manufacturers who are frequently serving an engineering audience coming to their websites to conduct product research in search of technical data sheets, drawings, videos and more. The convergence of content, commerce, and experience is critical for B2B brands selling online. Regarding order management, Forrester notes that many businesses use their existing back-end enterprise resource planning (ERP) systems to manage order life cycles. We hear the same from most of our B2B customers, as they already have an ERP system – if not several of them – and are not interested in yet another one. So what do we take away from the Wave results? Forrester notes that the Oracle Commerce Platform "has always had strong B2B commerce capabilities and Oracle has an exhaustive list of B2B customers using the solution." What makes us excited about developing leading B2B solutions are the close relationships with our customers and the clear opportunity in the market – which we'll address in an exciting new release in the coming months. Oracle has one of the world's largest B2B customer bases, providing leading solutions across key business-to-business functions – from marketing, sales automation, and service to master data management, and ERP.
To learn more about Oracle’s Commerce product vision and strategy, visit our website and check out these other B2B Commerce Resources: - 2013 B2B Commerce Trends Report - B2B Commerce Whitepaper: Consumerization, Complexity, Change - B2B Commerce Webcast: What Industry Trend Setters Do Right - Internet Retailer, Web Drives Sales for B2B Companies - Internet Retailer, The Web Means Business: B2B Companies Beef Up Their Websites, borrowing from b2c retailers and breaking new ground - Internet Retailer, B2B e-Commerce is poised for growth ----------THIS DOCUMENT IS FOR INFORMATIONAL PURPOSES ONLY AND MAY NOT BE INCORPORATED INTO A CONTRACT OR AGREEMENT 

    Read the article

  • How Oracle Data Integration Customers Differentiate Their Business in Competitive Markets

    - by Irem Radzik
    With data being a central force in driving innovation and competing effectively, data integration has become a key IT approach to remove silos and ensure working with consistent and trusted data. Especially with the release of the 12c versions, Oracle Data Integrator and Oracle GoldenGate offer easy-to-use and high-performance solutions that help companies with their critical data initiatives, including big data analytics, moving to cloud architectures, modernizing and connecting transactional systems and more. In a recent press release we announced the great momentum and analyst recognition Oracle Data Integration products have achieved in the data integration and replication market. In this press release we described some of the key new features of Oracle Data Integrator 12c and Oracle GoldenGate 12c. In addition, a few of our 4,500+ customers explained how Oracle's data integration platform helped them achieve their business goals. In this blog post I would like to go over what these customers shared about their experience. Land O'Lakes is one of America's premier member-owned cooperatives, and offers an extensive line of agricultural supplies, as well as production and business services. Rich Bellefeuille, manager, ETL & data warehouse for Land O'Lakes, told us how GoldenGate helped them modernize their critical ERP system without impacting service and how they are moving to new projects with Oracle Data Integrator 12c: "With Oracle GoldenGate 11g, we've been able to migrate our enterprise-wide implementation of Oracle's JD Edwards EnterpriseOne, ERP system, to a new database and application server platform with minimal downtime to our business. Using Oracle GoldenGate 11g we reduced database migration time from nearly 30 hours to less than 30 minutes. Given our quick success, we are considering expansion of our Oracle GoldenGate 12c footprint. We are also in the midst of deploying a solution leveraging Oracle Data Integrator 12c to manage our pricing data to handle orders more effectively and provide a better relationship with our clients. We feel we are gaining higher productivity and flexibility with Oracle's data integration products." ICON, a global provider of outsourced development services to the pharmaceutical, biotechnology and medical device industries, highlighted the competitive advantage that a solid data integration foundation brings. Diarmaid O'Reilly, enterprise data warehouse manager, ICON plc, said: "Oracle Data Integrator enables us to align clinical trials intelligence with the information needs of our sponsors. It helps differentiate ICON's services in an increasingly competitive drug-development industry." You can find more info on ICON's implementation here. A popular use case for Oracle GoldenGate's real-time data integration is offloading operational reporting from critical transaction processing systems. SolarWorld, one of the world's largest solar-technology producers and the largest U.S. solar panel manufacturer, implemented Oracle GoldenGate for real-time data integration of manufacturing data for fast analysis. Russ Toyama, U.S.
senior database administrator for SolarWorld, told us that real-time data helps their operations and that GoldenGate's solution supports the high performance of their manufacturing systems: "We use Oracle GoldenGate for real-time data integration into our decision support system, which performs real-time analysis for manufacturing operations to continuously improve product quality, yield and efficiency. With reliable and low-impact data movement capabilities, Oracle GoldenGate also helps ensure that our critical manufacturing systems are stable and operate with high performance." You can watch the full interview with SolarWorld's Russ Toyama here. Starwood Hotels and Resorts is one of the many customers that found out how well Oracle Data Integration products work with Oracle Exadata. Gordon Light, senior director of information technology for Starwood Hotels, says they saw a notable performance gain in loading their Oracle Exadata reporting environment: "We leverage Oracle GoldenGate to replicate data from our central reservations systems and other OLTP databases – significantly decreasing the overall ETL duration. Moving forward, we plan to use Oracle GoldenGate to help the company achieve near-real-time reporting." You can listen to the details of Starwood Hotels' implementation here. Many companies combine the power of Oracle GoldenGate with Oracle Data Integrator to have a single, integrated data integration platform for a variety of use cases across the enterprise. Ufone is another good example of that. The leading mobile communications service provider of Pakistan has improved customer service using timely customer data in its data warehouse. Atif Aslam, head of management information systems for Ufone, says: "Oracle Data Integrator and Oracle GoldenGate help us integrate information from various systems and provide up-to-date and real-time CRM data updates hourly, rather than daily. The applications have simplified data warehouse operations and allowed business users to make faster and better informed decisions to protect revenue in the fast-moving Pakistani telecommunications market." You can read more about Ufone's use case here. In our Oracle Data Integration 12c launch webcast back in November we also heard from BT's CTO Surren Parthab about their use of GoldenGate for moving to a private cloud architecture. Surren also shared his perspectives on the Oracle Data Integrator 12c and Oracle GoldenGate 12c releases. You can watch the video here. These are only a few examples of leading companies that have made data integration and real-time data access a key part of their data governance and IT modernization initiatives. They have seen real improvements in how their businesses operate and differentiate in today's competitive markets.
You can read about other customer examples in our Ebook: The Path to the Future and access resources including white papers, data sheets, podcasts and more via our Oracle Data Integration resource kit.

    Read the article

  • SQL SERVER – 5 Tips for Improving Your Data with expressor Studio

    - by pinaldave
    It's no secret that bad data leads to bad decisions and poor results. However, how do you prevent dirty data from taking up residency in your data store? Some might argue that it's the responsibility of the person sending you the data. While that may be true, in practice that will rarely hold up. It doesn't matter how many times you ask, you will get the data however they decide to provide it. So now you have bad data. What constitutes bad data? There are quite a few valid answers, for example: invalid date values, inappropriate characters, wrong data, or values that exceed a pre-set threshold. While it is certainly possible to write your own scripts and custom SQL to identify and deal with these data anomalies, that effort often takes too long and becomes difficult to maintain. Instead, leveraging an ETL tool like expressor Studio makes the data cleansing process much easier and faster. Below are some tips for leveraging expressor to get your data into tip-top shape. Tip 1: Build reusable data objects with embedded cleansing rules. One of the new features in expressor Studio 3.2 is the ability to define constraints at the metadata level. Using expressor's concept of Semantic Types, you can define reusable data objects that have embedded logic such as constraints for dealing with dirty data. Once defined, they can be saved as a shared atomic type and then re-applied to other data attributes in other schemas. As you can see in the figure above, I've defined a constraint on zip code. I can then save the constraint rules I defined for zip code as a shared atomic type called zip_type, for example. The next time I get a different data source with a schema that also contains a zip code field, I can simply apply the shared atomic type (shown below) and the previously defined constraints will be automatically applied. Tip 2: Unlock the power of regular expressions in Semantic Types. Another powerful feature introduced in expressor Studio 3.2 is the option to use regular expressions as a constraint. A regular expression is used to identify patterns within data. The patterns could be something as simple as a date format or something much more complex such as a street address. For example, I could define that a valid IP address should be made up of 4 numbers, each 0 to 255, and separated by a period. So 192.168.23.123 might be a valid IP address whereas 888.777.0.123 would not be. How can I account for this using regular expressions? A very simple regular expression that looks for any four groups of one to three digits separated by periods would be: ^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}$ Alternatively, the following would be the exact check for truly valid IP addresses as we defined them above: ^(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])\.(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])\.(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])\.(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])$ In expressor, we would enter this regular expression as a constraint like this: Here we select the corrective action to be 'Escalate', meaning that the expressor Dataflow operator will decide what to do. Some of the options include rejecting the offending record, skipping it, or aborting the dataflow. Tip 3: Email pattern expressions that might come in handy. In the example schema that I am using, there's a field for email. Email addresses are often entered incorrectly because people are trying to avoid spam.
While there are a lot of different ways to define what constitutes a valid email address, a quick search online yields a couple of really useful regular expressions for validating email addresses: This one is short and sweet:  \b[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,4}\b (Source: http://www.regular-expressions.info/) This one is more specific about which characters are allowed:  ^([a-zA-Z0-9_\-\.]+)@((\[[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.)|(([a-zA-Z0-9\-]+\.)+))([a-zA-Z]{2,4}|[0-9]{1,3})(\]?)$ (Source: http://regexlib.com/REDetails.aspx?regexp_id=26 ) Tip 4:     Reject “dirty data” for analysis or further processing Yet another feature introduced in expressor Studio 3.2 is the ability to reject records based on constraint violations.  To capture reject records on input, simply specify Reject Record in the Error Handling setting for the Read File operator.  Then attach a Write File operator to the reject port of the Read File operator as such: Next, in the Write File operator, you can configure the expressor operator in a similar way to the Read File.  The key difference would be that the schema needs to be derived from the upstream operator as shown below: Once configured, expressor will output rejected records to the file you specified.  In addition to the rejected records, expressor also captures some diagnostic information that will be helpful towards identifying why the record was rejected.  This makes diagnosing errors much easier! Tip 5:    Use a Filter or Transform after the initial cleansing to finish the job Sometimes you may want to predicate the data cleansing on a more complex set of conditions.  For example, I may only be interested in processing data containing males over the age of 25 in certain zip codes.  Using an expressor Filter operator, you can define the conditional logic which isolates the records of importance away from the others. Alternatively, the expressor Transform operator can be used to alter the input value via a user defined algorithm or transformation.  It also supports the use of conditional logic and data can be rejected based on constraint violations. However, the best tip I can leave you with is to not constrain your solution design approach – expressor operators can be combined in many different ways to achieve the desired results.  For example, in the expressor Dataflow below, I can post-process the reject data from the Filter which did not meet my pre-defined criteria and, if successful, Funnel it back into the flow so that it gets written to the target table. I continue to be impressed that expressor offers all this functionality as part of their FREE expressor Studio desktop ETL tool, which you can download from here.  Their Studio ETL tool is absolutely free and they are very open about saying that if you want to deploy their software on a dedicated Windows Server, you need to purchase their server software, whose pricing is posted on their website. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
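If you want to sanity-check patterns like these before entering them as Semantic Type constraints, a quick way is to exercise them outside expressor first. The following is a minimal C# sketch using the simple IP-address pattern from Tip 2 and the short e-mail pattern from Tip 3; the sample values are made up.

// quick offline check of the regex patterns before using them as expressor constraints
using System;
using System.Text.RegularExpressions;

class PatternCheck
{
    static void Main()
    {
        // simple IP pattern from Tip 2: four groups of one to three digits
        string ipPattern = @"^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}$";
        // short e-mail pattern from Tip 3 (matched case-insensitively)
        string emailPattern = @"\b[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,4}\b";

        Console.WriteLine(Regex.IsMatch("192.168.23.123", ipPattern));   // True
        Console.WriteLine(Regex.IsMatch("888.777.0.123", ipPattern));    // True - the simple pattern does not enforce the 0-255 range
        Console.WriteLine(Regex.IsMatch("someone@example.com", emailPattern, RegexOptions.IgnoreCase)); // True
        Console.WriteLine(Regex.IsMatch("not-an-email", emailPattern, RegexOptions.IgnoreCase));        // False
    }
}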

    Read the article

  • SQL SERVER – Extending SQL Azure with Azure worker role – Guest Post by Paras Doshi

    - by pinaldave
    This is a guest post by Paras Doshi. Paras Doshi is a research intern at SolidQ.com and a Microsoft Student Partner, currently working in the domain of SQL Azure. SQL Azure is essentially SQL Server in the cloud; it provides benefits such as on-demand rapid provisioning, cost-effective scalability, high availability and reduced management overhead. For an introduction to SQL Azure, check out the post by Pinal here. In this article, we are going to discuss how to extend SQL Azure with the Azure worker role. In other words, we will write custom code and host it in an Azure worker role; the aim is to add features that are not currently available with SQL Azure, or features that need to be customized for flexibility. This way we extend SQL Azure's capability by building solutions that run on Azure as worker roles. To understand the Azure worker role, think of it as a Windows service in the cloud. A worker role can run background processes, which makes it the ideal tool for handling tasks such as synchronization and backup. First, we will focus on writing worker role code that synchronizes SQL Azure databases. Before we do so, let's see some scenarios in which synchronization between SQL Azure databases is beneficial:
- Scaling out access over multiple databases enables us to handle workload efficiently.
- As of now, a SQL Azure database can be hosted in any one of six datacenters. By synchronizing databases located in different datacenters, one can extend the data by enabling access to geographically distributed data.
Let us also see some scenarios in which SQL Server to SQL Azure database synchronization is beneficial:
- To back up a SQL Azure database onto local infrastructure.
- Rather than investing in local infrastructure for increased workloads, such workloads can be handled by the cloud.
- The ability to extend data to different datacenters located across the world, enabling efficient data access from remote locations.
Now, let us develop a cloud-based app that synchronizes SQL Azure databases. For an introduction to developing cloud-based apps, click here. In this article, I aim to provide a bird's-eye view of what code that synchronizes SQL Azure databases looks like, and then list resources that can help you develop the solution from scratch. If you add a new worker role to the cloud-based project, this is what the code will look like. (Note: I have added comments to the skeleton code to point out the modifications that will be required to carry out the SQL Azure synchronization. Note the placement of the Setup() and Sync() functions.) Click here (http://parasdoshi1989.files.wordpress.com/2011/06/code-snippet-1-for-extending-sql-azure-with-azure-worker-role1.pdf ) Enabling SQL Azure database synchronization through the Sync Framework is a two-step process. In the first step, the database is provisioned: the Sync Framework creates tracking tables, stored procedures, triggers, and tables to store metadata that enable synchronization. This is a one-time step, and the code for it is put in the Setup() function, which is called once when the worker role starts. The second step is the continuous (or on-demand) synchronization of the SQL Azure databases by propagating changes between them. This is done on a continuous basis by calling the Sync() function in the while loop; the code logic to synchronize changes between the SQL Azure databases should be put in the Sync() function. Discussing the coding part step by step is out of the scope of this article. 
Therefore, let me point you to a resource, which is given here. Also note that before you start developing the code, you will need to install the Sync Framework 2.1 SDK (download here), and you will need to reference some libraries before you start coding. Details are available in the article that I just pointed to. You will be charged for data transfers if the databases are not in the same datacenter; for pricing information, go here. Currently, a tool named Data Sync, which is built on top of the Sync Framework, is available in CTP and allows SQL Azure <-> SQL Server and SQL Azure <-> SQL Azure synchronization without writing a single line of code; however, in some cases, the custom code shown in this blog post provides flexibility that is not available with Data Sync. For instance, filtering is not supported in the SQL Azure Data Sync CTP2; if you need such functionality now, you have the option of developing custom code using the Sync Framework. This code can easily be extended to synchronize on a schedule. Let us say we want the databases to be synchronized every day at 10:00 pm. This is what the code will look like now: (http://parasdoshi1989.files.wordpress.com/2011/06/code-snippet-2-for-extending-sql-azure-with-azure-worker-role.pdf) Don't you think that by writing such code, we are imitating the functionality provided by SQL Server Agent for SQL Server? Think about it. We are scheduling our administrative task by writing custom code – in other words, we have developed a "lightweight SQL Server Agent for SQL Azure!" Since SQL Server Agent is not currently available in the cloud, we have developed a solution that enables us to schedule tasks, and thus we have extended SQL Azure with the Azure worker role! If you wish to track jobs, you can do so by storing this data in SQL Azure (or Azure tables); because Windows Azure is a stateless platform, we need to store the state of the job ourselves, and the choices are SQL Azure or Azure tables. Note that this solution requires custom code and is not UI driven; however, for now, it can act as a temporary solution until SQL Server Agent is made available in the cloud. This solution does not cover all of the functionality that SQL Server Agent provides, but it does open up an interesting avenue for scheduling tasks such as backup and synchronization of SQL Azure databases by writing custom code in the Azure worker role. Now, let us see one more possibility – running BCP through a worker role in Azure-hosted services and then uploading the backup files either locally or to blobs. If you store the files locally, consider the data transfer cost. If you upload them to blobs residing in the same datacenter, no transfer cost applies, but the cost for the blob storage does. So, before choosing an option, evaluate your preferences keeping the cost associated with each option in mind. In this article, I have shown that an Azure worker role solution can be developed to synchronize SQL Azure databases, and that a lightweight SQL Server Agent for SQL Azure can be developed. We also discussed the possibility of running BCP through a worker role in Azure-hosted services for backing up our precious SQL Azure data. Thus, we can extend SQL Azure with the Azure worker role. But remember: you will be charged for running Azure worker roles. 
So at the end of the day, you need to ask – am I willing to build a custom code and pay money to achieve this functionality? I hope you found this blog post interesting. If you have any questions/feedback, you can comment below or you can mail me at Paras[at]student-partners[dot]com Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Azure, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
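The worker-role code behind the two PDF links above is C#; purely as an illustration of the flow described – wait for the 10:00 pm window, run BCP against the SQL Azure database, push the file to blob storage – here is a PowerShell sketch of the same idea. It is not the author's implementation: every name in it (server, database, credentials, table, storage account, container) is a hypothetical placeholder, and the storage cmdlets assume the Windows Azure PowerShell module is installed and the subscription already selected.

    # Illustration only: nightly (10:00 pm) BCP export of one table from SQL Azure,
    # uploaded to a blob container in the same datacenter to avoid transfer charges.
    # All names below are hypothetical placeholders.
    Import-Module Azure

    $server    = 'tcp:yourserver.database.windows.net'
    $user      = 'youruser@yourserver'
    $password  = 'YourPassword'
    $container = 'sqlazure-backups'
    $context   = New-AzureStorageContext -StorageAccountName 'yourstorageaccount' `
                                         -StorageAccountKey  'yourstoragekey'

    $lastRun = [DateTime]::MinValue
    while ($true) {
        $now = Get-Date
        if ($now.Hour -ge 22 -and $lastRun.Date -ne $now.Date) {
            $file = "Orders_{0:yyyyMMdd}.dat" -f $now

            # Export the table in native format using the bcp command-line utility
            & bcp.exe "YourDatabase.dbo.Orders" out $file -n -S $server -U $user -P $password

            # Push the export into blob storage (blob size, not transfer, is what you pay for here)
            Set-AzureStorageBlobContent -File $file -Container $container -Blob $file -Context $context -Force

            $lastRun = $now
        }
        Start-Sleep -Seconds 60
    }

The skeleton – a timestamp check wrapped around the real work – is all the "lightweight SQL Server Agent" really amounts to; in the post it simply lives inside the worker role's run loop rather than a PowerShell while loop.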

    Read the article

  • B2B and B2C alike… but a little different – Oracle Commerce named Leader in Forrester B2B Commerce Wave

    - by Katrina Gosek
    We weren’t surprised to see Oracle Commerce positioned as a Leader in Forrester Research, Inc.’s first Commerce Wave focused on B2B, “The Forrester Wave™: B2B Commerce Suites, Q4 2013,” released earlier this month. We believe that the report validates much of what we’ve heard from our largest customers – the world’s largest distribution, manufacturing and high-tech customers who sell billions of dollars of goods and services to other businesses through their Web channels. More importantly, we feel that the report confirms something very important: B2B and B2C Commerce are alike… but a little different. B2B and B2C Commerce are alike… Clearly, B2C experiences have set expectations for B2B. Every B2B buyer is a consumer at home and brings the same expectations to a website selling electronic components, aftermarket parts, or MRO products. Forrester calls these rich consumer-based capabilities that help B2B customers do their jobs “table stakes”: front-office content, community, and commerce features that meet customer expectations for 24x7x365 ordering, real-time customer service, and expedited shipping — both online and on mobile devices: “Whether they are just beginning to sell online or are in the late stages of launching a next-generation site, B2B eCommerce operations today must: offer a customer experience standard comparable to what leading b2c sites now offer; address the growing influence that mobile devices are having in the workplace; make a qualitative and quantitative business case that drives sustained investment.” Just five years ago, many of our B2B customers’ online business comprised only 5-10% of their total revenue. Today, when we speak to those same brands, we hear about double and triple digit growth in their online channels. Many have seen the percentage of the business they perform in their web channels cross the 30-50% threshold. You can hear first-hand from several Oracle Commerce B2B customers about the success they are seeing, and what they’re trying to accomplish (Carolina Biological, Premier Farnell, DeliXL, Elsevier). It seems that this market momentum is likely the reason Forrester broke out the separate B2B Commerce Wave from the B2C Wave. In fact, B2B is becoming the larger force in commerce, expected to collect twice the online dollars of B2C this year ($559 billion). But a little different… Despite the similarities, there is a key and very important difference between B2C and B2B. Unlike a consumer shopping for shoes, a business shopper buying from a distributor or manufacturer is coming to the Web channel as a part of their job. So in addition to a rich, consumer-like experience this shopper expects, these B2B buyers need quoting tools and complex pricing capabilities, like eProcurement, bulk order entry, and other self-service tools such as account, contract and organization management. Forrester also is emphasizing three additional “back-end” tools and capabilities their clients say they need to drive growth in their B2B online channels: i) product information management (PIM), which provides a single system of record for large part lists and product catalogs; ii) web content management (WCM), needed to manage large volumes of unstructured marketing information, and iii) order management systems (OMS), which manage and orchestrate the complex B2B order life cycle from quote through approval, submission to manufacturing, distribution and delivery. 
We would like to expand on each of these 3 areas: As Forrester suggests, back-end PIM is definitely needed by B2B Commerce providers. Most B2B companies have made significant investments in enterprise-grade PIMs, given the importance of product data management for aggregation and syndication of content, product attribution, analytics, and handling of complex workflows. While in principle it may sound appealing to have a PIM as part of a commerce offering (especially for SMBs who have to do more with less), our customers have typically found that PIM in a commerce platform is largely redundant with what they already have in-place, and is not fully-featured or robust enough to handle the complexity of the product data sets that B2B distributors and manufacturers usually handle. To meet the PIM needs for commerce, Oracle offers enterprise PIM (Product Hub/Fusion PIM) and a robust enterprise data quality product (EDQP) integrated with the Oracle Commerce solution. These are key differentiators of our offering and these capabilities are becoming even more tightly integrated with Oracle Commerce over time. For Commerce, what customers really need is a robust product catalog and content management system for enabling business users to further enrich and ready catalog and content data to be presented and sold online.  This has been a significant area of investment in the Oracle Commerce platform , which continue to get stronger. We see this combination of capabilities as best meeting the needs of our customers for a commerce platform without adding a largely redundant, less functional PIM in the commerce front-end.  On the topic of web content management, we were pleased to see Forrester cite Oracle’s differentiated digital experience capability in this area and the “unique opportunity in the market to lead the convergence of commerce and content management with the amalgamation of Oracle Commerce with WebCenter Sites (formally FatWire).” Strong content management capabilities are critical for distributors and manufacturers who are frequently serving an engineering audience coming to their websites to conduct product research in search of technical data sheets, drawings, videos and more. The convergence of content, commerce, and experience is critical for B2B brands selling online. Regarding order management, Forrester notes that many businesses use their existing back-end enterprise resource planning (ERP) systems to manage order life cycles.  We hear the same from most of our B2B customers, as they already have an ERP system—if not several of them—and are not interested in yet another one. So what do we take away from the Wave results? Forrester notes that the Oracle Commerce Platform “has always had strong B2B commerce capabilities and Oracle certainly has an exhaustive list of B2B customers using the solution.”  What makes us excited about developing leading B2B solutions are the close relationships with our customers and the clear opportunity in the market – which we'll address in an exciting new release planned for the next 12 months. Oracle has one of the world’s largest B2B customer bases, providing leading solutions across key business-to-business functions – from marketing, sales automation, and service to master data management, and ERP. 
To learn more about Oracle’s Commerce product vision and strategy, visit our website and check out these other B2B Commerce Resources: -       2013 B2B Commerce Trends Report -       B2B Commerce Whitepaper: Consumerization, Complexity, Change -       B2B Commerce Webcast: What Industry Trend Setters Do Right -       Internet Retailer, Web Drives Sales for B2B Companies -       Internet Retailer Article, The Web Means Business: B2B Companies Beef Up Their Websites,        borrowing from b2c retailers and breaking new ground -       Internet Retailer Article, B2B e-Commerce is poised for growth

    Read the article

  • New OFM versions released SOA Suite 11.1.1.4 & BPM 11.1.1.4 & JDeveloper 11.1.1.4 WebLogic on JRockit 10.3.4 feedback from the community

    - by Jürgen Kress
    Oracle SOA Suite 11g Installations This is the latest release of the Oracle SOA Suite 11g. Please see the Documentation tab for Release Notes, Installation Guides and other release specific information. Please also see the List of New Features and Samples provided for this release. Release 11gR1 (11.1.1.4.0) Microsoft Windows (32-bit JVM) Linux (32-bit JVM) Generic Oracle JDeveloper 11g Rel 1 (11.1.1.x) (JDeveloper + ADF) Integrated development environment certified on Windows, Linux, and Macintosh. License is free (read the Pricing FAQ). Studio Edition for Windows (1.2 GB) | Studio Edition for Linux (1.3 GB) | See All See Additional Development Tools Oracle WebLogic Server 11g Rel 1 (10.3.4) Installers The WebLogic Server installers include Oracle Coherence and Oracle Enterprise Pack for Eclipse and supports development with other Fusion Middleware products . The zip includes WebLogic Server only and is intended for WebLogic Server development only. Linux x86 (1.1 GB) | Windows x86 (1 GB) Zip for Windows x86, Linux x86, Mac OS X (316 MB) | See All Oracle WebLogic Server 11gR1 (10.3.4) on JRockit Virtual Edition Download For additional downloads please visit the Oracle Fusion Middleware Products Update Center Share your feedback with the @soacommunity on twitter SOASimone Simone Geib SOA Suite 11gR1 (11.1.1.4.0) has just been released: http://www.oracle.com/technetwork/middleware/soasuite/downloads/index.html gschmutz gschmutz My new blog post: WebLogic Server, JDev, SOA, BPM, OSB and CEP 11.1.1.4 (PS3) available! - http://tinyurl.com/4negnpn simon_haslam Simon Haslam I'm very pleased to see WLS 10.3.4 for JRockit VE launched at the same time as the rest of PS3 http://j.mp/gl1nQm (32bit anyway) lucasjellema Lucas Jellema See http://www.oracle.com/ocom/groups/public/@otn/documents/webcontent/156082.xml for PS3 extension downloads BPM, SOA Editor, WebCenter demed demed List of new features in @OracleSOA 11gR1 PS3: http://bit.ly/fVRwsP is not extremely long but huge release by # of bugs fixed. Go! biemond Edwin Biemond WebLogic 10.3.4 new features http://bit.ly/f7L1Eu Exalogic Elastic Cloud , JPA2 , Maven plugin, OWSM policies on WebLogic SCA applications JDeveloper JDeveloper & ADF JDeveloper and Oracle ADF 11g Release 1 Patch Set 3 (11.1.1.4.0): New Features and Bug Fixes http://bit.ly/feghnY simon_haslam Simon Haslam WebLogic Server 10.3.4 (i.e. 11gR1 PS3) available now too http://bit.ly/eeysZ2 JDeveloper JDeveloper & ADF Share your impressions on the new JDeveloper 11g Patchset 3 release that came out today! 
Download it here: http://bit.ly/dogRN8 VikasAatOracle Vikas Anand SOA Suite 11gR1PS3 is Hotpluggable ...see list of features that @Demed posted..#soa #soacommunity   New versions of Oracle Fusion Middleware 11g R1 (11.1.1.4.x)  include: Oracle WebLogic Server 11g R1 (10.3.4) Oracle SOA Suite 11g R1 (11.1.1.4.0) Oracle Business Process Management 11g R1 (11.1.1.4.0) Oracle Complex Event Processing 11g R1 (11.1.1.4.0) Oracle Application Integration Architecture Foundation Pack 11g R1 (11.1.1.4.0) Oracle Service Bus 11g R1 (11.1.1.4.0) Oracle Enterprise Repository 11g R1 (11.1.1.4.0) Oracle Identity Management 11g R1 (11.1.1.4.0) Oracle Enterprise Content Management 11g R1 (11.1.1.4.0) Oracle WebCenter 11g R1 (11.1.1.4.0) - coming soon Oracle Forms, Reports, Portal & Discoverer 11g R1 (11.1.1.4.0) Oracle Repository Creation Utility 11g R1 (11.1.1.4.0) Oracle JDeveloper & Application Development Runtime 11g R1 (11.1.1.4.0) Resources Download  (OTN) Certification Documentation   New Features in Oracle SOA Suite 11g Release 1 (11.1.1.4.0) Updated: January, 2011 Go to Oracle SOA Suite 11g Doc Introduction Oracle SOA Suite 11gR1 (11.1.1.4.0) includes both bug fixes as well as new features listed below - click on the title of each feature for more details. Downloads, documentation links and more information on the Oracle SOA Suite available on the SOA Suite OTN page and as always, we welcome your feedback on the SOA OTN forum. New in Oracle SOA Suite in this release BPEL Component BPEL 2.0 support in JDeveloper The BPEL editor in JDeveloper now generates BPEL 2.0 code and introduces several new activities. Augmented XML variables auto-initialization capabilities The XML variable auto-initialization capabilities have been enhanced to support two need additional use cases: to initialize the to-spec node if it doesn't exist during the rule and to initialize array elements. New Assign Activity dialog The new Assign Activity supports the same drag & drop paradigm used for the XSLT mapper, greatly streamlining the task of assigning multiple variables. Mediator Component Time window parameter for the resequencer This new parameter lets users initiate a best-effort resequencing based on a time window rather than a number of messages. Support for attachments in the Mediator assign dialog The Mediator assign dialog now supports attachment, enabling usage of the Mediator to transmit attachments even if source and target schemas are different. Adapters & Bindings ChunkSize property added to the File Adapter header properties The ChunkSize property of the File Adapter is now available as a header property, allowing in-process modification of the value for this property. Improved support for distributed WLS JMS topics though automatic rebalancing of listeners The JMS Adapter has been enhanced to subscribe to administrative events from WLS JMS. Based on these events, it dynamically rebalances listeners when there are changes to the members of a local or remote WLS JMS distributed destination. JDeveloper configuration wizard for custom JCA adapters A new wizard is available in JDeveloper to configure custom-built adapters Administration & Enterprise Manager Enhanced purging capabilities to manage database growth Historical instance data can now be purged using three different strategies: batch script, scheduled batch script or data partitioning. 
Asynchronous bulk instance deletion in Enterprise Manager Bulk deletion of instances in Enterprise Manager now executes as an asynchronous operation in Enterprise Manager, returning control to the user as soon as the action has been submitted and acknowledged. B2B Ability to schedule partner downtime This feature allows trading partners to notify each other about planned downtime and to delay delivery of messages during that period. Message sequencing B2B now supports both inbound and outbound message sequencing. Simplified BAM integration with B2B B2B ships with various pre-configured artifacts to simplify monitoring in BAM. Instance Message Java API for B2B The new instance message Java API supports programmatic access to B2B instance message data. Oracle Service Bus (OSB) Certification of the File and FTP JCA Adapters The File and FTP JCA adapters are now certified for use with Oracle Service Bus (in addition to the native transports). Security enhancements Oracle Service Bus now supports SAML 2.0 as well as the OWSM authorization policies. Check the Oracle Service Bus 11.1.1.4 Release Notes for a complete list of new features. Installation, Hot-Pluggability & Certifications Ability to run Oracle SOA Suite on IBM WebSphere Application Server Oracle SOA Suite can now be deployed on IBM WebSphere Application Server Network Deployment (ND) 7.0.11 and IBM WebSphere Application Server 7.0.11. Single JVM developer installation template Oracle SOA Suite can now be targeted to the WebLogic admin server - there is no requirement to also have a managed server. This topology is intended to minimize the memory foorprint of development environments. This is in addition to the list of supported browsers, operating systems and databases already certified in prior releases. Complex Event Processing (CEP) IDE enhancements This release introduces several enhancements to the development IDE, such as adapter wizards and event-type repository. CQL enhancements CQL enhancements include JDBC data cartridges and parametrized queries. Tracing and injecting events in the Event Processing Network (EPN) In the development environment you can now trace and inject events. Check the Oracle CEP 11.1.1.4 Release Notes for a complete list of new features. SOA Suite page on OTN For more information on SOA Specialization and the SOA Partner Community please feel free to register at www.oracle.com/goto/emea/soa (OPN account required) Blog Twitter LinkedIn Mix Forum Wiki Website Technorati Tags: SOA Suite 11.1.1.4,JDeveloper 11.1.1.4,WebLogic 10.3.4,JRockit 10.3.4,SOA Community,Oracle,OPN,SOA,Simone Geib,Guido Schmutz,Edwin Biemond,Lucas Jellema,Simon Haslam,Demed,Vikas Anand,Jürgen Kress

    Read the article

  • Windows Azure Virtual Machine Readiness and Capacity Assessment for SQL Server

    - by SQLOS Team
    Windows Azure Virtual Machine Readiness and Capacity Assessment for Windows Server Machines Running SQL Server
With the release of MAP Toolkit 8.0 Beta, we have added a new scenario to assess your Windows Azure Virtual Machine readiness. The MAP 8.0 Beta performs a comprehensive assessment of Windows Servers running SQL Server to determine your level of readiness to migrate an on-premise physical or virtual machine to Windows Azure Virtual Machines. The MAP Toolkit then offers suggested changes to prepare the machines for migration, such as upgrading the operating system or SQL Server. MAP Toolkit 8.0 Beta is available for download here. Your participation and feedback are very important to make the MAP Toolkit work better for you. We encourage you to participate in the beta program and provide your feedback at [email protected] or through one of our surveys. Now, let's walk through the MAP Toolkit tasks for completing the Windows Azure Virtual Machine assessment and capacity planning. The tasks include the following:
- Perform an inventory
- View the Windows Azure VM Readiness results and report
- Collect performance data to determine VM sizing
- View the Windows Azure Capacity results and report
Perform an inventory:
1. To perform an inventory against a single machine or across a complete environment, choose Perform an Inventory to launch the Inventory and Assessment Wizard as shown below:
2. After the Inventory and Assessment Wizard launches, select either the Windows computers or SQL Server scenario to inventory Windows machines. HINT: If you don't care about completely inventorying a machine, just select the SQL Server scenario. Click Next to continue.
3. On the Discovery Methods page, select how you want to discover computers and then click Next to continue. Description of discovery methods:
- Use Active Directory Domain Services -- This method allows you to query a domain controller via the Lightweight Directory Access Protocol (LDAP) and select computers in all or specific domains, containers, or OUs. Use this method if all computers and devices are in AD DS.
- Windows networking protocols -- This method uses the WIN32 LAN Manager application programming interfaces to query the Computer Browser service for computers in workgroups and Windows NT 4.0–based domains. If the computers on the network are not joined to an Active Directory domain, use only the Windows networking protocols option to find computers.
- System Center Configuration Manager (SCCM) -- This method enables you to inventory computers managed by System Center Configuration Manager (SCCM). You need to provide credentials to the System Center Configuration Manager server in order to inventory the managed computers. When you select this option, the MAP Toolkit will query SCCM for a list of computers and then MAP will connect to these computers.
- Scan an IP address range -- This method allows you to specify the starting address and ending address of an IP address range. The wizard will then scan all IP addresses in the range and inventory only those computers. Note: This option can perform poorly if many IP addresses aren't being used within the range.
- Manually enter computer names and credentials -- Use this method if you want to inventory a small number of specific computers.
- Import computer names from a file -- Using this method, you can create a text file with a list of computer names that will be inventoried.
4. On the All Computers Credentials page, enter the accounts that have administrator rights to connect to the discovered machines. This does not need to be a domain account, but it does need to have local administrator rights. I have entered my domain account, which is an administrator on my local machine. Click Next after one or more accounts have been added. NOTE: The MAP Toolkit primarily uses Windows Management Instrumentation (WMI) to collect hardware, device, and software information from the remote computers. In order for the MAP Toolkit to successfully connect to and inventory computers in your environment, you have to configure your machines to allow inventory through WMI and also configure your firewall to allow remote access through WMI. The MAP Toolkit also requires remote registry access for certain assessments. In addition to enabling WMI, you need accounts with administrative privileges to access desktops and servers in your environment.
5. On the Credentials Order page, select the order in which you want the MAP Toolkit to connect to the machine and SQL Server. Generally, just accept the defaults and click Next.
6. On the Enter Computers Manually page, click Create to pull up a dialog and enter one or more computer names.
7. On the Summary page, confirm your settings and then click Finish. After clicking Finish, the inventory process will start, as shown below:
Windows Azure Readiness results and report
After the inventory process has completed, you can review the results under the Database scenario. On the tile, you will see the number of Windows Server machines with SQL Server that were analyzed, the number of machines that are ready to move without changes, and the number of machines that require further changes. If you click the Azure VM Readiness tile, you will see additional details and can generate the Windows Azure VM Readiness Report. After the report is generated, select View | Saved Reports and Proposals to view the location of the report. Open the WindowsAzureVMReadiness* report in Excel. On the Windows tab, you can see the results of the assessment. This report has a column for the Operating System and SQL Server assessment and provides a recommendation on how to resolve the issue if a component is not supported.
Collect Performance Data
Launch the Performance Wizard to collect performance information for the Windows Server machines for which you would like the MAP Toolkit to suggest a Windows Azure VM size.
Windows Azure Capacity results and report
After the performance metrics are collected, the Azure VM Capacity tile will display the number of Virtual Machine sizes that are suggested for the Windows Server and Linux machines that were analyzed. You can then click the Azure VM Capacity tile to see the capacity details and generate the Windows Azure VM Capacity Report. Within this report, you can view the performance data that was collected and the suggested Virtual Machine sizes. MAP Toolkit 8.0 Beta is available for download here. Your participation and feedback are very important to make the MAP Toolkit work better for you. We encourage you to participate in the beta program and provide your feedback at [email protected] or through one of our surveys. Useful References:
- Windows Azure Homepage
- How-to guides for Windows Azure Virtual Machines
- Provisioning a SQL Server Virtual Machine on Windows Azure
- Windows Azure Pricing
Peter Saddow, Senior Program Manager – MAP Toolkit Team
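Because WMI connectivity and remote registry access are the prerequisites called out in the NOTE under step 4, it can save a failed inventory run to spot-check a couple of target machines first. The snippet below is not part of the MAP Toolkit – it is just a quick PowerShell test using standard cmdlets, and the computer name is a made-up example.

    # Spot-check the remote-access prerequisites before running the MAP inventory.
    # 'SQLSRV01' is a hypothetical machine name; substitute one of your own.
    $computer   = 'SQLSRV01'
    $credential = Get-Credential   # an account with local administrator rights on the target

    try {
        # Basic WMI reachability test against a standard class
        $os = Get-WmiObject -Class Win32_OperatingSystem -ComputerName $computer `
                            -Credential $credential -ErrorAction Stop
        "WMI OK on {0}: {1}" -f $computer, $os.Caption
    }
    catch {
        "WMI failed on {0}: {1}" -f $computer, $_.Exception.Message
    }

    # Remote Registry service state (needed for some of the assessments)
    $svc = Get-Service -ComputerName $computer -Name RemoteRegistry -ErrorAction SilentlyContinue
    "RemoteRegistry on {0}: {1}" -f $computer, $(if ($svc) { $svc.Status } else { 'not reachable' })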

    Read the article

  • Cost Comparison Hard Disk Drive to Solid State Drive on Price per Gigabyte - dispelling a myth!

    - by tonyrogerson
    It is often said that Hard Disk Drive storage is significantly cheaper per GiByte than Solid State Devices – this is wholly inaccurate within the database space. People need to look at the cost of the complete solution and not just a single component part in isolation to what is really required to meet the business requirement. Buying a single Hitachi Ultrastar 600GB 3.5” SAS 15Krpm hard disk drive will cost approximately £239.60 (http://scan.co.uk, 22nd March 2012) compared to an OCZ 600GB Z-Drive R4 CM84 PCIe costing £2,316.54 (http://scan.co.uk, 22nd March 2012); I’ve not included FusionIO ioDrive because there is no public pricing available for it – something I never understand and personally when companies do this I immediately think what are they hiding, luckily in FusionIO’s case the product is proven though is expensive compared to OCZ enterprise offerings. On the face of it the single 15Krpm hard disk has a price per GB of £0.39, the SSD £3.86; this is what you will see in the press and this is what sales people will use in comparing the two technologies – do not be fooled by this bullshit people! What is the requirement? The requirement is the database will have a static size of 400GB kept static through archiving so growth and trim will balance the database size, the client requires resilience, there will be several hundred call centre staff querying the database where queries will read a small amount of data but there will be no hot spot in the data so the randomness will come across the entire 400GB of the database, estimates predict that the IOps required will be approximately 4,000IOps at peak times, because it’s a call centre system the IO latency is important and must remain below 5ms per IO. The balance between read and write is 70% read, 30% write. The requirement is now defined and we have three of the most important pieces of the puzzle – space required, estimated IOps and maximum latency per IO. Something to consider with regard SQL Server; write activity requires synchronous IO to the storage media specifically the transaction log; that means the write thread will wait until the IO is completed and hardened off until the thread can continue execution, the requirement has stated that 30% of the system activity will be write so we can expect a high amount of synchronous activity. The hardware solution needs to be defined; two possible solutions: hard disk or solid state based; the real question now is how many hard disks are required to achieve the IO throughput, the latency and resilience, ditto for the solid state. Hard Drive solution On a test on an HP DL380, P410i controller using IOMeter against a single 15Krpm 146GB SAS drive, the throughput given on a transfer size of 8KiB against a 40GiB file on a freshly formatted disk where the partition is the only partition on the disk thus the 40GiB file is on the outer edge of the drive so more sectors can be read before head movement is required: For 100% sequential IO at a queue depth of 16 with 8 worker threads 43,537 IOps at an average latency of 2.93ms (340 MiB/s), for 100% random IO at the same queue depth and worker threads 3,733 IOps at an average latency of 34.06ms (34 MiB/s). The same test was done on the same disk but the test file was 130GiB: For 100% sequential IO at a queue depth of 16 with 8 worker threads 43,537 IOps at an average latency of 2.93ms (340 MiB/s), for 100% random IO at the same queue depth and worker threads 528 IOps at an average latency of 217.49ms (4 MiB/s). 
From the result it is clear random performance gets worse as the disk fills up – I’m currently writing an article on short stroking which will cover this in detail. Given the work load is random in nature looking at the random performance of the single drive when only 40 GiB of the 146 GB is used gives near the IOps required but the latency is way out. Luckily I have tested 6 x 15Krpm 146GB SAS 15Krpm drives in a RAID 0 using the same test methodology, for the same test above on a 130 GiB for each drive added the performance boost is near linear, for each drive added throughput goes up by 5 MiB/sec, IOps by 700 IOps and latency reducing nearly 50% per drive added (172 ms, 94 ms, 65 ms, 47 ms, 37 ms, 30 ms). This is because the same 130GiB is spread out more as you add drives 130 / 1, 130 / 2, 130 / 3 etc. so implicit short stroking is occurring because there is less file on each drive so less head movement required. The best latency is still 30 ms but we have the IOps required now, but that’s on a 130GiB file and not the 400GiB we need. Some reality check here: a) the drive randomness is more likely to be 50/50 and not a full 100% but the above has highlighted the effect randomness has on the drive and the more a drive fills with data the worse the effect. For argument sake let us assume that for the given workload we need 8 disks to do the job, for resilience reasons we will need 16 because we need to RAID 1+0 them in order to get the throughput and the resilience, RAID 5 would degrade performance. Cost for hard drives: 16 x £239.60 = £3,833.60 For the hard drives we will need disk controllers and a separate external disk array because the likelihood is that the server itself won’t take the drives, a quick spec off DELL for a PowerVault MD1220 which gives the dual pathing with 16 disks 146GB 15Krpm 2.5” disks is priced at £7,438.00, note its probably more once we had two controller cards to sit in the server in, racking etc. Minimum cost taking the DELL quote as an example is therefore: {Cost of Hardware} / {Storage Required} £7,438.60 / 400 = £18.595 per GB £18.59 per GiB is a far cry from the £0.39 we had been told by the salesman and the myth. Yes, the storage array is composed of 16 x 146 disks in RAID 10 (therefore 8 usable) giving an effective usable storage availability of 1168GB but the actual storage requirement is only 400 and the extra disks have had to be purchased to get the  IOps up. Solid State Drive solution A single card significantly exceeds the IOps and latency required, for resilience two will be required. ( £2,316.54 * 2 ) / 400 = £11.58 per GB With the SSD solution only two PCIe sockets are required, no external disk units, no additional controllers, no redundant controllers etc. Conclusion I hope by showing you an example that the myth that hard disk drives are cheaper per GiB than Solid State has now been dispelled - £11.58 per GB for SSD compared to £18.59 for Hard Disk. I’ve not even touched on the running costs, compare the costs of running 18 hard disks, that’s a lot of heat and power compared to two PCIe cards!Just a quick note: I've left a fair amount of information out due to this being a blog! If in doubt, email me :)I'll also deal with the myth that SSD's wear out at a later date as well - that's just way over done still, yes, 5 years ago, but now - no.
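The arithmetic behind that conclusion is worth making explicit. The short PowerShell sketch below simply replays the article's own March 2012 figures – the roughly 700 random IOps gained per additional short-stroked 15Krpm spindle, the 16-disk DELL PowerVault quote and the price of two PCIe SSD cards – so you can drop in your own numbers.

    # Replay of the article's cost-per-usable-GB arithmetic (prices as quoted, March 2012, GBP).
    $requiredGB   = 400
    $requiredIOps = 4000

    # Hard disk solution: ~700 random IOps gained per extra short-stroked 15Krpm spindle,
    # rounded up to 8 data drives in the article, then doubled to 16 for RAID 1+0 resilience.
    $iopsPerDisk  = 700
    $minSpindles  = [math]::Ceiling($requiredIOps / $iopsPerDisk)
    $hddArrayCost = 7438.00      # DELL PowerVault MD1220 quote, 16 x 146GB 15Krpm 2.5" disks
    $hddPerGB     = $hddArrayCost / $requiredGB

    # Solid state solution: two OCZ Z-Drive R4 PCIe cards for resilience, nothing else to buy.
    $ssdCardCost  = 2316.54
    $ssdPerGB     = (2 * $ssdCardCost) / $requiredGB

    "At least {0} spindles needed for {1} random IOps (the article allows 8, mirrored to 16)" -f $minSpindles, $requiredIOps
    "HDD solution: GBP {0:N2} per usable GB" -f $hddPerGB
    "SSD solution: GBP {0:N2} per usable GB" -f $ssdPerGB

Swapping in current street prices for your own workload is the quickest way to check whether the conclusion still holds.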

    Read the article

  • Creating a Training Lab on Windows Azure

    - by Michael Stephenson
    Originally posted on: http://geekswithblogs.net/michaelstephenson/archive/2013/06/17/153149.aspxThis week we are preparing for a training course that Alan Smith will be running for the support teams at one of my customers around Windows Azure. In order to facilitate the training lab we have a few prerequisites we need to handle. One of the biggest ones is that although the support team all have MSDN accounts the local desktops they work on are not ideal for running most of the labs as we want to give them some additional developer background training around Azure. Some recent Azure announcements really help us in this area: MSDN software can now be used on Azure VM You don't pay for Azure VM's when they are no longer used  Since the support team only have limited experience of Windows Azure and the organisation also have an Enterprise Agreement we decided it would be best value for money to spin up a training lab in a subscription on the EA and then we can turn the machines off when we are done. At the same time we would be able to spin them back up when the users need to do some additional lab work once the training course is completed. In order to achieve this I wanted to create a powershell script which would setup my training lab. The aim was to create 18 VM's which would be based on a prebuilt template with Visual Studio and the Azure development tools. The script I used is described below The Start & Variables The below text will setup the powershell environment and some variables which I will use elsewhere in the script. It will also import the Azure Powershell cmdlets. You can see below that I will need to download my publisher settings file and know some details from my Azure account. At this point I will assume you have a basic understanding of Azure & Powershell so already know how to do this. Set-ExecutionPolicy Unrestrictedcls $startTime = get-dateImport-Module "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1"# Azure Publisher Settings $azurePublisherSettings = '<Your settings file>.publishsettings'  # Subscription Details $subscriptionName = "<Your subscription name>" $defaultStorageAccount = "<Your default storage account>"  # Affinity Group Details $affinityGroup = '<Your affinity group>' $dataCenter = 'West Europe' # From Get-AzureLocation  # VM Details $baseVMName = 'TRN' $adminUserName = '<Your admin username>' $password = '<Your admin password>' $size = 'Medium' $vmTemplate = '<The name of your VM template image>' $rdpFilePath = '<File path to save RDP files to>' $machineSettingsPath = '<File path to save machine info to>'    Functions In the next section of the script I have some functions which are used to perform certain actions. The first is called CreateVM. 
This will do the following actions: If the VM already exists it will be deleted Create the cloud service Create the VM from the template I have created Add an endpoint so we can RDP to them all over the same port Download the RDP file so there is a short cut the trainees can easily access the machine via Write settings for the machine to a log file  function CreateVM($machineNo) { # Specify a name for the new VM $machineName = "$baseVMName-$machineNo" Write-Host "Creating VM: $machineName"       # Get the Azure VM Image      $myImage = Get-AzureVMImage $vmTemplate   #If the VM already exists delete and re-create it $existingVm = Get-AzureVM -Name $machineName -ServiceName $serviceName if($existingVm -ne $null) { Write-Host "VM already exists so deleting it" Remove-AzureVM -Name $machineName -ServiceName $serviceName }   "Creating Service" $serviceName = "bupa-azure-train-$machineName" Remove-AzureService -Force -ServiceName $serviceName New-AzureService -Location $dataCenter -ServiceName $serviceName   Write-Host "Creating VM: $machineName" New-AzureQuickVM -Windows -name $machineName -ServiceName $serviceName -ImageName $myImage.ImageName -InstanceSize $size -AdminUsername $adminUserName -Password $password  Write-Host "Updating the RDP endpoint for $machineName" Get-AzureVM -name $machineName -ServiceName $serviceName ` | Add-AzureEndpoint -Name RDP -Protocol TCP -LocalPort 3389 -PublicPort 550 ` | Update-AzureVM    Write-Host "Get the RDP File for machine $machineName" $machineRDPFilePath = "$rdpFilePath\$machineName.rdp" Get-AzureRemoteDesktopFile -name $machineName -ServiceName $serviceName -LocalPath "$machineRDPFilePath"   WriteMachineSettings "$machineName" "$serviceName" }    The delete machine settings function is used to delete the log file before we start re-running the process.  function DeleteMachineSettings() { Write-Host "Deleting the machine settings output file" [System.IO.File]::Delete("$machineSettingsPath"); }    The write machine settings function will get the VM and then record its details to the log file. The importance of the log file is that I can easily provide the information for all of the VM's to our infrastructure team to be able to configure access to all of the VM's    function WriteMachineSettings([string]$vmName, [string]$vmServiceName) { Write-Host "Writing to the machine settings output file"   $vm = Get-AzureVM -name $vmName -ServiceName $vmServiceName $vmEndpoint = Get-AzureEndpoint -VM $vm -Name RDP   $sb = new-object System.Text.StringBuilder $sb.Append("Service Name: "); $sb.Append($vm.ServiceName); $sb.Append(", "); $sb.Append("VM: "); $sb.Append($vm.Name); $sb.Append(", "); $sb.Append("RDP Public Port: "); $sb.Append($vmEndpoint.Port); $sb.Append(", "); $sb.Append("Public DNS: "); $sb.Append($vmEndpoint.Vip); $sb.AppendLine(""); [System.IO.File]::AppendAllText($machineSettingsPath, $sb.ToString());  } # end functions    Rest of Script In the rest of the script it is really just the bit that orchestrates the actions we want to happen. 
It will load the publisher settings, select the Azure subscription and then loop around the CreateVM function and create 16 VM's  Import-AzurePublishSettingsFile $azurePublisherSettings Set-AzureSubscription -SubscriptionName $subscriptionName -CurrentStorageAccount $defaultStorageAccount Select-AzureSubscription -SubscriptionName $subscriptionName  DeleteMachineSettings    "Starting creating Bupa International Azure Training Lab" $numberOfVMs = 16  for ($index=1; $index -le $numberOfVMs; $index++) { $vmNo = "$index" CreateVM($vmNo); }    "Finished creating Bupa International Azure Training Lab" # Give it a Minute Start-Sleep -s 60  $endTime = get-date "Script run time " + ($endTime - $startTime)    Conclusion As you can see there is nothing too fancy about this script but in our case of creating a small isolated training lab which is not connected to our corporate network then we can easily use this to provision the lab. Im sure if this is of use to anyone you can easily modify it to do other things with the lab environment too. A couple of points to note are that there are some soft limits in Azure about the number of cores and services your subscription can use. You may need to contact the Azure support team to be able to increase this limit. In terms of the real business value of this approach, it was not possible to use the existing desktops to do the training on, and getting some internal virtual machines would have been relatively expensive and time consuming for our ops team to do. With the Azure option we are able to spin these machines up for a temporary period during the training course and then throw them away when we are done. We expect the costing of this test lab to be very small, especially considering we have EA pricing. As a ball park I think my 18 lab VM training environment will cost in the region of $80 per day on our EA. This is a fraction of the cost of the creation of a single VM on premise.
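Since the saving comes from switching the lab off between sessions, a natural companion to the provisioning script is one that stops (deallocates) or restarts every lab VM on demand. The sketch below reuses the naming convention from the script above (TRN-1 to TRN-16 in services named bupa-azure-train-TRN-n) and assumes the same Azure module import and subscription selection have already been done; it also assumes, as the post does, that a stopped VM no longer accrues compute charges.

    # Stop (deallocate) or start every VM in the training lab.
    # Assumes Import-AzurePublishSettingsFile / Select-AzureSubscription have already run,
    # exactly as in the provisioning script above.
    param([ValidateSet('Stop','Start')][string]$action = 'Stop')

    $baseVMName  = 'TRN'
    $numberOfVMs = 16

    for ($index = 1; $index -le $numberOfVMs; $index++) {
        $machineName = "$baseVMName-$index"
        $serviceName = "bupa-azure-train-$machineName"

        if ($action -eq 'Stop') {
            Write-Host "Stopping (deallocating) $machineName"
            # -Force skips the confirmation shown when this is the last running VM in the cloud service
            Stop-AzureVM -ServiceName $serviceName -Name $machineName -Force
        }
        else {
            Write-Host "Starting $machineName"
            Start-AzureVM -ServiceName $serviceName -Name $machineName
        }
    }

Run it with -action Start before the next training session to bring the whole lab back.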

    Read the article

< Previous Page | 17 18 19 20 21 22 23  | Next Page >