Search Results

Search found 493 results on 20 pages for 'shipping'.

Page 14/20 | < Previous Page | 10 11 12 13 14 15 16 17 18 19 20  | Next Page >

  • How to effectively keep/update postal and telephone code formats for each country?

    - by melaos
    Hi there. Currently we have a table of regex formats for phone numbers and postal codes per country, which we use for validation when a user registers through our forms. The problem is maintaining the correctness of these formats: what's a good way to ensure we always have the latest copy of this information? Is there a web service or similar that I can use to get it? Or does it even make sense to keep all these formats, rather than using a relaxed method that just ensures the user keys in something roughly matching the expected format? The information is used solely for shipping and billing addresses. We're using ASP.NET 2.0, by the way. Thanks ~steve
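
    One way to implement the "relaxed method" the poster describes: keep exact patterns only for the countries you actively maintain, and fall back to a loose sanity check for the rest. A minimal C# sketch; the patterns and country codes below are illustrative, not a maintained data set:

        using System.Collections.Generic;
        using System.Text.RegularExpressions;

        public static class PostalCodeCheck
        {
            // Illustrative patterns only; a real table would live in the database
            // and be refreshed periodically.
            private static readonly Dictionary<string, Regex> Formats = new Dictionary<string, Regex>();

            static PostalCodeCheck()
            {
                Formats["US"] = new Regex(@"^\d{5}(-\d{4})?$");                    // ZIP / ZIP+4
                Formats["CA"] = new Regex(@"^[A-Za-z]\d[A-Za-z] ?\d[A-Za-z]\d$");  // A1A 1A1
            }

            public static bool IsPlausible(string countryCode, string input)
            {
                Regex exact;
                if (Formats.TryGetValue(countryCode, out exact))
                    return exact.IsMatch(input.Trim());

                // Relaxed fallback: accept anything that vaguely looks like a postal
                // code instead of rejecting users from countries we haven't catalogued.
                return Regex.IsMatch(input.Trim(), @"^[A-Za-z0-9 \-]{3,10}$");
            }
        }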

    Read the article

  • How to add the UAC shield icon to program that still must target XP?

    - by bsruth
    I have a program that still must target Windows XP (_WIN32_WINNT 0x501), as most of our customers still use XP. However, we have been shipping Vista for a while, and are now pushing Windows 7 upgrades. For the software to work correctly on the newer OSs, there are a couple operations that require UAC elevation. I have the elevation code working, but would like to have the UAC icon present on the buttons that launch the UAC process. Unfortunately, all of the options defined in Microsoft's UAC UI document require _WIN32_WINNT 0x600 or newer. Is there any way to get the appropriate UAC icon (Vista and 7 use different ones) to show on the button while still being able to target XP (where no icon will be shown)? I'm using C++, but may be able to adapt a .NET solution.
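
    One common approach, given the poster says a .NET solution may be adaptable: send the button the BCM_SETSHIELD message yourself instead of relying on headers that require _WIN32_WINNT 0x600. The message is simply ignored on XP, so no icon appears there. A hedged C# sketch:

        using System;
        using System.Runtime.InteropServices;
        using System.Windows.Forms;

        public static class UacShield
        {
            // BCM_SETSHIELD = BCM_FIRST (0x1600) + 0x000C; normally defined in
            // commctrl.h only when _WIN32_WINNT >= 0x600, so define it manually.
            private const uint BCM_SETSHIELD = 0x160C;

            [DllImport("user32.dll")]
            private static extern IntPtr SendMessage(IntPtr hWnd, uint msg, IntPtr wParam, IntPtr lParam);

            public static void ShowShield(Button button)
            {
                // The shield only renders on themed system buttons.
                button.FlatStyle = FlatStyle.System;
                // Vista and 7 each draw their own shield icon; XP ignores the message.
                SendMessage(button.Handle, BCM_SETSHIELD, IntPtr.Zero, (IntPtr)1);
            }
        }

    The native C++ equivalent is the same SendMessage call after defining BCM_SETSHIELD as 0x160C in code compiled for XP.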

    Read the article

  • Magento: Add (and retrieve) custom database field for CMS pages

    - by Toby H
    I want to assign custom parameters to CMS pages in Magento (e.g. 'about', 'customer service', etc.), so they can be grouped. The end goal is to use the parameters for each page to show (or hide) them in a nav menu. Writing a quick method in the page/html block to retrieve the pages (active only) for the menu was easy, but I can't figure out how to group them so that 'testimonials', 'history', and 'contact' are associated with 'about', and 'return policy', 'shipping', and 'contact' are associated with 'customer service'. Any help to point me in the right direction would be greatly appreciated. Thanks!

    Read the article

  • Hosted Continuous Application Monitoring Services

    - by Ian Silber
    Does anybody know of a good service or tool for continuous application monitoring? I'm looking specifically for something that is hosted, so we don't have to worry too much about whether the monitoring tool itself is actually running. Specifically, we have a few e-commerce customers that we would like to provide detailed monitoring services for. We don't want to simply monitor uptime; we'd like to go through the entire checkout process once a day or even more often to ensure everything's working (adding to cart, shipping calculations, payment processing, etc.). We've tried site24x7.com but their recording tool just doesn't seem to offer the level of customization we need. Does anybody have any recommendations?

    Read the article

  • Databinding a list (in an object) to a combobox in VB.net

    - by Casey
    VB 2008 .NET 3.5 I have a custom "Shipment" object that contains a list of "ShippingRate" objects. This list of "ShippingRates" is accessible through a property "Shipment.PossibleShippingRates." That property simply returns a copy of the list of "ShippingRate" for that particular Shipment. My UI layer receives a list of "Shipment" objects from the BLL. This list is databound to a datagridview, and shows details relevant to the Shipment such as Shipping Address, etc. I want to bind the "Shipment.PossibleShippingRates" property to a combobox, such that when the user changes the grid row, the combobox reflects the "PossibleShippingRates" for that "Shipment." In addition, I need a way to store the "ShippingRate" they selected from the combobox for later use. I have tried a few ideas, but none of them work properly. Any ideas on how to do this?
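
    A sketch of the standard WinForms master-detail pattern, shown in C# (the VB.NET translation is direct, and the names are taken from the question; DisplayMember is a hypothetical property). The grid binds to a BindingSource over the shipment list, and the combobox binds to a child BindingSource whose DataMember is the PossibleShippingRates property, so it follows the grid's current row automatically:

        // Assumes Shipment.PossibleShippingRates returns a List<ShippingRate>.
        BindingSource shipmentsSource = new BindingSource();
        shipmentsSource.DataSource = shipments;                 // List<Shipment> from the BLL

        dataGridView1.DataSource = shipmentsSource;

        // Child BindingSource: tracks PossibleShippingRates of the *current* Shipment.
        BindingSource ratesSource = new BindingSource(shipmentsSource, "PossibleShippingRates");
        comboBox1.DataSource = ratesSource;
        comboBox1.DisplayMember = "RateName";                   // hypothetical display property

        // The chosen rate for the current shipment:
        ShippingRate selected = (ShippingRate)comboBox1.SelectedItem;

    To remember the selection for later use, one option is to handle the combobox's SelectionChangeCommitted event and store the chosen ShippingRate against the current Shipment (for example, in a SelectedRate property on the object).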

    Read the article

  • ASP.NET MVC 2 Released

    - by ScottGu
    I’m happy to announce that the final release of ASP.NET MVC 2 is now available for VS 2008/Visual Web Developer 2008 Express with ASP.NET 3.5. You can download and install it from the following locations:

    - Download ASP.NET MVC 2 using the Microsoft Web Platform Installer
    - Download ASP.NET MVC 2 from the Download Center

    The final release of VS 2010 and Visual Web Developer 2010 will have ASP.NET MVC 2 built in – so you won’t need an additional install in order to use ASP.NET MVC 2 with them.

    ASP.NET MVC 2

    We shipped ASP.NET MVC 1 a little less than a year ago. Since then, almost 1 million developers have downloaded and used the final release, and its popularity has steadily grown month over month. ASP.NET MVC 2 is the next significant update of ASP.NET MVC. It is a compatible update to ASP.NET MVC 1 – so all the knowledge, skills, code, and extensions you already have with ASP.NET MVC continue to work and apply going forward. Like the first release, we are also shipping the source code for ASP.NET MVC 2 under an OSI-compliant open-source license.

    ASP.NET MVC 2 can be installed side-by-side with ASP.NET MVC 1 (meaning you can have some apps built with V1 and others built with V2 on the same machine). We have instructions on how to update your existing ASP.NET MVC 1 apps to use ASP.NET MVC 2 using VS 2008 here. Note that VS 2010 has an automated upgrade wizard that can automatically migrate your existing ASP.NET MVC 1 applications to ASP.NET MVC 2 for you.

    ASP.NET MVC 2 Features

    ASP.NET MVC 2 adds a bunch of new capabilities and features. I’ve started a blog series about some of the new features, and will be covering them in more depth in the weeks ahead. Some of the new features and capabilities include:

    - New Strongly Typed HTML Helpers
    - Enhanced Model Validation support across both server and client
    - Auto-Scaffold UI Helpers with Template Customization
    - Support for splitting up large applications into “Areas”
    - Asynchronous Controllers support that enables long-running tasks in parallel
    - Support for rendering sub-sections of a page/site using Html.RenderAction
    - Lots of new helper functions, utilities, and API enhancements
    - Improved Visual Studio tooling support

    You can learn more about these features in the “What’s New in ASP.NET MVC 2” document on the www.asp.net/mvc web-site. We are going to be posting a lot of new tutorials and videos shortly on www.asp.net/mvc that cover all the features in the ASP.NET MVC 2 release. We will also post an updated end-to-end tutorial built entirely with ASP.NET MVC 2 (much like the NerdDinner tutorial that I wrote that covers ASP.NET MVC 1).

    Summary

    The ASP.NET MVC team delivered regular V2 preview releases over the last year to get feedback on the feature set. I’d like to say a big thank you to everyone who tried out the previews and sent us suggestions/feedback/bug reports. We hope you like the final release!

    Scott
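
    As a quick, hedged illustration of two of the features above (the model and view names here are invented, not from the announcement): in MVC 2, decorating a model with System.ComponentModel.DataAnnotations attributes drives validation on both server and client, and the new lambda-based helpers give compile-time checking of view bindings.

        using System.ComponentModel.DataAnnotations;

        public class DinnerViewModel
        {
            [Required(ErrorMessage = "Please enter a title")]
            [StringLength(50)]
            public string Title { get; set; }

            [Range(1, 50)]
            public int MaxAttendees { get; set; }
        }

        <%-- In an ASPX view strongly typed to DinnerViewModel: --%>
        <%= Html.TextBoxFor(model => model.Title) %>
        <%= Html.ValidationMessageFor(model => model.Title) %>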

    Read the article

  • SQL SERVER – Out of the Box – Activity and Performance Reports from SSMS

    - by pinaldave
    SQL Server Management Studio 2008 is a wonderful tool with many different features, but the average user often misses them simply through not being aware of them. Today, we will learn one such feature: SSMS comes with many built-in performance and activity reports that we do not use to their full potential. Let us see how we can access these standard reports:

    Connect to SQL Server Node >> Right Click on it >> Go to Reports >> Click on Standard Reports >> Pick Any Report.

    You can see that many reports an average user needs right away are available there. Let me list all the reports available:

    - Server Dashboard
    - Configuration Changes History
    - Schema Changes History
    - Scheduler Health
    - Memory Consumption
    - Activity – All Blocking Transactions
    - Activity – All Cursors
    - Activity – All Sessions
    - Activity – Top Sessions
    - Activity – Dormant Sessions
    - Activity – Top Connections
    - Top Transactions by Age
    - Top Transactions by Blocked Transactions Count
    - Top Transactions by Locks Count
    - Performance – Batch Execution Statistics
    - Performance – Object Execution Statistics
    - Performance – Top Queries by Average CPU Time
    - Performance – Top Queries by Average IO
    - Performance – Top Queries by Total CPU Time
    - Performance – Top Queries by Total IO
    - Service Broker Statistics
    - Transactions Log Shipping Status

    In fact, when you look at the above list, it is fairly clear that these are well-thought-out, commonly needed reports that are available in SQL Server 2008. Let us run a couple of them and observe their results: Performance – Top Queries by Total CPU Time, and Memory Consumption.

    There are options for custom reports as well, which we can configure; we will learn about them in some other post. Additionally, you can right-click on a report and export it to Excel or PDF. I think this tool can really help those who are just looking for some quick details. Do any of you use this feature? Or does this feature have some limitations, and you would like to see more features?

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: Pinal Dave, SQL, SQL Authority, SQL Optimization, SQL Performance, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Microsoft Technical Computing

    - by Daniel Moth
    In the past I have described the team I belong to here at Microsoft (Parallel Computing Platform) in terms of contributing to Visual Studio and related products, e.g. .NET Framework. To be more precise, our team is part of the Technical Computing group, which is still part of the Developer Division. This was officially announced externally earlier this month in an exec email (from Bob Muglia, the president of STB, to which DevDiv belongs). Here is an extract:

    "… As we build the Technical Computing initiative, we will invest in three core areas:

    1. Technical computing to the cloud: Microsoft will play a leading role in bringing technical computing power to scientists, engineers and analysts through the cloud. Existing high-performance computing users will benefit from the ability to augment their on-premises systems with cloud resources that enable ‘just-in-time’ processing. This platform will help ensure processing resources are available whenever they are needed—reliably, consistently and quickly.

    2. Simplify parallel development: Today, computers are shipping with more processing power than ever, including multiple cores, but most modern software only uses a small amount of the available processing power. Parallel programs are extremely difficult to write, test and trouble shoot. However, a consistent model for parallel programming can help more developers unlock the tremendous power in today’s modern computers and enable a new generation of technical computing. We are delivering new tools to automate and simplify writing software through parallel processing from the desktop… to the cluster… to the cloud.

    3. Develop powerful new technical computing tools and applications: We know scientists, engineers and analysts are pushing common tools (i.e., spreadsheets and databases) to the limits with complex, data-intensive models. They need easy access to more computing power and simplified tools to increase the speed of their work. We are building a platform to do this. Our development efforts will yield new, easy-to-use tools and applications that automate data acquisition, modeling, simulation, visualization, workflow and collaboration. This will allow them to spend more time on their work and less time wrestling with complicated technology. …"

    Our Parallel Computing Platform team is directly responsible for item #2, and we work very closely with the teams delivering items #1 and #3. At the same time as the exec email, our marketing team unveiled a website with interviews that I invite you to check out: Modeling the World. Comments about this post welcome at the original blog.
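
    For a concrete sense of item #2, this is the kind of thing the parallel tools enable: a minimal C# sketch using the Task Parallel Library that shipped with .NET 4 (the workload here is invented for illustration).

        using System;
        using System.Threading.Tasks;

        class Program
        {
            static void Main()
            {
                double[] results = new double[10000000];

                // The TPL partitions the iteration space across all available cores;
                // the sequential equivalent is a plain for loop over the same body.
                Parallel.For(0, results.Length, i =>
                {
                    results[i] = Math.Sqrt(i) * Math.Sin(i);   // stand-in for real work
                });

                Console.WriteLine("Done: {0} items", results.Length);
            }
        }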

    Read the article

  • Who Are the BI Users in Your Neighborhood?

    - by Brian Dayton
    Forrester's Boris Evelson recently wrote a blog titled "Who are the BI Personas?" that I enjoyed for a number of reasons. It's a quick read, easy to grasp and (refreshingly) focuses on the users of technology vs. the technology. As Evelson admits, he meant to keep the reference chart at a high level because there are too many different permutations and additional sub-categories to make such a chart useful. For me, I wouldn't head into the technical permutations but more the contextual use of BI and the issues that users experience. My thoughts brought up more questions than answers, such as:

    Context:

    - HOW: With the exception of the "Power User" persona--likely some sort of business or operations analyst?
    - WHEN: Are they using the information to make real-time decisions on the front lines (a customer service manager or shipping/logistics VP) or are they using this information for cumulative analysis and business planning? Or both?
    - WHERE: What areas of the business are more or less likely to rely on BI across an organization? Human Resources, Operations, Facilities, Finance--- and why are some more prone to use data-driven analysis than others?

    Issues:

    - DELAYS & DRAG ON IT?: One of the persona characteristics Evelson calls out is a reliance on IT. Every persona except for the "Power User" has a heavy reliance on IT for support. What business issues or delays does that cause to users? What is the drag on IT resources who could potentially be creating instead of reporting?
    - HOW MANY CLICKS: If BI is being used within the context of a transaction (a sales manager looking for upsell opportunities, as an example) is that person getting the information within the context of that action or transaction? Or are they minimizing screens, logging into another application or reporting tool, running queries, etc.?

    Who are the BI users in your neighborhood or line of business? Do Evelson's personas resonate--and do the tools that he calls out (he refers to it as "BI Style") resonate with what your personas have or need? Finally, I'm very interested in whether BI use is viewed as a bolt-on... or an integrated part of your daily enterprise processes?

    Read the article

  • You do not need a separate SQL Server license for a Standby or Passive server - this Microsoft White Paper explains all

    - by tonyrogerson
    If you were in any doubt at all that you need to license Standby / Passive Failover servers, then the White Paper “Do Not Pay Too Much for Your Database Licensing” will settle those doubts. I’ve had debates before with people thinking you can only have a single instance as a standby machine; that’s just wrong. You can have a scenario with a 2-node active/passive cluster plus database mirroring and log shipping (a total of 4 SQL Server instances), and in that setup you only need to buy one physical license, so long as the standby nodes have the same or fewer physical processors (cores are irrelevant). So next time your supplier suggests you need a license for your standby box, tell them you don’t, and educate them by pointing them to the white paper. For clarity I’ve copied the extract below from the White Paper.

    Extract from “Do Not Pay Too Much for Your Database Licensing”

    Standby Server: Customers often implement a standby server to make sure the application continues to function in case the primary server fails. The standby server continuously receives updates from the primary server and will take over the role of primary server in case of failure in the primary server. Following are comparisons of how each vendor supports standby server licensing.

    - SQL Server: Customers do not need to license the standby (or passive) server, provided that the number of processors in the standby server is equal to or less than those in the active server.
    - Oracle DB: Oracle requires the customer to fully license both active and standby servers, even though the standby server is essentially idle most of the time.
    - IBM DB2: IBM licensing on the standby server is quite complicated and is different for every edition of DB2. For Enterprise Edition, a minimum of 100 PVUs or 25 Authorized Users is needed to license the standby server.

    The following graph compares prices based on a database application with two processors (dual-core) and 25 users, with one standby server. [chart snipped]

    Note: All prices are based on the newest Intel Xeon Nehalem processor database pricing for purchases within the United States and are in United States dollars. Pricing is based on information available on vendor Web sites for Enterprise Edition.

    - Microsoft SQL Server Enterprise Edition: 25 users (CALs) x $164 / CAL + $8,592 / Server = $12,692 (no need to license the standby server)
    - Oracle Enterprise Edition (base license without options): Named User Plus minimum (25 Named Users Plus per Core) = 25 x 2 = 50 Named Users Plus x $950 / Named User Plus x 2 servers = $95,000
    - IBM DB2 Enterprise Edition (base license without feature pack): need to purchase 125 Authorized Users (400 PVUs / 100 PVUs = 4 x 25 = 100 Authorized Users + 25 Authorized Users for the standby server) = 125 Authorized Users x $1,040 / Authorized User = $130,000

    Read the article

  • IT Admin for Thrill Seekers

    - by Tony Davis
    A developer suggested to me recently that the life of the DBA was, surely, a dull one. My first reaction was indignation, quickly followed by the thought that for many people excitement isn't necessarily the most desirable aspect of their job. It's true that some aspects of the DBA role seem guaranteed to quieten the pulse; in the days of tape backups, time must have slowed to eternity for the person whose job it was to oversee this process, placing tapes into secure containers, ensuring correct labeling, and... sorry, I drifted off there for a second. On the other hand, if you follow the adventures of the likes of Brent Ozar or Tom LaRock, you'd be forgiven for thinking that much of a database guy's time is spent, metaphorically, diving through plate glass windows in tight-fitting underwear in order to extract grateful occupants from burning database applications. Alas, it isn't true of the majority, but it isn't as dull as some people imagine, and it is a helter-skelter ride compared with some other IT roles.

    Every IT department has people who toil away in shadowy corners doing quiet but mysterious tasks. When you ask them to explain what they do, you almost immediately want them to stop, but you hear enough to appreciate that these tasks are often absolutely vital to the smooth functioning of an IT organization. Compared with them, the DBAs are prima donnas. Here are a few nominations:

    - Installation engineer: installs all of the company's laptops, workstations, and software; deals with licensing, shipping and data entry... many organizations, especially those subject to tight regulation, would simply grind to a halt without their efforts.
    - Localization engineer: not quite software engineering, not quite translation, the job is to rebuild a product in a different language and make sure everything still works.
    - QA Tester: firstly, I should say that the testers at Red Gate seem to me some of the most fulfilled in the company. I refer here to the QA Tester whose job is more-or-less entirely to read a script, click some buttons and make sure the actual and expected values match.
    - Configuration manager: for example, someone whose main job is to configure build environments so that devs can access their source code; assuredly necessary for the smooth functioning and productivity of the team, and hopefully well-paid.

    So what other sort of job in IT should one choose if the work of a DBA proves to be too exciting? Or are these roles secretly more exciting than many imagine? I invite you all to put forward your own suggestions.

    Cheers, Tony.

    Read the article

  • Silverlight Cream for March 14, 2011 -- #1060

    - by Dave Campbell
    In this Issue: Lazar Nikolov, Rudi Grobler, WindowsPhoneGeek, Jesse Liberty, Pete Brown, Jessica Foster, Chris Rouw, Andy Beaulieu, and Colin Eberhardt.

    Above the Fold:
    Silverlight: "A Silverlight Resizable TextBlock (and other resizable things)" - Colin Eberhardt
    WP7: "Retrofitting the Trial API" - Jessica Foster

    Shoutouts: Rudi Grobler has a post up that's not Silverlight, but it's cool stuff you may be interested in: WPF Themes now available on NuGet.

    From SilverlightCream.com:

    - Simulating rain in Silverlight: Lazar Nikolov has a cool tutorial up at SilverlightShow... simulating rain. Nice demo near the top, and source code plus a very nice tutorial on the entire process.
    - Making the ApplicationBar bindable: Rudi Grobler has a couple of new posts up... first this one on making the WP7 AppBar bindable... he's created 2 simple wrappers that make it possible to bind to a method... with source as usual!
    - All about Splash Screens in WP7 – Creating animated Splash Screen: WindowsPhoneGeek's latest is about splash screens in WP7, but he goes one better with animated splash screens. Lots of good information, including closing points in more of a FAQ-style listing.
    - Testing Network Availability: Jesse Liberty's latest is on testing for network availability in your WP7 app. This should be required reading for anyone building a WP7 app, since you never know when the network is going to be there or not.
    - Lighting up on Windows 7 with Native Extensions for Microsoft Silverlight: Pete Brown's latest post is about the Native Extensions for Microsoft Silverlight (NESL) library. Pete describes what NESL is, links to the library, covers installing it, and tons more. If you wanna know or try NESL... this looks like the place to start.
    - Retrofitting the Trial API: Jessica Foster paused on the way to shipping her WP7 app to add in the trial API code. Check out what she went through to complete that task, as she explains the steps and directions she took. Good description, links, and code.
    - WP7 Insights #2: Creating a Splash Screen in WP7: In the 2nd post in his series on WP7 development, Chris Rouw discusses a WP7 splash screen. He gives some good external links for reference, then gets right into discussing his code.
    - Air Hockey for Windows Phone 7: Andy Beaulieu shares a tutorial he wrote for the Expression Toolbox site, using the Physics Helper Library and Farseer Physics Engine: an Air Hockey game for WP7.
    - A Silverlight Resizable TextBlock (and other resizable things): I think Michael Washington had the best comment about Colin Eberhardt's latest outing: "Another WOW example"... drop this in the pretty darn cool category: an attached behavior that uses a Thumb control within a popup to adorn any UI element so the user can resize it!

    Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream
    Join me @ SilverlightCream | Phoenix Silverlight User Group
    Technorati Tags: Silverlight, Silverlight 3, Silverlight 4, Windows Phone, MIX10

    Read the article

  • Bad Data is Really the Monster

    - by Dain C. Hansen
    "Bad Data is Really the Monster" is an article written by Bikram Sinha, from whom I borrowed the title and the inspiration for this blog. Sinha writes:

    "Bad or missing data makes application systems fail when they process order-level data. One of the key items in the supply-chain industry is the product (aka SKU). Therefore, it becomes the most important data element to tie up multiple merchandising processes including purchase order allocation, stock movement, shipping notifications, and inventory details… Bad data can cause huge operational failures and cost millions of dollars in terms of time, resources, and money to clean up and validate data across multiple participating systems."

    Yes, bad data really is the monster, so what do we do about it? Close our eyes and hope it stays in the closet? We've tackled this problem for some years now at Oracle, and our latest introduction of Oracle Enterprise Data Quality, along with our integrated Oracle Master Data Management products, provides a complete, best-in-class answer to the bad data monster.

    What's unique about it? Oracle Enterprise Data Quality combines powerful data profiling, cleansing, matching, and monitoring capabilities while offering unparalleled ease of use. What makes it unique is that it has dedicated capabilities to address the distinct challenges of both customer and product data quality (different monsters have different needs, of course!). The ability to profile data is just as important, to identify and measure poor-quality data and identify new rules and requirements. Included are semantic and pattern-based recognition to accurately parse and standardize data that is poorly structured. Finally, all of the data quality components are integrated with Oracle Master Data Management, including Oracle Customer Hub and Oracle Product Hub, as well as Oracle Data Integrator Enterprise Edition and Oracle CRM.

    Want to learn more? On Tuesday, Nov 15th, I invite you to listen to our webcast, Reduce ERP Consolidation Risks with Oracle Master Data Management. I'll be joined by our partner iGate Patni, and we'll be talking about one specific way to deal with the bad data monster, specifically around ERP consolidation. Look forward to seeing you there!

    Read the article

  • Migrating a blog from Orchard 0.5 to 0.9

    - by Bertrand Le Roy
    My personal blog still runs on Orchard 0.5, because the theme that I used to build it is not yet available for more recent versions, but it is still very important for me to know that I can migrate all my content and comments to a new version at any time. Fortunately, Nick Mayne has been consistently shipping a BlogML module a few days after each of the Orchard versions shipped. Because the module gallery for each version is behind a different URL and is kept alive even after a new one ships, it is very easy to install the module for both versions.

    Step 0: Setting up the migration environment. In order to do the migration, I made a local copy of the production site on my laptop (data included: I'm using SQL CE) and I also created a new local site with a fresh install of Orchard 0.9.

    Step 1: Enable the gallery feature on both versions. From the admin UI, go to Features and locate the Gallery feature under "Packaging". Enable it. You may now click on "Browse Gallery" on the 0.5 instance and "Modules" under "Gallery" for 0.9.

    Step 2: Install the BlogML module on both versions. From the gallery page, locate the BlogML module and install it. Do it on both versions. Then go to Features and enable BlogML under "Content Publishing". Do it on both versions.

    Step 3: Export from the 0.5 version. Click on "Manage Blog" then on "Export using BlogML" from the 0.5 version. The module then informs you of the path of the saved file.

    Step 4: Import into the 0.9 version. From the 0.9 version, click "Import" under "Blogs". Click the button to browse to the file that you just saved from 0.5. Then click "Upload file and Import".

    Step 5: Copy the 0.5 media folder into 0.9. Copy the contents of the 0.5 version's media folder into the media folder of the 0.9 version. Once that is done, you can delete the "Default/Blog Exports" subfolder.

    Step 6: Configure the target blog. Click "Manage Blog", then "Blog Properties" and restore any properties you had on the source blog. For me, it was the title and URL, as well as setting the blog as the home page and showing it on the main menu.

    Step 7: Republish the new site to the production server. Once this is done and everything works locally, you are ready to publish to the production site. I use FTP.

    Note: this should work just as well for any couple of versions for which the BlogML module exists, and not just for 0.5 and 0.9.

    Read the article

  • Getting Started with Kinect for Windows.

    - by Vishal
    Hello folks,

    Recently I got involved in a project to build a demo application for one of our customers with Kinect for Windows. Yes, something similar to what Tom Cruise did in the movie Minority Report: waving arms, moving stuff around, swipes, speech recognition, manipulating computer screens without even touching them. Pretty cool!!! The idea in the movie showed us how technology would look 50 years from that day.

    Well, that 50-year time frame got squeezed: recently, on Feb 1st 2012, Microsoft released the official Kinect for Windows SDK. That's just 10 years from the movie release. The product is in its early stages, but with developer creativity and continuously improving hardware, the features shown in the movie are not very far away from becoming a reality. Soon after releasing the SDK, Microsoft announced in March the release of its new Kinect for Windows SDK version 1.5, coming out sometime in May. More history about Kinect.

    Anyway, for a newbie with Kinect, where would you start? Here is what I would suggest:

    - Watch the Kinect for Windows Quickstart Series by Dan Fernandez.
    - Download the Kinect for Windows SDK and start playing around with the demos in it. It also comes with some basic Kinect documentation.
    - Coding4Fun Kinect Projects | lots more videos and open-source project information.
    - Kinect for Windows session at TechDays NL, demo by Jesus Rodriguez.
    - Book: Beginning Kinect Programming with the Microsoft Kinect SDK. | I did go through a few of the chapters in this book and, based on that, it talks deeply about core Kinect concepts in a very easy-to-understand way. I would definitely suggest this book for any Kinect developer. I liked the way it explained gesture recognition in Chapter 6.
    - Buy your Kinect device from either Amazon or Newegg. You will get it cheaper than buying it from the Microsoft Store. Personally, I love Newegg.com, as I've never had any order or shipping issues with them.
    - I always hate developing UI applications, but well, you need to get your hands dirty with WPF in order to work with Kinect. So get started with WPF concepts too.

    I will keep adding stuff to the list as I come across it, but so far the above list would definitely get you started building your first Kinect apps. Till then Happy Kinecting…!!!!!

    Thanks,
    Vishal Mody
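
    As a first "hello Kinect" beyond the SDK demos, here is a hedged C# sketch against the Kinect for Windows SDK v1 API (it assumes the SDK is installed and a sensor is plugged in):

        using System;
        using Microsoft.Kinect;

        class Program
        {
            static void Main()
            {
                // Grab the first sensor the runtime can see.
                KinectSensor sensor = KinectSensor.KinectSensors[0];

                sensor.SkeletonStream.Enable();
                sensor.SkeletonFrameReady += (s, e) =>
                {
                    using (SkeletonFrame frame = e.OpenSkeletonFrame())
                    {
                        if (frame != null)
                            Console.WriteLine("Tracking {0} skeleton slots", frame.SkeletonArrayLength);
                    }
                };

                sensor.Start();
                Console.ReadLine();   // keep running until Enter is pressed
                sensor.Stop();
            }
        }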

    Read the article

  • Stop trying to be perfect

    - by Kyle Burns
    Yes, Bob is my uncle too. I also think the points in the Manifesto for Software Craftsmanship (manifesto.softwarecraftsmanship.org) are all great. What amazes me is that people tend to confuse the term "well crafted" with "perfect". I'm about to say something that will make Quality Assurance managers, and many development types as well, cringe until you think about it as a craftsman: "Stop trying to be perfect".

    Now let me explain what I mean. Building software, as with building almost anything, often involves a series of trade-offs where either one undesired characteristic is accepted as necessary to achieve another desired one (or maybe stave off one that is even less desirable), or a desirable characteristic is sacrificed for the same reasons. This implies that perfection itself is unattainable. What is attainable is "sufficient", and I think that this really goes to the heart both of what people are trying to do with Agile and with the craftsmanship movement. Simply put, sufficient software drives the greatest business value.

    I've been in many meetings where "how can we keep anything from ever going wrong" has become the thing that holds us in analysis paralysis. I've also been the guy trying way too hard to perfect some function to make sure that every edge case is accounted for. Somewhere in there, something a drill instructor said while I was in boot camp occurred to me. In response to being asked a question by another recruit having to do with some edge case (I can barely remember the context), he said "What if grasshoppers had machine guns? Would the birds still **** with them?" It sounds funny, but there's a lot of wisdom in those words.

    "Sufficient" is different for every situation, and it's important to understand what sufficient means in the context of the work you're doing. If I'm writing a timesheet application (and please shoot me if I am), I'm going to have a much higher tolerance for imperfection than if you're writing software to control life support systems on spacecraft. I'm also likely to have less need for high-volume performance than if you're writing software to control stock trading transactions.

    I'd encourage anyone who has read this far to, instead of trying to be perfect, try to create software that is sufficient in every way. If you're working to make a component that is sufficient "better", ask yourself if there is any component left that is not yet sufficient. If the answer is "yes", you're working on the wrong thing and need to adjust. If the answer is "no", why aren't you shipping and delivering business value?

    Read the article

  • Will you share your SQL Server configuration?

    - by Bill Graziano
    I regularly visit client sites and review their SQL Server configurations.  I come across all kinds of strange settings.  I’ve been thinking about a way to aggregate people’s configurations and see what’s common and what’s unique.  I used to do that with polls on SQLTeam.com.  I think we can find out more interesting things if we look at combinations of settings in relation to size and volume.

    I’ve been working on an application for another project that is similar.  It will be fairly easy to use that code for this.  I can have something up and running in a few days – if people are interested in it.  I admit that I often come up with ideas that just don’t make sense.  This may be one of them.  One of your biggest concerns has to be how secure your data is.  My solution is not to store anything identifying.  The instance name and database names can both be “anonymized” and I don’t store the machine name or IP address or anything to do with logins.

    Some of the questions I’m curious about are:

    - At what size database does the Enterprise Edition become prevalent?
    - Given the total size of the databases, how much RAM is common?
    - How many people have multiple data files?  At what size does that become prevalent?
    - How common is database mirroring?  Replication?  Log shipping?
    - How common is full recovery mode?  At what data size does it become prevalent?

    I think those are all questions that are easy to answer -- with the right data.  The big question is whether or not people will share their SQL Server configurations.  I understand that organizations in regulated or high-security environments can’t participate.  But I think that leaves many, many people that can.  Are you willing to share your configuration and learn about others?  I have a simple sign-up form here.  It’s actually a mailing list signup that also captures your edition, number of servers and largest database.  The list will only be used for this project.  Is your SQL Server configured correctly?  Do you wonder what the next step is as your data grows?  Take a second and sign up.

    Read the article

  • Showrooming: What's the big deal?

    - by David Dorf
    There's been lots of chatter recently on how retailers will combat showrooming this holiday season.  Best Buy and Target, for example, plan to price-match certain online sites.  But from my perspective, the whole showrooming concept is overblown.  Yes, mobile phones make it easier to comparison-shop, but consumers have been doing that all along.  Retailers have to work hard to merchandise their stores with the right products at the right price with the right promotions.  It's Retail 101. Yeah, ok, many websites don't have to charge tax, so they have an advantage, but they also have to cover shipping costs. Brick-and-mortar stores have the opportunity to provide expertise, fit, and instant gratification, all of which are pretty big advantages.

    I see lots of studies that claim a large percentage of shoppers are showrooming.  Now I don't do much shopping, but when I do I rarely see anyone scanning UPC codes in the aisles.  If you dig into those studies, the question is usually something like, "have you used your mobile phone to price-compare while shopping in the last year."  Well yeah, I did it once -- out of the 20 shopping trips.  And by the way, the in-store price was close enough to just buy the item.  Based on casual observation and informal surveys of friends, showrooming is not the modus operandi for today's busy shoppers.

    I never see people showrooming in grocery stores, and most people don't bother for fashion.  For big purchases like appliances and furniture, I bet most people do their research online before entering the store.  The cases where I've done it were to see if a promotion was in fact a good deal.  Or even to make sure the in-store price is the same as the online price for the same brand.

    So, if you think you're a victim of showrooming, I suggest you look at the bigger picture.  Are you providing an engaging store experience?  Are you allowing customers to shop the way they want to shop, using various touchpoints?  Are you monitoring the competition to ensure prices are competitive?  Are your promotions attracting the right customers? Hubert Joly, CEO of Best Buy, recently commented that showrooming might just get more people into his stores. "Once customers are in our stores, they're ours to lose."

    Read the article

  • Solaris 11

    - by user9154181
    Oracle has a strict policy about not discussing product features until they appear in shipping product. Now that Solaris 11 is publicly available, it is time to catch up. I will shortly be posting articles on a variety of new developments in the Solaris linkers and related bits:

    - 64-bit Archives: After 40+ years of Unix, the archive file format has run out of room. The ar and link-editor (ld) commands have been enhanced to allow archives to grow past their previous 32-bit limits.
    - Guidance: The link-editor is now willing and able to tell you how to alter your link lines in order to build better objects.
    - Stub Objects: This is one of the bigger projects I've undertaken since joining the Solaris group. Stub objects are shared objects, built entirely from mapfiles, that supply the same linking interface as the real object while containing no code or data. You can link to them, but cannot use them at runtime. It was pretty simple to add this ability to the link-editor, but the changes to the OSnet in order to apply them to building Solaris were massive. I discuss how we came to invent stub objects, how we apply them to build the OSnet in a more parallel and scalable manner, and the follow-on opportunities that have emerged from the new stub proto area we created to hold them.
    - The elffile Utility: A new standard Solaris utility, elffile is a variant of the file utility, focused exclusively on linker-related files. elffile is of particular value for examining archives, as it allows you to find out what is inside them without having to first extract the archive members into temporary files.

    This release has been a long time coming. I joined the Solaris group in late 2005, and this will be my first FCS. From a user perspective, Solaris 11 is probably the biggest change to Solaris since Solaris 2.0. Solaris 11 polishes the ground-breaking features from Solaris 10 (DTrace, FMA, ZFS, Zones), and uses them to add a powerful new packaging system, numerous other enhancements and features, along with a huge modernization effort. I'm excited to see it go out into the world. I hope you enjoy using it as much as we did creating it. Software is never done. On to the next one...

    Read the article

  • Cloud Infrastructure has a new standard

    - by macoracle
    I have been working for more than two years now in the DMTF working group tasked with creating a cloud management standard. That work has culminated in the release today of the Cloud Infrastructure Management Interface (CIMI) version 1.0 by the DMTF. CIMI is a single interface that a cloud consumer can use to manage their cloud infrastructure in multiple clouds. As CIMI is adopted by cloud vendors, no longer will you need to adapt client code to each of the proprietary interfaces from these multiple vendors. Unlike a de facto standard, where typically one vendor has change control over the interface and everyone else has to reverse engineer its inner workings, CIMI is a de jure standard under the change control of a standards body. One reason the standard took two years to create is that we factored in use cases, requirements and contributed APIs from multiple vendors. These vendors have products shipping today, and as a result CIMI has a strong foundation in real-world experience.

    What does CIMI allow? CIMI is both a model for the resources (computing, storage, networking) in the cloud and a RESTful protocol binding to HTTP. This means that to create a Machine (guest VM), for example, the client creates a "document" that represents the Machine resource and sends it to the server using HTTP. CIMI allows the resources to be encoded in either JavaScript Object Notation (JSON) or the eXtensible Markup Language (XML).

    CIMI provides a model for the resources that can be mapped to any existing cloud infrastructure offering on the market. There are some features in CIMI that may not be supported by every cloud, but CIMI also supports the discovery of which features are implemented. This means that you can still have a client that works across multiple clouds and is able to take full advantage of the features in each of them.

    Isn't it too early for a standard? A key feature of a successful standard is that it allows for compatible extensions to occur within the core framework of the interface itself. CIMI's feature discovery (through metadata) is used to convey to the client that additional features, which may be vendor specific, have been implemented. As multiple vendors implement such features, they become candidates to add to future versions of CIMI. Thus innovation can continue in the cloud space without being slowed down by a lowest-common-denominator type of specification.

    Since CIMI was developed in the open by dozens of stakeholders who are already implementing infrastructure clouds, I expect to see CIMI adopted by these same companies and others over the next year or two. Cloud customers who can see the benefit of this standard should start to ask their cloud vendors to show a CIMI implementation in their roadmap. For more information on CIMI and the DMTF's other cloud efforts, go to: http://dmtf.org/cloud
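
    To make the "document over HTTP" idea concrete, here is a minimal C# sketch of what creating a Machine might look like. The endpoint URL is invented, and the JSON fields follow the general shape of a CIMI MachineCreate document rather than being copied from the spec, so treat all names here as illustrative:

        using System;
        using System.Net;

        class CimiSketch
        {
            static void Main()
            {
                // Hypothetical CIMI provider endpoint; a real client would discover
                // the machines collection URL from the provider's CloudEntryPoint.
                string machinesUrl = "https://cloud.example.com/cimi/machines";

                string machineCreate =
                    "{" +
                    "  \"name\": \"demo-vm\"," +
                    "  \"description\": \"Machine created via CIMI\"," +
                    "  \"machineTemplate\": {" +
                    "    \"machineConfig\": { \"href\": \"https://cloud.example.com/cimi/machineConfigs/small\" }," +
                    "    \"machineImage\":  { \"href\": \"https://cloud.example.com/cimi/machineImages/linux\" }" +
                    "  }" +
                    "}";

                using (WebClient client = new WebClient())
                {
                    client.Headers[HttpRequestHeader.ContentType] = "application/json";
                    // POST the document; the response describes the new Machine resource.
                    string response = client.UploadString(machinesUrl, "POST", machineCreate);
                    Console.WriteLine(response);
                }
            }
        }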

    Read the article

  • Understanding When Social Interactions Should Be Resolved in Another Channel

    - by Christina McKeon
    Guest Blogger: Aphrodite Brinsmead, Senior Analyst at Ovum

    Agents need to respond to customers’ social comments and questions quickly and in the right tone. But more importantly, they need to offer resolutions. Customers care most about how long it takes to find information rather than which channel they are using. They choose to use social media because they are comfortable with the channel and it offers a convenient way to communicate. Ideally agents will resolve questions within social media, but they need guidance as to how and when to escalate interactions to a more private channel.

    First, businesses should assess the way in which customers are using social media to communicate with them and categorize posts into groups: complaints, feedback, technical queries or more general support questions. They should then consider the types of interactions that can easily be handled within social media and those that need to be followed up in another channel. This will be very dependent on the industry. Examples of queries that can be resolved in social media include:

    - Shipping pricing and timeframes
    - Outage updates and resolution plans
    - Flight status information
    - Product stock check
    - Technical support videos or forum posts
    - Availability of facilities

    Both customers and agents need to be educated about the types of questions they can expect to resolve within social media. As the channel matures as a customer service tool, it needs to have value other than just as a forum for complaints.

    Social customer service agents need the power to start a web chat or phone call

    Any questions where customers need to divulge personal details in order to get a resolution will need to be addressed in a private channel: a private social message, web chat, email or phone call. Customers should never disclose their date of birth, social security, credit card number, or healthcare records in a public forum. Flight issues, changes to a booking, billing queries or account updates will all need to be completed via a private interaction. Agents responding to questions on social media need the ability to start a web chat or phone call with the customer. The customer doesn’t want to have to repeat their question and the agent should be empowered to connect customer records and access account or billing information. These agents will need to be trained across different channels and should be able to view all customer communications in one application. They also need to follow up questions that began on a public forum in the initial channel to make it clear that the issue was addressed. In order to make this possible, social media needs to be integrated as part of a broader customer service strategy.

    Irrespective of how many channels are used to complete an interaction, businesses should prioritize customer satisfaction and issue resolution. They need a clear strategy and trained agents that can handle and respond to social interactions. Follow me on Twitter @diteb.

    Read the article

  • Oracle EBS: Using "Search Helpers" to Find Solutions Faster

    - by Steve He(???)
    Oracle E-Business Suite support provides interactive "Search Helper" documents to help you diagnose common issues. Instead of reading through a long troubleshooting note to find the section that applies to your situation, a Search Helper lets you select the symptoms you are seeing and returns only the matching solutions, so you get to the relevant note much faster. See Doc ID 1501724.1 for an overview of the EBS "Search Helpers".

    For example, using the Receivables Transactions "Search Helper": select the "Entering / Updating Transactions" category, then the symptom "Transaction numbers are not in sequence", and you are pointed to Doc ID 197212.1: How To Setup Gapless Document Sequencing in Receivables.

    EBS products that currently offer "Search Helpers" include: Advanced Pricing, Applications Technology, Configurator, General Ledger, Human Capital Management, Inventory Management, Order Management, Payables, Process Manufacturing, Purchasing, Receivables, Shipping, and Value Chain Planning.

    Read the article

  • Postgresql base backup script

    - by Terry Lorber
    I'm using the following script to do a file-level backup of Postgresql. I sometimes see that the last part, to do cleanup after "pgs_backup_stop" is called, hangs while it waits for the last WAL to be created. The REF_FILE to search for is sometimes wrong. I'm also shipping these files to a different machine, every 5 minutes via rsync. What do other people do to safely remove old WAL files? #!/bin/bash PGDATA=/usr/local/pgsql/data WAL_ARCHIVE=/usr/local/pgsql/archives PGBACKUP=/usr/local/pgsqlbackup PSQL=/usr/local/pgsql/bin/psql today=`date +%Y%m%d-%H%M%S` label=base_backup_${today} echo "Executing pg_start_backup with label $label in server ... " CP=`$PSQL -q -Upostgres -d template1 -c "SELECT pg_start_backup('$label');" -P tuples_only -P format=unaligned` RVAL=$? echo "Begin CheckPoint is $CP" if [ ${RVAL} -ne 0 ] then echo "PSQL pg_start_backup failed" exit 1; fi echo "pg_start_backup executed successfully" echo "TAR begins ... " pushd $PGBACKUP tar -cjf pgdata-$today.tar.bz2 --exclude='pg_xlog' $PGDATA/* popd echo "TAR completed" echo "Executing pg_stop_backup in server ... " $PSQL -Upostgres template1 -c "SELECT pg_stop_backup();" if [ $? -ne 0 ] then echo "PSQL pg_stop_backup failed" exit 1; fi echo "pg_stop_backup done successfully" TO_SEARCH="*${CP:0:2}000000${CP:3:2}.00${CP:5}" echo "Check for ${WAL_ARCHIVE}/${TO_SEARCH}.backup" while [ ! -e ${WAL_ARCHIVE}/${TO_SEARCH}.backup ]; do echo "Waiting for ${WAL_ARCHIVE}/${TO_SEARCH}.backup" sleep 1 done REF_FILE="`echo ${WAL_ARCHIVE}/*${CP:0:2}000000${CP:3:2}`" echo "Reference file ${REF_FILE}" # "-not -newer" or "\! -newer" will also return REF_FILE # so you have to grep it out and use xargs; otherwise you # could also use the -delete action find ${WAL_ARCHIVE} -not -newer ${REF_FILE} -type f | grep -v "^${REF_FILE}$" | xargs rm -f REF_FILE="`echo ${PGBACKUP}/pgdata-$today.tar.bz2`" echo "Reference file ${REF_FILE}" find $PGBACKUP -not -newer ${REF_FILE} -type f -name pgdata* | grep -v "^${REF_FILE}$" | xargs rm -f

    Read the article

  • Memory upgrade to 8GB on unibody MacBook

    - by dlamblin
    I bought the 13" unibody MacBook one month prior to its upgrade to become the new improved 13" unibody MacBook Pro (with unopenable battery compartment, extra 3 hours of battery life, more color gamut, SD slot, FireWire 800 port, and a new 8GB memory limit). My MacBook says it is limited to 4GB of DDR3 1066MHz memory, in 2 SO-DIMMs. But that was written back when you couldn't get 4GB SO-DIMMs. Now that you can get them, the very similar MacBook Pro is shipping with 4GB standard, and can be upgraded to 8GB. I've asked (Apple reps), and I'm repeatedly told that my model cannot be upgraded to 8. When I ask for the reason they always say: "Because that's the published limit at the time your Mac was built." I find this unconvincing. If they said: "Because the memory controller in the chipset is limited to 4GB, despite being seemingly identical to the memory controller in the same chipset in the newest MacBook model," then I'd just take their word for it. Has anyone tried it, or found any research as to whether two 4GB DDR3 1066MHz SO-DIMMs can be installed in the unibody MacBook without FireWire?

    Read the article

< Previous Page | 10 11 12 13 14 15 16 17 18 19 20  | Next Page >