Search Results

Search found 21777 results on 872 pages for 'howard may'.

  • Good Practices in Software Outsourcing

    When an organization has decided to outsource a job to another company outside its organizational structure, it may have to deal with risks pertaining to software outsourcing. Hiring a stranger to ge... [Author: Chirag Vyas - Web Design and Development - April 09, 2010]

    Read the article

  • The importance of Unit Testing in BI

    - by Davide Mauri
    One of the main steps in the process we use internally to develop a BI solution is the implementation of unit tests of your BI data. As you may already know, I’ve created a simple (for now) tool that leverages NUnit to let us quickly create unit tests without having to resort to Visual Studio Database Professional: http://queryunit.codeplex.com/ Once you have a tool like this, you can also start to make sure that your BI solution (DWH and cube) is not only structurally sound (I mean, the cube or the report gets processed correctly), but also that the logical integrity of your business rules is enforced. For example, let’s say the customer tells you that they will never create an invoice for a specific product line in 2010, since that product line has been dismissed and will never be sold again. We know that this is true in theory, but much of this business rule’s effectiveness depends on people not making mistakes while inserting new orders/invoices, and on the ERP implementing a check for this business logic. Unfortunately, these last two hypotheses are not always true, so you may find yourself with invoices for a product line that doesn’t exist anymore. Maybe this kind of situation will be solved in the future using Master Data Management but, meanwhile, how can you give your customers an idea of the data quality? How can you check that the logical integrity of the analytical data you produce is exactly what you expect? Well, unit testing of a DWH or a cube can be a solution. Once you have defined your test suite by writing SQL and MDX queries that check that your data is what you expect it to be, and if you use NUnit (as QueryUnit does), you can then use a tool like NUnit2Report to create a nice HTML report that can be shipped via email to give information on data quality. In addition, since NUnit produces an XML file as a result, you can also import it into a SQL Server database and then monitor the quality of data over time. I’ll be speaking about this approach (and, more generally, about how to “engineer” a BI solution) at the next European SQL PASS: Adaptive BI Best Practices http://www.sqlpass.org/summit/eu2010/Agenda/ProgramSessions/AdaptiveBIBestPratices.aspx I’ll enjoy discussing all of this with you, so see you there! And remember: “if it ain’t tested, it’s broken!” (Sorry, I don’t remember who said that first :-))
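
    To make the idea concrete, here is a rough sketch (table, column and product-line names are invented for illustration, not taken from the post) of the kind of SQL check described above; registered with QueryUnit/NUnit, the test would assert that the query returns 0:

    ```sql
    -- Hypothetical data-quality test: the dismissed product line must have no
    -- invoices dated in 2010 or later. QueryUnit/NUnit would assert the result is 0.
    SELECT COUNT(*) AS ViolatingRows
    FROM   dbo.FactInvoice AS f
           JOIN dbo.DimProduct AS p ON p.ProductKey = f.ProductKey
    WHERE  p.ProductLine = 'Discontinued Line'
      AND  f.InvoiceDate >= '20100101';
    ```

    An equivalent MDX query against the cube would verify that the same rule still holds after processing.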

    Read the article

  • Don't Cut Corners on Server Defragmentation

    Hard-Core Hardware: Fragmentation may not cut it as a big screen villain, but it remains a threat and handicap to optimal server performance. In this era of massive hard drives and virtualization, minimizing fragmentation is more critical than ever.

    Read the article

  • How to install postgresql-8.4?

    - by ted
    sudo add-apt-repository ppa:pitti/postgresql   # ... OK
    sudo apt-get install postgresql-8.4

    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    Package postgresql-8.4 is not available, but is referred to by another package.
    This may mean that the package is missing, has been obsoleted, or is
    only available from another source
    E: Package 'postgresql-8.4' has no installation candidate

    Should I download it from http://packages.debian.org/squeeze/postgresql-8.4 and manually install all the dependencies?

    Read the article

  • Building dynamic OLAP data marts on-the-fly

    - by DrJohn
    At the forthcoming SQLBits conference, I will be presenting a session on how to dynamically build an OLAP data mart on-the-fly. This blog entry is intended to clarify exactly what I mean by an OLAP data mart, why you may need to build them on-the-fly and finally outline the steps needed to build them dynamically. In subsequent blog entries, I will present exactly how to implement some of the techniques involved.

    What is an OLAP data mart? In data warehousing parlance, a data mart is a subset of the overall corporate data provided to business users to meet specific business needs. Of course, the term does not specify the technology involved, so I coined the term "OLAP data mart" to identify a subset of data which is delivered in the form of an OLAP cube which may be accompanied by the relational database upon which it was built. To clarify, the relational database is specifically created and loaded with the subset of data, and then the OLAP cube is built and processed to make the data available to the end-users via standard OLAP client tools.

    Why build OLAP data marts? Market research companies sell data to their clients to make money. To gain competitive advantage, market research providers like to "add value" to their data by providing systems that enhance analytics, thereby allowing clients to make best use of the data. As such, OLAP cubes have become a standard way of delivering added value to clients. They can be built on-the-fly to hold specific data sets and meet particular needs and then hosted on a secure intranet site for remote access, or shipped to clients' own infrastructure for hosting. Even better, they support a wide range of different tools for analytical purposes, including the ever-popular Microsoft Excel.

    Extension Attributes: The Challenge. One of the key challenges in building multiple OLAP data marts based on the same 'template' is handling extension attributes. These are attributes that meet the client's specific reporting needs, but do not form part of the standard template. Now clearly, these extension attributes have to come into the system via additional files and ultimately be added to relational tables so they can end up in the OLAP cube. However, processing these files and filling dynamically altered tables with SSIS is a challenge, as SSIS packages tend to break as soon as the database schema changes. There are two approaches to this: (1) dynamically build an SSIS package in memory to match the new database schema using C#, or (2) have the extension attributes provided as name/value pairs so the file's schema does not change and can easily be loaded using SSIS. The problem with the first approach is the complexity of writing an awful lot of complex C# code. The problem with the second approach is that name/value pairs are useless to an OLAP cube, so they have to be pivoted back into a proper relational table somewhere in the data load process WITHOUT breaking SSIS. How this can be done will be part of a future blog entry.

    What is involved in building an OLAP data mart? There are a great many steps involved in building OLAP data marts on-the-fly. The key point is that all the steps must be automated to allow for the production of multiple OLAP data marts per day (i.e. many thousands, each with its own specific data set and attributes). Now most of these steps have a great deal in common with standard data warehouse practices. The key difference is that the databases are all built to order.

    The only permanent database is the metadata database (shown in orange) which holds all the metadata needed to build everything else (i.e. client orders, configuration information, connection strings, client-specific requirements and attributes etc.). The staging database (shown in red) has a short life: it is built, populated and then ripped down as soon as the OLAP data mart has been populated. In the diagram below, the OLAP data mart comprises the two blue components: the Data Mart, which is a relational database, and the OLAP Cube, which is an OLAP database implemented using Microsoft Analysis Services (SSAS). The client may receive just the OLAP cube or both components together, depending on their reporting requirements.

    So, in broad terms the steps required to fulfil a client order are as follows:

    Step 1: Prepare metadata
    - Create a set of database names unique to the client's order
    - Modify all package connection strings to be used by SSIS to point to the new databases and file locations

    Step 2: Create relational databases
    - Create the staging and data mart relational databases using dynamic SQL, and set the database recovery mode to SIMPLE as we do not need the overhead of logging anything
    - Execute SQL scripts to build all database objects (tables, views, functions and stored procedures) in the two databases

    Step 3: Load staging database
    - Use SSIS to load all data files into the staging database in a parallel operation
    - Load extension files containing name/value pairs; these will provide client-specific attributes in the OLAP cube

    Step 4: Load data mart relational database
    - Load the data from staging into the data mart relational database, again in parallel where possible
    - Allocate surrogate keys and use SSIS to perform surrogate key lookup during the load of fact tables

    Step 5: Load extension tables & attributes
    - Pivot the extension attributes from their native name/value pairs into proper relational tables (a rough sketch of this follows below)
    - Add the extension attributes to the views used by the OLAP cube

    Step 6: Deploy & process OLAP cube
    - Deploy the OLAP database directly to the server using a C# script task in SSIS
    - Modify the connection string used by the OLAP cube to point to the data mart relational database
    - Modify the cube structure to add the extension attributes to both the data source view and the relevant dimensions
    - Remove any standard attributes that are not required
    - Process the OLAP cube

    Step 7: Backup and drop databases
    - Drop the staging database as it is no longer required
    - Backup the data mart relational and OLAP databases and ship these to the client's infrastructure
    - Drop the data mart relational and OLAP databases from the build server
    - Mark the order complete
    - Start processing the next order, ad infinitum

    So my future blog posts and my forthcoming session at the SQLBits conference will all focus on some of the more interesting aspects of building OLAP data marts on-the-fly, such as handling the load of extension attributes and how to dynamically alter the structure of an OLAP cube using C#.
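
    As a rough illustration of Step 5 (and of the name/value challenge described above), the pivot can be done in plain T-SQL inside a SQL task, so the SSIS package itself never has to know the attribute names; all identifiers below are hypothetical, not taken from the session:

    ```sql
    -- Hypothetical staging table loaded by SSIS as name/value pairs:
    --   staging.CustomerExtension (CustomerKey INT, AttributeName NVARCHAR(128), AttributeValue NVARCHAR(256))
    -- Pivot it into one row per customer so the attributes become ordinary columns
    -- that the data mart views (and therefore the cube) can consume.
    SELECT CustomerKey,
           [Region]      AS Region,
           [LoyaltyBand] AS LoyaltyBand
    FROM  (SELECT CustomerKey, AttributeName, AttributeValue
           FROM   staging.CustomerExtension) AS src
    PIVOT (MAX(AttributeValue)
           FOR AttributeName IN ([Region], [LoyaltyBand])) AS pvt;
    ```

    In practice the IN list would presumably be assembled with dynamic SQL from the metadata database, since each client order can carry a different set of extension attributes.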

    Read the article

  • Quick Poll: Certification Information Preferences

    - by Paul Sorensen
    We're starting a new "quick poll" series so that we can better learn about you - our technical professionals who are either already Oracle certified or working on earning an Oracle credential. We aim to keep them short (~1 minute to answer) so that you'll share your opinion. This week we want to know how you prefer to get your information about Oracle Certification: TAKE THE QUICK POLL. NOTE: You can only take the survey once per machine (if you try a second time it may redirect you to an external website).

    Read the article

  • Advice on designing a robust program to handle a large library of meta-information & programs

    - by Sam Bryant
    So this might be overly vague, but here it is anyway. I'm not really looking for a specific answer, but rather general design principles or direction towards resources that deal with problems like this. It's one of my first large-scale applications, and I would like to do it right.

    Brief Explanation: My basic problem is that I have to write an application that handles a large library of meta-data, can easily modify the meta-data on-the-fly, is robust with respect to crashing, and is very efficient. (Sorta like the design parameters of iTunes, although sometimes iTunes performs more poorly than I would like.) If you don't want to read the details, you can skip the rest.

    Long Explanation: Specifically, I am writing a program that creates a library of image files and meta-data about these files. There is a list of tags that may or may not apply to each image. The program needs to be able to add new images, new tags, assign tags to images, and detect duplicate images, all while operating. The program contains an image Viewer which has tagging operations. The idea is that if a given image A is viewed while the library has tags T1, T2, and T3, then that image will have boolean flags for each of those tags (depending on whether the user tagged that image while it was open in the Viewer). However, prior to being viewed in the Viewer, image A would have no value for tags T1, T2, and T3. Instead it would have a "dirty" flag indicating that it is unknown whether or not A has these tags. The program can introduce new tags at any time (which would automatically set all images to "dirty" with respect to this new tag). This program must be fast. It must easily be able to pull up a list of images with or without a certain tag, as well as images which are "dirty" with respect to a tag. It has to be crash-safe, in that if it suddenly crashes, all of the tagging information done in that session is not lost (though perhaps it's okay to lose some of it). Finally, it has to work with a lot of images (10,000). I am a fairly experienced programmer, but I have never tried to write a program with such demanding needs and I have never worked with databases. With respect to the meta-data storage, there seem to be a few design choices:

    Choice 1: Individual meta-data vs centralized meta-data. Individual Meta-Data: have a separate meta-data file for each image. This way, as soon as you change the meta-data for an image, it can be written to the hard disk, without having to rewrite the information for all of the other images. Centralized Meta-Data: have a single file to hold the meta-data for every file. This would probably require meta-data writes in intervals as opposed to after every change. The benefit here is that you could keep a centralized list of all images with a given tag, etc., making the task of pulling up all images with a given tag very efficient.
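
    As a very rough sketch of the centralized option (all names invented, shown here in SQL Server syntax, though an embedded database such as SQLite would work the same way), a small relational schema keeps both the per-pair boolean flags and the "dirty" lookup cheap:

    ```sql
    -- Illustrative schema for the "centralized meta-data" option. A row in ImageTag
    -- records an explicit yes/no for that image/tag pair; a missing row means the
    -- pair is still "dirty" (not yet evaluated in the Viewer).
    CREATE TABLE dbo.Image (
        ImageId  INT IDENTITY PRIMARY KEY,
        FilePath NVARCHAR(400) NOT NULL UNIQUE,
        FileHash CHAR(64)      NOT NULL        -- e.g. SHA-256, for duplicate detection
    );

    CREATE TABLE dbo.Tag (
        TagId   INT IDENTITY PRIMARY KEY,
        TagName NVARCHAR(100) NOT NULL UNIQUE
    );

    CREATE TABLE dbo.ImageTag (
        ImageId INT NOT NULL REFERENCES dbo.Image (ImageId),
        TagId   INT NOT NULL REFERENCES dbo.Tag (TagId),
        HasTag  BIT NOT NULL,                  -- 1 = tagged, 0 = explicitly untagged
        PRIMARY KEY (ImageId, TagId)
    );

    -- Images still "dirty" with respect to a given tag: no ImageTag row exists yet.
    DECLARE @TagId INT = 42;                   -- example value
    SELECT i.ImageId
    FROM   dbo.Image AS i
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dbo.ImageTag AS it
                       WHERE  it.ImageId = i.ImageId
                         AND  it.TagId   = @TagId);
    ```

    Committing each tagging action as its own small transaction also addresses the crash-safety requirement, since at most the last uncommitted change is lost.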

    Read the article

  • Work Item Traceability in TFS 2010

    - by Sam Patrick
    I have created a Windows Forms project (VS solution) under a TFS 2010 project. I may eventually add more solutions to the TFS project. My question: can we create a Use Case WIT for a specific solution within a TFS project? Furthermore, is it possible to create a "traceability matrix" that starts at the Use Case level and goes down to the code level (at least the namespace level) of that particular VS solution?

    Read the article

  • How to Install Moodle to subdomain with softaculous via cpanel

    - by Sean
    Hi there, I installed Moodle to a directory with Softaculous, as it doesn't allow installing to a subdomain. After the install I created a subdomain and pointed its destination to the previously created Moodle directory. Now when I go to subdomain.example.com it says: "Incorrect access detected, this server may be accessed only through "http://example.com/moodle" address, sorry. Please notify server administrator." Any suggestions much appreciated! I must be doing something wrong; when installing, it was very similar to these instructions

    Read the article

  • From Zero To Deployed Contest&ndash;Prizes Announced

    - by Robz / Fervent Coder
    Do you have what it takes to meet the challenge? We’ll make it worth it. You may have noticed at the end of my last post I threw down the community challenge to get from zero to deployed faster than me. The Challenge My time was 13:48 to be from zero to deployed. Beat my time and show it in a video response. The person with the best time by March 15th @ 11:59PM CST will receive a prize. Here are the links to the videos: #1 - http://www.youtube.com/watch?v=cZIUVfHWsbc #2 - http://www.youtube.com/watch?v=l7WluaXIya0 #3 - http://www.youtube.com/watch?v=IqPh7wbWsLc The Rules Let’s revisit those ground rules before I tell you what the prizes will be: Ground rules: .NET Application with a valid database connection Start from Zero Deployed with AppHarbor or an alternative A timer displayed in the video that runs during the entire process Video response published on YouTube or acceptable alternative Video(s) must be published by March 15th at 11:59PM CST. Either post the link here as a comment or on YouTube as a response (also by 11:59PM CST March 15th) The Prizes The prize package for the best time is:                      $50 Gift Card or equivalent – Provided by yours truly. AppHarbor $100 service credit – AppHarbor will provide a $100 credit for their services once they launch payments. Thank you to the folks at AppHarbor! ReSharper - Jetbrains will provide a FULL license of ReSharper Personal. This license is a $199 value. Thank you to the folks at Jetbrains! Telerik Ultimate Collection for .NET – Telerik will provide a license to pretty much every .NET tool they offer. This license is a $1999 value. A big thank you to the folks at Telerik!! This is a total value of $2348!!! The prize package for the person that has the most creative video(s) with a time better than mine (if there are at least 5 responses):           $20 Gift card or equivalent – Provided by this guy. AppHarbor $50 service credit – same deal as above. Thank you AppHarbor! Twilio T-Shirt - Twilio has donated a shirt and will ship your size to you (this may be subject to US residents only). This is a $25 value. Thank you to the folks at Twilio!

    Read the article

  • How to prevent a hacked-server from spoofing a master server?

    - by Cody Smith
    I wish to set up a room-based multiplayer game model where players may host matches and serve as the host (i.e. the server with authoritative power). I wish to host a master server which tracks players' items, rank, cash, exp, etc. In such a model, how can I prevent someone who is hosting a game (with a modified server) from spoofing the master server with invalid match results, thus gaining exp, money or rankings? Thanks. -Cody

    Read the article

  • How to Make Money Online Part 4 - Keyword Research is Critical

    Last time we looked at choosing the best products on ClickBank to promote, now we will look at keyword research, and how critical it is to make money online. You may have heard the term keyword research if you have been looking around for a while, but if you are looking to make or earn money online for the first time, I will give a brief description of what keyword research is and why it is so important.

    Read the article

  • Do I need to replace my hard disk?

    - by Sneha Kamath
    Hello everyone. Whenever I start my computer, Ubuntu pops up the following error: "A hard disk may be failing: one or more hard disks report health problems." A friend of mine ran some tests and found that my hard disk has 74 bad sectors. Is this merely a software issue that will be solved by a complete format of my hard disk, or is it a hardware issue, meaning I will have to replace my hard disk? Awaiting your responses. Thanks, Sneha Kamath.

    Read the article

  • Determining distribution of NULL values

    - by AaronBertrand
    Today on the twitter hash tag #sqlhelp, @leenux_tux asked: How can I figure out the percentage of fields that don't have data ? After further clarification, it turns out he is after what proportion of columns are NULL. Some folks suggested using a data profiling task in SSIS . There may be some validity to that, but I'm still a fan of sticking to T-SQL when I can, so here is how I would approach it: Create a #temp table or @table variable to store the results. Create a cursor that loops through all...(read more)
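
    Without reproducing the full cursor-based solution from the article, the per-table calculation it loops over looks roughly like this (table and column names are invented):

    ```sql
    -- Hypothetical single-table version: percentage of NULLs in each column.
    -- The article generalizes this by looping over every table/column with a cursor
    -- and collecting the results in a #temp table.
    SELECT
        100.0 * SUM(CASE WHEN Email     IS NULL THEN 1 ELSE 0 END) / NULLIF(COUNT(*), 0) AS Email_PctNull,
        100.0 * SUM(CASE WHEN Phone     IS NULL THEN 1 ELSE 0 END) / NULLIF(COUNT(*), 0) AS Phone_PctNull,
        100.0 * SUM(CASE WHEN BirthDate IS NULL THEN 1 ELSE 0 END) / NULLIF(COUNT(*), 0) AS BirthDate_PctNull
    FROM dbo.Customer;
    ```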

    Read the article

  • Self-signed certificates for a known community

    - by costlow
    Recently announced changes scheduled for Java 7 update 51 (January 2014) have established that the default security slider will require code signatures and the Permissions Manifest attribute. Code signatures are a common practice recommended in the industry because they help determine that the code your computer will run is the same code that the publisher created. This post is written to help users that need to use self-signed certificates without involving a public Certificate Authority.

    The role of self-signed certificates within a known community
    You may still use self-signed certificates within a known community. The difference between self-signed and purchased-from-CA is that your users must import your self-signed certificate to indicate that it is valid, whereas Certificate Authorities are already trusted by default. This works for known communities where people will trust that my certificate is mine, but does not scale widely where I cannot actually contact or know the systems that will need to trust my certificate. Public Certificate Authorities are widely trusted already because they abide by many different requirements and frequent checks. An example would be students in a university class sharing their public certificates on a mailing list or web page, employees publishing on the intranet, or a system administrator rolling certificates out to end-users. Managed machines help this because you can automate the rollout, but they are not required -- the major point is simply that people will trust and import your certificate.

    How to distribute self-signed certificates for a known community
    There are several steps required to distribute a self-signed certificate to users so that they will properly trust it. These steps are:
    - Creating a public/private key pair for signing
    - Exporting your public certificate for others
    - Importing your certificate onto machines that should trust you
    - Verifying the work on a different machine

    Creating a public/private key pair for signing
    Having a public/private key pair will give you the ability both to sign items yourself and issue a Certificate Signing Request (CSR) to a certificate authority. Create your public/private key pair by following the instructions for creating key pairs. Every Certificate Authority that I looked at provided similar instructions, but for the sake of cohesiveness I will include the commands that I used here.

    Generate the key pair:
    keytool -genkeypair -alias erikcostlow -keyalg EC -keysize 571 -validity 730 -keystore javakeystore_keepsecret.jks

    Provide a good password for this file. The alias "erikcostlow" is my name and therefore easy to remember. Substitute your name or something like "mykey." The sigalg of EC (Elliptical Curve) and keysize of 571 will give your key a good strong lifetime. All keys are set to expire. Two years or 730 days is a reasonable compromise between not-long-enough and too-long. Most public Certificate Authorities will sign something for one to five years. You will be placing your keys in javakeystore_keepsecret.jks -- this file will contain private keys and therefore should not be shared. If someone else gets these private keys, they can impersonate your signature. Please be cautious about automated cloud backup systems and private key stores.

    Answer all the questions. It is important to provide good answers because you will stick with them for the "-validity" days that you specified above.
    What is your first and last name?  [Unknown]:  First Last
    What is the name of your organizational unit?  [Unknown]:  Line of Business
    What is the name of your organization?  [Unknown]:  MyCompany
    What is the name of your City or Locality?  [Unknown]:  City Name
    What is the name of your State or Province?  [Unknown]:  CA
    What is the two-letter country code for this unit?  [Unknown]:  US
    Is CN=First Last, OU=Line of Business, O=MyCompany, L=City, ST=CA, C=US correct?  [no]:  yes
    Enter key password for <erikcostlow>  (RETURN if same as keystore password):

    Verify your work:
    keytool -list -keystore javakeystore_keepsecret.jks
    You should see your new key pair.

    Exporting your public certificate for others
    Public Key Infrastructure relies on two simple concepts: the public key may be made public and the private key must be private. By exporting your public certificate, you are able to share it with others who can then import the certificate to trust you.
    keytool -exportcert -keystore javakeystore_keepsecret.jks -alias erikcostlow -file erikcostlow.cer
    To verify this, you can open the .cer file by double-clicking it on most operating systems. It should show the information that you entered during the creation prompts. This is the file that you will share with others. They will use this certificate to prove that artifacts signed by this certificate came from you. If you do not manage machines directly, place the certificate file on an area that people within the known community should trust, such as an intranet page.

    Import the certificate onto machines that should trust you
    In order to trust the certificate, people within your known network must import your certificate into their keystores. The first step is to verify that the certificate is actually yours, which can be done through any channel: email, phone, in-person, etc. Known networks can usually do this.
    Determine the right keystore:
    - For an individual user looking to trust another, the correct file is within that user’s directory, e.g. USER_HOME\AppData\LocalLow\Sun\Java\Deployment\security\trusted.certs
    - For system-wide installations, Java’s Certificate Authorities are in JAVA_HOME, e.g. C:\Program Files\Java\jre8\lib\security\cacerts
    - File paths for Mac and Linux are included in the link above.
    Follow the instructions to import the certificate into the keystore:
    keytool -importcert -keystore THEKEYSTOREFROMABOVE -alias erikcostlow -file erikcostlow.cer
    In this case, I am still using my name for the alias because it’s easy for me to remember. You may also use an alias of your company name.

    Scaling distribution of the import
    The easiest way to apply your certificate across many machines is to just push the .certs or cacerts file onto them. When doing this, watch out for any changes that people would have made to this file on their machines.
    - Trusted.certs: when publishing into user directories, your file will overwrite any keys that the user has added since the last update.
    - CACerts: it is best to re-run the import command with each installation rather than just overwriting the file. If you just keep the same cacerts file between upgrades, you will overwrite any CAs that have been added or removed. By re-importing, you stay up to date with changes.

    Verify work on a different machine
    Verification is a way of checking on the client machine to ensure that it properly trusts signed artifacts after you have added your signing certificate. Many people have started using deployment rule sets. You can validate the deployment rule set by:
    - Creating and signing the deployment rule set on the computer that holds the private key
    - Copying the deployment rule set on to the different machine where you have imported the signing certificate
    - Verifying that the Java Control Panel’s security tab shows your deployment rule set

    Verifying an individual JAR file or multiple JAR files
    You can test a certificate chain by using the jarsigner command:
    jarsigner -verify filename.jar
    If the output does not say "jar verified" then run the following command to see why:
    jarsigner -verify -verbose -certs filename.jar
    Check the output for the term “CertPath not validated.”

    Read the article

  • SharePoint 2010 Service Pack 1

    - by Ricardo Peres
    Install the updates in this order:
    1. SharePoint Foundation 2010 SP1
    2. SharePoint Foundation 2010 Language Packs SP1
    3. SharePoint Server 2010 SP1
    4. SharePoint Server 2010 SP1 Language Packs SP1
    5. June 2011 Cumulative Update
    After installing any of these packages, run the SharePoint 2010 Products Configuration Wizard. Also, you may want to read these: Service Pack 1 (SP1) for Microsoft SharePoint Foundation 2010 and Microsoft SharePoint Server 2010 (white paper); Description of SharePoint Server 2010 SP1

    Read the article

  • Windows 8 Launch&ndash;Why OEM and Retailers Should STFU

    - by D'Arcy Lussier
    Microsoft has gotten a lot of flack for the Surface from OEM/hardware partners who create Windows-based devices and I’m sure, to an extent, retailers who normally stock and sell Windows-based devices. I mean we all know how this is supposed to work – Microsoft makes the OS, partners make the hardware, retailers sell the hardware. Now Microsoft is breaking the rules by not only offering their own hardware but selling them via online and through their Microsoft branded stores! The thought has been that Microsoft is trying to set a standard for the other hardware companies to reach for. Maybe. I hope, at some level, Microsoft may be covertly responding to frustrations associated with trusting the OEMs and Retailers to deliver on their part of the supply chain. I know as a consumer, I’m very frustrated with the Windows 8 launch. Aside from the Surface sales, there’s nothing happening at the retail level. Let me back up and explain. Over the weekend I visited a number of stores in hopes of trying out various Windows 8 devices. Out of three retailers (Staples, Best Buy, and Future Shop), not *one* met my expectations. Let me be honest with you Staples, I never really have high expectations from your computer department. If I need paper or pens, whatever, but computers – you’re not the top of my list for price or selection. Still, considering you flaunted Win 8 devices in your flyer I expected *something* – some sign of effort that you took the Windows 8 launch seriously. As I entered the 1910 Pembina Highway location in Winnipeg, there was nothing – no signage, no banners – nothing that would suggest Windows 8 had even launched. I made my way to the laptops. I had to play with each machine to determine which ones were running Windows 8. There wasn’t anything on the placards that made it obvious which were Windows 8 machines and which ones were Windows 7. Likewise, there was no easy way to identify the touch screen laptop (the HP model) from the others without physically touching the screen to verify. Horrible experience. In the same mall as the Staples I mentioned above, there’s a Future Shop. Surely they would be more on the ball. I walked in to the 1910 Pembina Highway location and immediately realized I would not get a better experience. Except for the sign by the front door mentioning Windows 8, there was *nothing* in the computer department pointing you to the Windows 8 devices. Like in Staples, the Win 8 laptops were mixed in with the Win 7 ones and there was nothing notable calling out which ones were running Win 8. I happened to hit up the St. James Street location today, thinking since its a busier store they must have more options. To their credit, they did have two staff members decked out in Windows 8 shirts and who were helping a customer understand Windows 8. But otherwise, there was nothing highlighting the Windows 8 devices and they were again mixed in with the rest of the Win 7 machines. Finally, we have the St. James Street Best Buy location here in Winnipeg. I’m sure Best Buy will have their act together. Nope, not even close. Same story as the others: minimal signage (there was a sign as you walked in with a link to this schedule of demo days), Windows 8 hardware mixed with the rest of the PC offerings, and no visible call-outs identifying which were Win 8 based. This meant that, like Future Shop and Staples, if you wanted to know which machine had Windows 8 you had to go and scrutinize each machine. 
Also, there was nothing identifying which ones were touch based and which were not. Just Another Day… To these retailers, it seemed that the Windows 8 launch was just another day, with another product to add to the showroom floor. Meanwhile, Apple has their dedicated areas *in all three stores*. It was dead simple to find where the Apple products were compared to the Windows 8 products. No wonder Microsoft is starting to push their own retail stores. No wonder Microsoft is trying to funnel orders through them instead of relying on these bloated retail big box stores who obviously can’t manage a product launch. It’s Not Just The Retailers… Remember when the Acer CEO, Founder, and President of Computer Global Operations all weighed in on how Microsoft releasing the Surface would have a “huge negative impact for the ecosystem and other brands may take a negative reaction”? Also remember the CEO stating “[making hardware] is not something you are good at so please think twice”? Well the launch day has come and gone, and so far Microsoft is the only one that delivered on having hardware available on the October 26th date. Oh sure, there are laptops running Windows 8 – but all in one desktop PCs? I’ve only seen one or two! And tablets are *non existent*, with some showing an early to late November availability on Best Buy’s website! So while the retailers could be doing more to make it easier to find Windows 8 devices, the manufacturers could help by *getting devices into stores*! That’s supposedly something that these companies are good at, according to the Acer CEO. So Here’s What the Retailers and Manufacturers Need To Do… Get Product Out The pivotal timeframe will be now to the end of November. We need to start seeing all these fantastic pieces of hardware ship – including the Samsung ATIV Smart PC Pro, the Acer Iconia, the Asus TAICHI 21, and the sexy Samsung Series 7 27” desktop. It’s not enough to see product announcements, we need to see actual devices. Make It Easy For Customers To Find Win8 Devices You want to make it easy to sell these things? Make it easy for people to find them! Have staff on hand that really know how these devices run and what can be done with them. Don’t just have a single demo day, have people who can demo it every day! Make It Easy to See the Features There’s touch screen desktops, touch screen laptops, tablets, non-touch laptops, etc. People need to easily find the features for each machine. If I’m looking for a touch-laptop, I shouldn’t need to sift through all the non-touch laptops to find them – at the least, I need to quickly be able to see which ones are touch. I feel silly even typing this because this should be retail 101 and I have no retail background (but I do have an extensive background as a customer). In Summary… Microsoft launching the Surface and selling them through their own channels isn’t slapping its OEM and retail partners in the face; its slapping them to wake the hell up and stop coasting through Windows launch events like they don’t matter. Unless I see some improvements from vendors and retailers in November, I may just hold onto my money for a Surface Pro even if I have to wait until early 2013. Your move OEM/Retailers. *Update – While my experience has been in Winnipeg, similar experiences have been voiced from colleagues in Calgary and Edmonton.

    Read the article

  • Revision Methodology for Developer Post as Entry Level

    - by Demla Pawan
    I have revised all the basic concepts from my computer science curriculum: Core Java (basics), SQL (basics), C++ (basics), XHTML, PHP (basics), and data structures (basics). I am not sure what I need to do next or how to do it, as there may be faults in my revision methods. Can anybody suggest a methodology for revising technical things that you are not in touch with at present, but that you can still write basic programs in or have used 1-2 years ago? Can you also suggest some quick revision links on the net for the various technologies mentioned above?

    Read the article

  • SQL Rank

    - by Derek Dieter
    The SQL Rank function was introduced in SQL Server 2005 and is part of a family of ranking functions. In order to explain the SQL Rank function, we need to look at it in context with the other rank functions.RANK DENSE_RANK ROW_NUMBER NTILEThis list may seem overwhelming, however most of the ranking functions are rather similar. First, the [...]
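
    For a quick feel for how the four functions differ (sample data invented here; the VALUES row constructor requires SQL Server 2008 or later), compare their output over tied scores:

    ```sql
    -- Invented sample data: four people, two tied on the top score.
    SELECT Name, Score,
           RANK()       OVER (ORDER BY Score DESC) AS [Rank],       -- 1, 1, 3, 4 (gap after the tie)
           DENSE_RANK() OVER (ORDER BY Score DESC) AS DenseRank,    -- 1, 1, 2, 3 (no gap)
           ROW_NUMBER() OVER (ORDER BY Score DESC) AS RowNumber,    -- 1, 2, 3, 4 (tie broken arbitrarily)
           NTILE(2)     OVER (ORDER BY Score DESC) AS Bucket        -- 1, 1, 2, 2 (two equal groups)
    FROM (VALUES ('Ann', 95), ('Bob', 95), ('Cat', 80), ('Dan', 70)) AS t(Name, Score);
    ```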

    Read the article

  • Oracle Tutor: Top 10 to Implement Sustainable Policies and Procedures

    - by emily.chorba(at)oracle.com
    Overview
    Your organization (executives, managers, and employees) understands the value of having written business process documents (process maps, procedures, instructions, reference documents, and form abstracts). Policies and procedures should be documented because they help to reduce the range of individual decisions and encourage management by exception: the manager only needs to give special attention to unusual problems not covered by a specific policy or procedure. As more and more procedures are written to cover recurring situations, managers will begin to make decisions which will be consistent from one functional area to the next.

    Companies should take a project management approach when implementing an environment for a sustainable documentation program and do the following:
    1. Identify an Executive Champion
    2. Put together a winning team
    3. Assign ownership
    4. Centralize publishing
    5. Establish the Document Maintenance Process Up Front
    6. Document critical activities only
    7. Document actual practice
    8. Minimize documentation
    9. Support continuous improvement
    10. Keep it simple

    1. Identify an Executive Champion
    Appoint a top-down driver. Select one key individual to be a mentor for the procedure planning team. The individual should be a senior manager, such as your company president, CIO, CFO, the vice-president of quality, manufacturing, or engineering. Written policies and procedures can be important supportive aids when known to express the thinking of the chief executive officer and/or the president and to have his or her full support.

    2. Put Together a Winning Team
    Choose a strong Project Management Leader and staff the procedure planning team with management members from cross-functional groups. Make sure team members have the responsibility - and the authority - to make things happen. The winning team should consist of the Documentation Project Manager, Document Owners (one for each functional area), a Document Controller, and Document Specialists (as needed). The Tutor Implementation Guide has complete job descriptions for these roles.

    3. Assign Ownership
    It is virtually impossible to keep process documentation simple and meaningful if employees who are far removed from the activity itself create it. It is impossible to keep documentation up-to-date when responsibility for the document is not clearly understood. Key to the Tutor methodology, therefore, is the concept of ownership. Each document has a single owner, who is responsible for ensuring that the document is necessary and that it reflects actual practice. The owner must be a person who is knowledgeable about the activity and who has the authority to build consensus among the persons who participate in the activity, as well as the authority to define or change the way an activity is performed. The owner must be an advocate of the performers and negotiate, not dictate, practices. In the Tutor environment, a document's owner is the only person with the authority to approve an update to that document.

    4. Centralize Publishing
    Although it is tempting (especially in a networked environment and with document management software solutions) to decentralize the control of all documents -- with each owner updating and distributing his own -- Tutor promotes centralized publishing by assigning the Document Administrator (gate keeper) to manage the updates and distribution of the procedures library.

    5. Establish a Document Maintenance Process Up Front (and stick to it)
    Everyone in your organization should know they are invited to suggest changes to procedures and should understand exactly what steps to take to do so. Tutor provides a set of procedures to help your company set up a healthy document control system. There are many document management products available to automate some of the document change and maintenance steps. Depending on the size of your organization, a simple document management system can reduce the effort it takes to track and distribute document changes and updates. Whether your company decides to store the written policies and procedures on a file server or in a database, the essential tasks for maintaining documents are the same, though some tasks are automated.

    6. Document Critical Activities Only
    The best way to keep your documentation simple is to reduce the number of process documents to a bare minimum and to include in those documents only as much detail as is absolutely necessary. The first step to reducing process documentation is to document only those activities that are deemed critical. Not all activities require documentation. In fact, some critical activities cannot and should not be standardized. Others may be sufficiently documented with an instruction or a checklist and may not require a procedure. A document should only be created when it enhances the performance of the employee performing the activity. If it does not help the employee, then there is no reason to maintain the document. Activities that represent little risk (such as project status), activities that cannot be defined in terms of specific tasks (such as product research), and activities that can be performed in a variety of ways (such as advertising) often do not require documentation. Sometimes, an activity will evolve to the point where documentation is necessary. For example, an activity performed by a single employee may be straightforward and uncomplicated -- that is, until the activity is performed by multiple employees. Sometimes, it is the interaction between co-workers that necessitates documentation; sometimes, it is the complexity or the diversity of the activity.

    7. Document Actual Practices
    The only reason to maintain process documentation is to enhance the performance of the employee performing the activity. And documentation can only enhance performance if it reflects reality -- that is, current best practice. Documentation that reflects an unattainable ideal or outdated practices will end up on the shelf, unused and forgotten. Documenting actual practice means (1) auditing the activity to understand how the work is really performed, (2) identifying best practices with employees who are involved in the activity, (3) building consensus so that everyone agrees on a common method, and (4) recording that consensus.

    8. Minimize Documentation
    One way to keep it simple is to document at the highest level possible. That is, include in your documents only as much detail as is absolutely necessary. When writing a document, you should ask yourself: what is the purpose of this document? That is, what problem will it solve? By focusing on this question, you can target the critical information.
    • What questions are the end users likely to have?
    • What level of detail is required?
    • Is any of this information extraneous to the document's purpose?
    Short, concise documents are user friendly and they are easier to keep up to date.

    9. Support Continuous Improvement
    Employees who perform an activity are often in the best position to identify improvements to the process. In other words, continuous improvement is a natural byproduct of the work itself -- but only if the improvements are communicated to all employees who are involved in the process, and only if there is consensus among those employees. Traditionally, process documentation has been used to dictate performance, to limit employees' actions. In the Tutor environment, process documents are used to communicate improvements identified by employees. How does this work? The Tutor methodology requires a process document to reflect actual practice, so the owner of a document must routinely audit its content -- does the document match what the employees are doing? If it doesn't, the owner has the responsibility to evaluate the process, to build consensus among the employees, to identify "best practices," and to communicate these improvements via a document update. Continuous improvement can also be an outgrowth of corrective action -- but only if the solutions to problems are communicated effectively. The goal should be to solve a problem once and only once, which means not only identifying the solution, but ensuring that the solution becomes part of the process. The Tutor system provides the method through which improvements and solutions are documented and communicated to all affected employees in a cost-effective, timely manner; it ensures that improvements are not lost or confined to a single employee.

    10. Keep It Simple
    Process documents don't have to be complex and unfriendly. In fact, the simpler the format and organization, the more likely the documents will be used. And the simpler the method of maintenance, the more likely the documents will be kept up-to-date. Keep it simple by:
    • Minimizing the skills and training required
    • Following the established Tutor document format and layout
    • Avoiding technology just for technology's sake
    No other rule has as major an impact on the success of your internal documentation as -- keep it simple.

    Learn More
    For more information about Tutor, visit Oracle.Com or the Tutor Blog. Post your questions at the Tutor Forum.

    Emily Chorba
    Principal Product Manager, Oracle Tutor & BPM

    Read the article
