Search Results

Search found 2012 results on 81 pages for 'impossible'.

Page 26/81 | < Previous Page | 22 23 24 25 26 27 28 29 30 31 32 33  | Next Page >

  • TFS 2010 Basic Concepts

    - by jehan
    Here, I’m going to discuss some key architectural changes and concepts introduced in TFS 2010 compared to TFS 2008. With TFS 2010, you first run the installation and then configure one of the available installation features. This is somewhat similar to a SharePoint installation, where you first install the product and then configure the SharePoint farms.
    1) Installation features available in TFS 2010:
    a) Basic: The most compact TFS installation possible. It installs and configures Source Control, Work Item Tracking and Build Services only (SharePoint and Reporting integration are not possible).
    b) Standard Single Server: Suitable for a single-server deployment of TFS. It installs and configures Windows SharePoint Services for you and uses the default instance of SQL Server.
    c) Advanced: Suitable if you want to use remote servers for the SQL Server databases, SharePoint Products and Technologies, and SQL Server Reporting Services.
    d) Application Tier Only: Use this if you want to configure high availability for Team Foundation Server in a network load balanced (NLB) environment, move Team Foundation Server from one server to another, or restore TFS.
    e) Upgrade: Use this if you want to upgrade from a prior version of TFS.
    Note: One more important thing to know about TFS 2010 Basic is that it can be installed on client operating systems (Windows 7 and Windows Vista SP3), whereas previous versions of TFS (2008 and 2005) could not be installed on a client OS.
    2) Team Project Collections: Connect to TFS dialog box in TFS 2008: In TFS 2008, the TFS server contains a set of team projects, each of which may or may not be independent of the others. Every check-in gets an ever-increasing changeset ID regardless of the team project it is checked into, and the same applies to work items, which also get unique work item IDs. The main problem with this approach was that certain things required by the application development process were impossible to do:
    a) If something has gone wrong in one team project and you want to restore it to an earlier state where it was working properly, you have to restore the Team Foundation Server database from the backup taken as per your maintenance plans, and because of this the other team projects may lose any work that was not backed up.
    b) Your company merged with another company and now you have two TFS servers.
    One is the TFS server you have been working on, the other is the TFS server the other company was working on, and after the merge you want to integrate the team projects from both servers into one, which is almost impossible to achieve in TFS 2008. You can manually create the team projects you want to integrate from the other server in source control, but you will lose the history of changesets, work items and other data, which is very important. There were a few more issues of this sort that were difficult to resolve in TFS 2008.
    To resolve these kinds of scenarios, which mainly relate to TFS maintenance, integration, migration and security, Microsoft came up with the Team Project Collections concept in TFS 2010. This concept is similar to SharePoint site collections, and if you are familiar with the SharePoint architecture it will help you understand the TFS 2010 architecture easily.
    Connect to TFS dialog box in TFS 2010: In the dialog box above you can see two team project collections. Each team project collection can contain any number of team projects; on the right side it shows the two team projects in the collection (Default Collection) that I have chosen.
    Note: You can connect to only one team project collection at a time using an instance of TFS Team Explorer.
    How does it work? To introduce team project collections, the TFS databases were reorganized. TFS 2008 was composed of 5-7 databases partitioned by subsystem (one each for Version Control, Work Item Tracking, Build, Integration, Project Management...). The new TFS 2010 database architecture is:
    TFS_Config: The root database. It contains centralized TFS configuration data, including the list of all team projects that exist on the TFS server.
    TFS_Warehouse: The data warehouse, which contains all the reporting data served by this server (farm).
    TFS_*: One database per team project collection. It contains all the operational data of that team project collection, regardless of subsystem. In addition to these, you will have databases for SharePoint and Report Server.
    3) TFS Farms: As TFS 2010 can be configured with multiple application tiers and multiple database tiers, it is more appropriate to call a multi-server installation of TFS a TFS farm.
    NLB support for TFS application tiers – With TFS 2010 you can configure multiple TFS application tier machines to serve the same set of team project collections. The primary purpose of NLB support is to enable cleaner and more complete high availability than in TFS 2008. Even if an application tier in the farm fails, the farm will automatically continue to work with hardly any indication to end users that there is a problem.
    SQL data tiers: With 2010 you can configure many SQL Servers. Each database can be placed on any SQL Server, because each team project collection is an independent database. This can also be used to load balance databases across SQL Servers. These new capabilities will significantly change the way enterprises manage their TFS installations in the future. With team project collections and TFS farms, you can create a single, arbitrarily large TFS installation and grow it incrementally by adding ATs and SQL Servers as needed.
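    To see what the collection concept means for client code, here is a minimal sketch (not from the original article) that uses the TFS 2010 client object model to connect to one team project collection and list its team projects; the server name and collection URL are assumptions, so substitute your own.

        using System;
        using Microsoft.TeamFoundation.Client;
        using Microsoft.TeamFoundation.VersionControl.Client;

        class ListTeamProjects
        {
            static void Main()
            {
                // Hypothetical collection URL; in TFS 2010 you connect per collection, not per server.
                var collectionUri = new Uri("http://tfsserver:8080/tfs/DefaultCollection");

                TfsTeamProjectCollection collection =
                    TfsTeamProjectCollectionFactory.GetTeamProjectCollection(collectionUri);
                collection.EnsureAuthenticated();

                // List the team projects stored in this collection's own TFS_* database.
                var versionControl = collection.GetService<VersionControlServer>();
                foreach (TeamProject project in versionControl.GetAllTeamProjects(false))
                {
                    Console.WriteLine(project.Name);
                }
            }
        }

    Connecting to a second collection means creating a second TfsTeamProjectCollection object with that collection's URL, which mirrors the "one collection at a time per Team Explorer instance" behaviour described above.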

    Read the article

  • Documenting C# Library using GhostDoc and SandCastle

    - by sreejukg
    Documentation is an essential part of any IT project, especially when you are creating reusable components that will be used by other developers (such as class libraries). Without documentation, re-using a class library is almost impossible. Just think of coding .NET applications without MSDN documentation (oops, I can’t think of it). Developers, who know the bits and pieces of their classes, usually see it as boring work to write the details again just to generate the documentation. The amount of work needed to create the documentation manually and keep it up to date as the code changes also makes the process tedious, if not impossible. So what is the effective solution? Let me divide this into two steps:
    1. Generate comments for your code while you are writing the code.
    2. Create a documentation file from these comments.
    Now I am going to examine these processes.
    Step 1: Generate XML comments automatically
    Most developers write comments for their code. The best thing is that the comments are entered during the development process. Comments also give a good reference to the code and make it more manageable and readable. Later these comments can be converted into documentation, along with your source code, by identifying properties and methods. I found an add-in for Visual Studio, GhostDoc, that automatically generates XML documentation comments for C#. The add-in is available in the Visual Studio Gallery on MSDN. You can download it from http://visualstudiogallery.msdn.microsoft.com/en-us/46A20578-F0D5-4B1E-B55D-F001A6345748. I downloaded the free version, which suits my requirements. There is also a professional version (you need to pay some $ for this) that gives you some more features. The installation process is straightforward; a couple of clicks will do the work for you. The best thing about GhostDoc is that it supports multiple versions of Visual Studio, such as 2005, 2008 and 2010.
    After installing GhostDoc, when you start Visual Studio the GhostDoc configuration dialog will appear. The first screen asks you to assign a hot key; pressing this hot key will enter the comment into your code file with the structure required by GhostDoc. Click Assign to go to the next step, where you configure the rules for generating the documentation from the code file. Click Create to start creating the rules, then click the Finish button to close the wizard. You have now performed the configuration required by GhostDoc, and in the Visual Studio Tools menu you will find a GhostDoc entry that gives you some options.
    Now let us examine how GhostDoc generates comments for a method. I have written the code below in my code-behind file.

        public Char GetChar(string str, int pos)
        {
            return str[pos];
        }

    Now I need to generate the comments for this function. Select the function and press the hot key assigned during configuration. GhostDoc will generate the comments as follows.

        /// <summary>
        /// Gets the char.
        /// </summary>
        /// <param name="str">The STR.</param>
        /// <param name="pos">The pos.</param>
        /// <returns></returns>
        public Char GetChar(string str, int pos)
        {
            return str[pos];
        }

    So this is a very handy tool that helps developers write comments easily. You can also have the XML documentation file generated while compiling the project; this is done by the C# compiler. You can enable the XML documentation option (a checkbox) under Project properties -> Build tab.
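    That checkbox simply writes a DocumentationFile property into the project file, so the same setting can be committed by hand or applied on a build server. Below is a minimal sketch of the relevant fragment; the configuration condition and output path are assumptions, so adjust them to your project.

        <!-- Fragment of the .csproj file; the configuration and file name are examples only -->
        <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
          <DocumentationFile>bin\Release\MyLibrary.xml</DocumentationFile>
        </PropertyGroup>

    If you build outside Visual Studio, the stand-alone compiler exposes the same switch as csc /doc:MyLibrary.xml.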
    Now when you compile, the XML file will be created under the bin folder.
    Step 2: Generate the documentation from the XML file
    Now you have generated the XML documentation file. Sandcastle is the tool from Microsoft that generates MSDN-style documentation from the compiler-produced XML file. The project is available on CodePlex at http://sandcastle.codeplex.com/. Download and install Sandcastle on your computer. Sandcastle is a command line tool that doesn’t have a rich GUI; if you want to automate documentation generation, you will definitely be using the command line tools. Since I want to generate the documentation from the XML file produced in the previous step, I was expecting a GUI where I could see the options. There is a GUI available for Sandcastle called Sandcastle Help File Builder; see the project on CodePlex at http://www.codeplex.com/wikipage?ProjectName=SHFB. You need to install Sandcastle first and then the Sandcastle Help File Builder. From here I assume that you have installed both successfully.
    Once you have installed the Help File Builder, it will be available in your All Programs list. Clicking Sandcastle Help File Builder GUI will launch the application. First you need to create a project: click File -> New Project, and the New Project dialog will appear. Choose a folder to store your project file, give your documentation project a name and click the Save button. Now you will see your project properties. From the Project Explorer, right-click Documentation Sources and click the Add Documentation Source link. A documentation source is a file, such as an assembly or a Visual Studio solution or project, from which information will be extracted to produce API documentation. In the Add Documentation Source dialog, I selected the XML file generated by my project. Once you add the XML file to the project, you will see that the DLL file is automatically added by the Help File Builder. Now click the Build button and the application will generate the help file. The Build window shows the result of each step, and once the process completes successfully you will see the corresponding output in the build window. Now navigate to your help project (I selected the folder My Documents\Documentation); inside the Help folder you will find the .chm file. Opening the .chm file will give you MSDN-like documentation.
    Documentation is an important part of the development life cycle. Sandcastle together with GhostDoc makes this process easier, so that developers can implement documentation in their projects with simple-to-use steps.

    Read the article

  • Fraud Detection with the SQL Server Suite Part 2

    - by Dejan Sarka
    This is the second part of the fraud detection whitepaper. You can find the first part in my previous blog post about this topic.
    My Approach to Data Mining Projects
    It is impossible to evaluate the time and money needed for a complete fraud detection infrastructure in advance. Personally, I do not know the customer’s data in advance. I don’t know whether there is already an existing infrastructure, like a data warehouse, in place, or whether we would need to build one from scratch. Therefore, I always suggest starting with a proof-of-concept (POC) project. A POC takes something between 5 and 10 working days, and involves personnel from the customer’s site – either employees or outsourced consultants. The team should include a subject matter expert (SME) and at least one information technology (IT) expert. The SME must be familiar with both the domain in question and the meaning of the data at hand, while the IT expert should be familiar with the structure of the data, how to access it, and have some programming (preferably Transact-SQL) knowledge. With more than one IT expert, the most time-consuming work, namely data preparation and overview, can be completed sooner. I assume that the relevant data is already extracted and available at the very beginning of the POC project.
    If a customer wants to have their people involved in the project directly and requests the transfer of knowledge, the project begins with training. I strongly advise this approach, as it establishes a common background for all people involved: an understanding of how the algorithms work, an understanding of how the results should be interpreted, a way of becoming familiar with the SQL Server suite, and more.
    Once the data has been extracted, the customer’s SME (i.e. the analyst) and the IT expert assigned to the project learn how to prepare the data in an efficient manner. Working together, our combined knowledge and expertise allow us to focus immediately on the most interesting attributes and to identify any additional, calculated ones soon after. By employing our programming knowledge, we can, for example, prepare tens of derived variables, detect outliers, identify the relationships between pairs of input variables, and more, in only two or three days, depending on the quantity and quality of the input data.
    I favor the customer’s decision to assign additional personnel to the project. For example, I actually prefer to work with two teams simultaneously. I demonstrate and explain the subject matter by applying techniques directly to the data managed by each team, and then both teams continue to work on the data overview and data preparation under our supervision. I explain to the teams what kind of results we expect, why they are needed, and how to achieve them. Afterwards we review and explain the results, and continue with new instructions, until we resolve all known problems.
    Simultaneously with the data preparation, the data overview is performed. The logic behind this task is the same – again I show the teams the expected results, how to achieve them and what they mean. This is also done in multiple cycles, as is the case with data preparation, because, quite frankly, both tasks are completely interleaved. A specific objective of the data overview is of principal importance: it is represented by a simple star schema and a simple OLAP cube that will first of all simplify data discovery and interpretation of the results, and will also prove useful in the following tasks.
    The presence of the customer’s SME is the key to resolving possible issues with the actual meaning of the data. We can always replace the IT part of the team with another database developer; however, we cannot conduct this kind of project without the customer’s SME.
    After the data preparation, and when the data overview is available, we begin the scientific part of the project. I assist the team in developing a variety of models and in interpreting the results. The results are presented graphically, in an intuitive way. While it is possible to interpret the results on the fly, a much more appropriate alternative is available if the initial training was also performed, because it allows the customer’s personnel to interpret the results by themselves, with only some guidance from me. The models are evaluated immediately by using several different techniques. One of the techniques includes evaluation over time, where we use an OLAP cube.
    After evaluating the models, we select the most appropriate model to be deployed for a production test; this allows the team to understand the deployment process. There are many possibilities for deploying data mining models into production; at the POC stage, we select the one that can be completed quickly. Typically, this means that we add the mining model as an additional dimension to an existing DW or OLAP cube, or to the OLAP cube developed during the data overview phase. Finally, we spend some time presenting the results of the POC project to the stakeholders and managers.
    Even from a POC, the customer will receive lots of benefits, all at the sole risk of spending money and time on a single 5 to 10 day project:
    • The customer learns the basic patterns of fraud and fraud detection
    • The customer learns how to do the entire cycle with their own people, only relying on me for the most complex problems
    • The customer’s analysts learn how to perform much more in-depth analyses than they ever thought possible
    • The customer’s IT experts learn how to perform data extraction and preparation much more efficiently than they did before
    • All of the attendees of this training learn how to use their own creativity to implement further improvements of the process and procedures, even after the solution has been deployed to production
    • The POC output for a smaller company or for a subsidiary of a larger company can actually be considered a finished, production-ready solution
    • It is possible to utilize the results of the POC project at subsidiary level, as a finished POC project for the entire enterprise
    Typically, the project also results in several important “side effects”:
    • Improved data quality
    • Improved employee job satisfaction, as they are able to proactively contribute to the central knowledge about fraud patterns in the organization
    • Because eventually more minds get involved in the enterprise, the company should expect more and better fraud detection patterns
    After the POC project is completed as described above, the actual project would not need months of engagement from my side. This is possible due to our preference to transfer the knowledge to the customer’s employees: typically, the customer will use the results of the POC project for some time, and only engage me again to complete the project, or to ask for additional expertise if the complexity of the problem increases significantly.
    I usually expect to perform the following tasks:
    • Establish the final infrastructure to measure the efficiency of the deployed models
    • Deploy the models in additional scenarios:
      - Through reports
      - By including Data Mining Extensions (DMX) queries in OLTP applications to support real-time early warnings
    • Include data mining models as dimensions in OLAP cubes, if this was not done already during the POC project
    • Create smart ETL applications that divert suspicious data for immediate or later inspection
    • I would also offer to investigate how the outcome could be transferred automatically to the central system; for instance, if the POC project was performed in a subsidiary whereas a central system is available as well
    • Of course, for the actual project, I would repeat the data and model preparation as needed
    It is virtually impossible to tell in advance how much time the deployment would take, before we decide together with the customer what exactly the deployment process should cover. Without considering the deployment part, and with the POC project conducted as suggested above (including the transfer of knowledge), the actual project should still only take an additional 5 to 10 days. The approximate timeline for the POC project is as follows:
    • 1-2 days of training
    • 2-3 days for data preparation and data overview
    • 2 days for creating and evaluating the models
    • 1 day for initial preparation of the continuous learning infrastructure
    • 1 day for presentation of the results and discussion of further actions
    Quite frequently I receive the following question: are we going to find the best possible model during the POC project, or during the actual project? My answer is always quite simple: I do not know. Maybe, if we spent just one hour more on data preparation, or created just one more model, we could get better patterns and predictions. However, we simply must stop somewhere, and the best possible way to do this, according to my experience, is to restrict the time spent on the project in advance, after an agreement with the customer. You must also never forget that, because we build the complete learning infrastructure and transfer the knowledge, the customer will be capable of doing further investigations independently and improving the models and predictions over time, without the need for my constant engagement.
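    As an illustration of the real-time early-warning idea mentioned above, here is a minimal sketch of an OLTP application issuing a singleton DMX prediction query through ADOMD.NET; the Analysis Services instance, mining model and column names are hypothetical placeholders, not part of the whitepaper.

        using System;
        using Microsoft.AnalysisServices.AdomdClient;

        class FraudEarlyWarning
        {
            static void Main()
            {
                // Hypothetical Analysis Services instance and database hosting the mining model.
                using (var connection = new AdomdConnection(
                    "Data Source=localhost;Catalog=FraudDetection"))
                {
                    connection.Open();

                    // Singleton DMX prediction query: score one incoming transaction on the fly.
                    // [Fraud Model], [Is Fraud], [Amount] and [Channel] are placeholder names.
                    const string dmx = @"
                        SELECT
                            Predict([Fraud Model].[Is Fraud]) AS PredictedFraud,
                            PredictProbability([Fraud Model].[Is Fraud], 1) AS FraudProbability
                        FROM [Fraud Model]
                        NATURAL PREDICTION JOIN
                        (SELECT 2500 AS [Amount], 'Web' AS [Channel]) AS t";

                    using (var command = new AdomdCommand(dmx, connection))
                    using (AdomdDataReader reader = command.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            Console.WriteLine("Predicted fraud flag: {0}, probability: {1}",
                                reader.GetValue(0), reader.GetValue(1));
                        }
                    }
                }
            }
        }

    In a real deployment the literal values in the PREDICTION JOIN would of course come from the transaction being processed, and the application would raise an alert or divert the record when the returned probability exceeds an agreed threshold.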

    Read the article

  • Oracle Tutor: Top 10 to Implement Sustainable Policies and Procedures

    - by emily.chorba(at)oracle.com
    Overview
    Your organization (executives, managers, and employees) understands the value of having written business process documents (process maps, procedures, instructions, reference documents, and form abstracts). Policies and procedures should be documented because they help to reduce the range of individual decisions and encourage management by exception: the manager only needs to give special attention to unusual problems not covered by a specific policy or procedure. As more and more procedures are written to cover recurring situations, managers will begin to make decisions which are consistent from one functional area to the next.
    Companies should take a project management approach when implementing an environment for a sustainable documentation program and do the following:
    1. Identify an Executive Champion
    2. Put together a winning team
    3. Assign ownership
    4. Centralize publishing
    5. Establish the document maintenance process up front
    6. Document critical activities only
    7. Document actual practice
    8. Minimize documentation
    9. Support continuous improvement
    10. Keep it simple
    1. Identify an Executive Champion
    Appoint a top-down driver. Select one key individual to be a mentor for the procedure planning team. The individual should be a senior manager, such as your company president, CIO, CFO, or the vice-president of quality, manufacturing, or engineering. Written policies and procedures can be important supportive aids when they are known to express the thinking of the chief executive officer and/or the president and to have his or her full support.
    2. Put Together a Winning Team
    Choose a strong project management leader and staff the procedure planning team with management members from cross-functional groups. Make sure team members have the responsibility - and the authority - to make things happen. The winning team should consist of the Documentation Project Manager, Document Owners (one for each functional area), a Document Controller, and Document Specialists (as needed). The Tutor Implementation Guide has complete job descriptions for these roles.
    3. Assign Ownership
    It is virtually impossible to keep process documentation simple and meaningful if it is created by employees who are far removed from the activity itself. It is impossible to keep documentation up to date when responsibility for the document is not clearly understood. Key to the Tutor methodology, therefore, is the concept of ownership. Each document has a single owner, who is responsible for ensuring that the document is necessary and that it reflects actual practice. The owner must be a person who is knowledgeable about the activity and who has the authority to build consensus among the persons who participate in the activity, as well as the authority to define or change the way an activity is performed. The owner must be an advocate of the performers and negotiate, not dictate, practices. In the Tutor environment, a document's owner is the only person with the authority to approve an update to that document.
    4. Centralize Publishing
    Although it is tempting (especially in a networked environment and with document management software solutions) to decentralize the control of all documents -- with each owner updating and distributing his own -- Tutor promotes centralized publishing by assigning the Document Administrator (gate keeper) to manage the updates and distribution of the procedures library.
    5. Establish a Document Maintenance Process Up Front (and stick to it)
    Everyone in your organization should know they are invited to suggest changes to procedures and should understand exactly what steps to take to do so. Tutor provides a set of procedures to help your company set up a healthy document control system. There are many document management products available to automate some of the document change and maintenance steps. Depending on the size of your organization, a simple document management system can reduce the effort it takes to track and distribute document changes and updates. Whether your company decides to store the written policies and procedures on a file server or in a database, the essential tasks for maintaining documents are the same, though some tasks are automated.
    6. Document Critical Activities Only
    The best way to keep your documentation simple is to reduce the number of process documents to a bare minimum and to include in those documents only as much detail as is absolutely necessary. The first step to reducing process documentation is to document only those activities that are deemed critical. Not all activities require documentation. In fact, some critical activities cannot and should not be standardized. Others may be sufficiently documented with an instruction or a checklist and may not require a procedure. A document should only be created when it enhances the performance of the employee performing the activity. If it does not help the employee, then there is no reason to maintain the document. Activities that represent little risk (such as project status), activities that cannot be defined in terms of specific tasks (such as product research), and activities that can be performed in a variety of ways (such as advertising) often do not require documentation. Sometimes, an activity will evolve to the point where documentation is necessary. For example, an activity performed by a single employee may be straightforward and uncomplicated -- that is, until the activity is performed by multiple employees. Sometimes it is the interaction between co-workers that necessitates documentation; sometimes it is the complexity or the diversity of the activity.
    7. Document Actual Practices
    The only reason to maintain process documentation is to enhance the performance of the employee performing the activity. And documentation can only enhance performance if it reflects reality -- that is, current best practice. Documentation that reflects an unattainable ideal or outdated practices will end up on the shelf, unused and forgotten. Documenting actual practice means (1) auditing the activity to understand how the work is really performed, (2) identifying best practices with employees who are involved in the activity, (3) building consensus so that everyone agrees on a common method, and (4) recording that consensus.
    8. Minimize Documentation
    One way to keep it simple is to document at the highest level possible. That is, include in your documents only as much detail as is absolutely necessary. When writing a document, you should ask yourself: What is the purpose of this document? That is, what problem will it solve? By focusing on this question, you can target the critical information.
    • What questions are the end users likely to have?
    • What level of detail is required?
    • Is any of this information extraneous to the document's purpose?
    Short, concise documents are user friendly and easier to keep up to date.
    9. Support Continuous Improvement
    Employees who perform an activity are often in the best position to identify improvements to the process. In other words, continuous improvement is a natural byproduct of the work itself -- but only if the improvements are communicated to all employees who are involved in the process, and only if there is consensus among those employees. Traditionally, process documentation has been used to dictate performance, to limit employees' actions. In the Tutor environment, process documents are used to communicate improvements identified by employees. How does this work? The Tutor methodology requires a process document to reflect actual practice, so the owner of a document must routinely audit its content -- does the document match what the employees are doing? If it doesn't, the owner has the responsibility to evaluate the process, to build consensus among the employees, to identify "best practices," and to communicate these improvements via a document update. Continuous improvement can also be an outgrowth of corrective action -- but only if the solutions to problems are communicated effectively. The goal should be to solve a problem once and only once, which means not only identifying the solution, but ensuring that the solution becomes part of the process. The Tutor system provides the method through which improvements and solutions are documented and communicated to all affected employees in a cost-effective, timely manner; it ensures that improvements are not lost or confined to a single employee.
    10. Keep it Simple
    Process documents don't have to be complex and unfriendly. In fact, the simpler the format and organization, the more likely the documents will be used. And the simpler the method of maintenance, the more likely the documents will be kept up to date. Keep it simple by:
    • Minimizing the skills and training required
    • Following the established Tutor document format and layout
    • Avoiding technology just for technology's sake
    No other rule has as major an impact on the success of your internal documentation as this one -- keep it simple.
    Learn More
    For more information about Tutor, visit Oracle.com or the Tutor Blog. Post your questions at the Tutor Forum.
    Emily Chorba
    Principal Product Manager, Oracle Tutor & BPM

    Read the article

  • At times, you need to hire a professional.

    - by Phil Factor
    After months of increasingly demanding toil, the development team I belonged to was told that the project was to be canned and the whole team would be fired. I’d been brought into the team as an expert in the data implications of a business re-engineering of a major financial institution. Nowadays, you’d call me a data architect, I suppose. I’d spent a happy year being paid consultancy fees solving a succession of interesting problems until the point when the company lost its nerve and closed the entire initiative. The IT industry was in one of its characteristic mood-swings downwards.
    After the announcement, we met in the canteen. A few developers had already scented the smell of death around the project and had been applying unsuccessfully for jobs. There was a sense of doom in the mass of dishevelled and bleary-eyed developers. After giving vent to anger and despair, talk turned to getting new employment. It was then that I perked up.
    I’m not an obvious choice to give advice on getting, or passing, IT interviews. I reckon I’ve failed most of the job interviews I’ve ever attended. I once even failed an interview for a job I’d already been doing perfectly well for a year. The jobs I’ve got have mostly been from personal recommendation. Paradoxically though, from years as a manager trying to recruit good staff, I know a lot about what IT managers are looking for. I gave an impassioned speech outlining the important factors in getting to an interview.
    The most important thing, certainly in my time at work, is the quality of the résumé or CV. I can’t even guess the huge number of CVs (résumés) I’ve read through, scanning for candidates worth interviewing. Many IT developers find it impossible to describe their career succinctly on two sides of paper. They leave chunks of their life out (were they in prison?), get immersed in detail, put in irrelevancies, describe what was going on at work rather than what they themselves did, exaggerate their importance, criticize their previous employers, aren’t aware of the aspects of a role that matter to a potential employer, suffer from shyness and modesty, and lack any sort of organized perspective of their work. There are many ways of failing to write a decent CV. Many developers suffer from the delusion that their worth can be recognized purely from the code that they write, and shy away from anything that seems like self-aggrandizement.
    No. A résumé must make a good impression, which means presenting the facts about yourself in a clear and positive way. You can’t do it yourself. Why not have your résumé professionally written? A good professional CV writer will know the qualities being looked for in a CV and interrogate you to winkle them out. Their job is to make order and sense out of a confused career, to summarize in one page a mass of detail that presents to any recruiter the information that’s wanted; to stand back and describe an accurate summary of your skills and work experience dispassionately, without rancor, pity or modesty. You are no more capable of producing an objective documentation of your career than you are of taking your own appendix out.
    My next recommendation was more controversial. This is to have a professional image overhaul, or makeover, followed by a professionally-taken photo portrait. I discovered this by accident. It is normal for IT professionals to face impossible deadlines and long working hours by looking more and more like something that had recently blocked a sink.
    Whilst working in IT, and in a state of personal dishevelment, I’d been offered the role in a high-powered amateur production of an old ex-Broadway show, purely for my singing voice. I was supposed to be the presentable star. When the production team saw me, the air was thick with tension and despair. I was dragged kicking and protesting through a succession of desperate grooming, scrubbing, dressing and dieting. I emerged feeling like “That jewelled mass of millinery, That oiled and curled Assyrian bull, Smelling of musk and of insolence.” (Tennyson, Maud; A Monodrama (1855), section VI, stanza 6). I was then photographed by a professional stage photographer. When the photographs were delivered, I was amazed. It wasn’t me, but it looked somehow respectable, confident, trustworthy. A while later, when the show had ended, I took the photos and used them for work. They went with the CV to job applications. It did the trick better than I could ever have imagined.
    My views went down big with the developers. Old rivalries were put immediately to one side. We voted, with a show of hands, to devote our energies for the entire notice period to getting employable. We had a team sourcing the CV writer, a team organising the makeovers and photographer, and a third team arranging mock interviews. A fourth team determined the best websites and agencies for recruitment, with the help of friends in the trade. Because there were around thirty developers, we were in a good negotiating position.
    Of the three CV writers we found who lived locally, one proved exceptional. She was an ex-journalist with an eye for detail and years of experience in manipulating language. We tried her skills out on a developer who seemed a hopeless case, and he was called to interview within a week. I was surprised, too, how many companies were experts at image makeovers. Within the month, we all looked like those weird slick people in the ‘Office-tagged’ stock photographs who stare keenly and interestedly at PowerPoint slides in sleek chromium-plated high-rise offices. The portraits we used still adorn the entries of many of my ex-colleagues on LinkedIn. After a month’s worth of mock interviews and technical Q&A, our stutters, hesitations, evasions and periphrastic circumlocutions were all gone.
    There is little more to relate. With the résumés or CVs, mugshots, and schooling in how to pass interviews, we’d all got new and better-paid jobs well before our month’s notice had ended. Whilst normally an IT team under the axe is a sad and depressed place to belong to, this wonderful group of people had proved the power of organized group action in turning the experience to advantage. It left us feeling slightly guilty that we were somehow cheating, but I guess we were merely levelling the playing field.

    Read the article

  • Can't install Windows 7 on Acer Aspire M1100

    - by r0ca
    When I install Windows 7, everything goes smoothly, but as soon as it's done and Windows needs to reboot for the last time before reaching the desktop, the computer gets stuck at "Verify DMI Pool Data............." and then nothing. I changed the CMOS battery and tried many different BIOS setups, even loading the default settings... Nothing worked. The HDD light is not flickering anymore, there is no HDD activity, and Ctrl-Alt-Del doesn't work. It's just impossible to load Windows 7. I tried Windows XP and it works fine. I also tried the Acer (Futureshop) recovery CD and I get a hexadecimal error message stating the install cannot continue. Is there a BIOS flash app somewhere, or a fix I can apply, to get Windows 7 Ultimate installed on my computer? Any takers?

    Read the article

  • Using Synology NAS attached to WDS-Repeater

    - by Kai B
    I'm using the following devices for my home network:
    Router 1: Speedport W 723 V (192.168.2.1)
    Router 2: AVM Fritzbox 3270 (192.168.2.2)
    NAS: Synology DS 207+ (192.168.2.3)
    I successfully set up a WDS connection between the two routers. The Speedport acts as the base station and the Fritzbox repeats the Wi-Fi signal of the Speedport. Everything is fine so far. Now I'm trying to achieve the following: Client → Speedport (base) → Fritzbox (repeater) → Synology NAS. I want to use my Synology NAS attached to the Fritzbox (which is in repeating mode). I already gave it a static IP (as listed above), but all connection attempts failed. Did I miss something, or is this setup simply impossible?

    Read the article

  • Has anyone else had problems with the miserable kb980408 update?

    - by Dan
    I hadn't experienced any problems before installing KB980408 for Windows 7 x86. The update made it impossible for me to log into my computer: after displaying a "Setting up personalized settings for Windows Desktop Update" message, and after killing that task, a "Windows Explorer is not responding - Windows is checking for a solution to the problem" message appears, then it waits forever... If I killed that task, I was not able to continue the boot process, and if I ran explorer.exe manually, the same thing happened. The solution, by the way, was to boot into safe mode and use Add/Remove Programs in the Control Panel to uninstall this wretched piece of code. Startup Repair and/or System Restore will not help you; you need to uninstall.

    Read the article

  • osx split external hard drive partition

    - by Bart
    Hi, I currently have a 640GB external HD with one partition formatted as HFS+. Now I want to split some of the free space off into a new FAT32 partition, without having to reformat the whole HD and lose all my data. I read that I'm supposed to be able to add new partitions in Disk Utility by clicking the "+" sign, without any loss of data. But in my case the "+" is not clickable and it says that this partition cannot be altered. Can anyone tell me how to proceed? Or is it impossible without reformatting the whole disk? Thanks. PS: I'm running OS X Snow Leopard 10.6.6.

    Read the article

  • Add custom command line to extended context menu in Windows 7

    - by 280Z28
    I have an application pinned to the taskbar. 90% of the time I run it with no additional command line options, so I can either click it (if not already open), or right-click the icon and click the application name to open a new instance. I want to set it up so that when I right-click it, there are two options listed: the first is the program with no command line options and the second is the program with a custom command line (that I hard-code). If this is impossible, it would be tolerable to add it to the extended context menu (Shift + right-click the icon), but I prefer the former.

    Read the article

  • Redundancy and Automated failover using Forefront TMG 2010 Standard between Production-DR site ?

    - by Albert Widjaja
    Hi, I'm using MS TMG 2010 Standard as my single firewall to publish my Exchange Server and IIS website to the internet. However, it is just one VM in the DMZ network with just one network card (vNIC). What sort of redundancy method is suitable for making this firewall VM redundant / automatically fail over to my DR site? In the event of disaster recovery it is very important that email on the various mobile devices keeps working, which is impossible if this TMG 2010 VM is offline. Should I use:
    1. Multicast NLB
    2. Any other clustering
    3. VMware HA / FT (one VM in production, the other VM in the DR site on a different subnet?)
    Any suggestions and ideas will be appreciated. Thanks

    Read the article

  • Videoediting slow on windows with HD footage in .MOV format

    - by Joakim
    My camera is a Canon 5D Mark II that serves me .mov files in HD, but these are a pain in the neck to edit on my PC. I have tried Premiere, Vegas, Pinnacle, etc., but it is almost impossible to edit them and produce a movie. I do have a good computer with Windows 7, but that doesn't help me, and I don't want to buy a new monster PC just for this task. Question: I could convert the files, but what would be the best format? I don't want to lose much quality. Anyone have any ideas?

    Read the article

  • How safe is it to run CHKDSK on an SSD?

    - by Eilon
    I recently saw Windows 7 pop up a warning or two that I should run chkdsk on my laptop. My laptop came with an SSD and I'm not sure if there are any negative implications to running chkdsk on such a drive. Are there any potential issues with reporting "bad sectors" on the drive? I would imagine that the physical concept of sectors is completely different between a platter and a microchip. I don't think my SSD supports TRIM. It's about 14 months old and a quick web search seems to hint that it doesn't (though it's nearly impossible to find out this info for sure!). I'm also not sure if TRIM is even relevant here since there shouldn't be much in the way of deletes. So, how safe is it to run chkdsk on my SSD drive? The model of SSD that I have is reported as "Samsung SSD PB22-JS3 2.5".

    Read the article

  • Adobe Reader unusably slow on Mac OS X 10.6.6

    - by vwegert
    We've got two Macs that are both running 10.6.6. On my MBP, Adobe (Acrobat) Reader started behaving weird a few weeks ago. It became very sluggish, started missing mouse clicks or mouse button releases, scrolling was next to impossible. Most of the time it does not handle Page Up / Page Down events at all. Zooming works erratically if at all. It's basically unusable. On the other Mac (an iMac), there are no such problems. I've tried to remove and reinstall Adobe Reader as well as upgrading to the latest version, but unfortunately without success. This is the only software that is behaving strangely on this Mac, everything else is working fine. What else could I try?

    Read the article

  • How to fix an SSD

    - by anonymous
    I have a Samsung 128 GB SSD: MZ5PA128HMCD-01000. When I'm using it, there are always I/O errors. I tried to secure-erase it, but it is impossible to create a new NTFS (or any other filesystem) partition because I still get I/O errors. I thought maybe upgrading the firmware would solve the problem. Unfortunately, I'm not able to upgrade the firmware with the Samsung SSD Magician tool, because Magician says there is no Samsung SSD on my computer. Is there a way to make Magician recognize this Samsung MZ5PA128HMCD-01000 SSD? Is another tool available to flash firmware onto an SSD? What should I do to fix this SSD?

    Read the article

  • Laptop LCD Screen Flickering

    - by BSchlinker
    I recognize this is most likely not a software error and probably lies in one of the hardware components. About 3 months ago I replaced the LCD screen on this laptop, and the company I bought it through provided a 12-month warranty. Before I contact them, I would like to verify that the problem is most likely the LCD screen and not the inverter. Any comments? I recognize it's impossible to diagnose hardware remotely, but it would be nice to know if there is a high probability it could be a component other than the LCD screen.

    Read the article

  • Iomega Home Media Network Hard Drive: Accesing the data in the disk?

    - by JJarava
    Hi all! I have an Iomega Home Media Network Hard Drive, 1TB, and lately I can't access the data on the drive. The shares (both built-in and created by me) are there, and the security works, but when trying to access the data I get a "The network path was not found" message, which is worrying, to say the least. I'd like to know if there is a way to get the data off the disks somehow, as some of the data on the drive (i.e., pictures and videos of my 1.5-year-old son) is hard if not impossible to find otherwise. Thanks a lot, Javier

    Read the article

  • v4l - capture and watch at the same time

    - by John Barrett
    Capturing v4l and line-in audio using mencoder works very well, but I would like to record real-time gameplay video from consoles plugged into the video card. I've used xawtv for this (works quite well, can preview and record in real time), but when I enable any deinterlacing or aspect ratio options the video fails to record; I have to record raw and re-encode the video with the appropriate filters later to get something workable. Other things I have tried:
    • tvtime with xvidcap and JACK audio capture - xvidcap drops frames and muxing the audio is impossible as it goes out of sync (I have not found muxer options that work to force a correct frame rate)
    • mencoder capture to file, then attempting to pipe the tail of the file to mplayer - mencoder works great, but piping the file is far too heavy to attempt gameplay
    Soooo, v4l capture and preview simultaneously - recommendations?

    Read the article

  • Linux accessibility: Slow Keys causing duplicate key strokes

    - by skypanther
    I'm exploring the accessibility features within GNOME and having trouble with Slow Keys. My input is always doubled. Press a key briefly and I get nothing, as you'd expect. Press just a bit longer and whichever key I'm pressing is input twice: Hello becomes HHeelllloo. I'm running Debian Lenny 5.0.6, kernel 2.6.26-2-686, GNOME Desktop 2.22.3, running within a VirtualBox session. I did some googling and didn't find others having similar troubles. Maybe it's a VirtualBox thing? Any ideas how to fix this so I don't get the duplicates? It makes it impossible to log back in when the screen lock kicks in!

    Read the article

  • Can't get Intel drivers for ASUS P6T Deluxe

    - by Alex K
    Hi everybody, A few days back my Win7 Ultimate started bugging me to install the RAID drivers for my ASUS P6T Deluxe. Fine, I'm all for having up-to-date drivers, even for stuff I don't use. The problem is that the drivers I download from the ASUS support site and the ones I get directly from Intel (I can't post the second link, since I have less than 10 rep, but it's the first hit for "Intel(R) Matrix Storage Manager Driver" on Google) both say that my system does not meet the minimum system requirements. Impossible, I would think, what with it having the X58 chipset and the Windows version being a supported one. Has anyone encountered something like this before? Any clues on how to solve this would be much appreciated. Thanks!

    Read the article

  • Can single ESXi host make use of two separete iSCSI box?

    - by user71061
    Hi! I have a problem using multiple iSCSI targets with a single ESXi host (in my case they are two FreeNAS hosts, but I suspect this problem will occur with any two iSCSI boxes of the same type/model). If I configure two FreeNAS hosts as iSCSI targets (say iSCSI A and iSCSI B), then I can use both of them with my ESXi host, but only one at a time (i.e. only iSCSI A or only iSCSI B, not both simultaneously). If I try to add the second iSCSI target to my iSCSI adapter (it has a unique IQN name, of course), then in the details pane of this adapter (it is the iSCSI software adapter) I see that the total number of paths has increased accordingly, but not the total number of devices (so I can't use it as additional storage). What should I do? Is it impossible to attach two iSCSI targets to a single adapter? I'm using the free version of ESXi 4.1. Maybe it is a limitation of the free version? Thanks in advance for any suggestions.

    Read the article

  • Skype keypad tones

    - by Don
    Hi, When I push a number on the Skype keypad (or use the number keys on the keyboard) no tone is emitted. This happens both when dialling a number and if I push a key during a call. This makes it impossible for me to use Skype with automated telephone systems that require you to use the keypad to enter data or choose between various options. I spoke to somebody who works in a call centre about this and they indicated that somebody had mentioned that it's possible to disable (DTMF) tones in Skype. I've looked through all the Skype options and can't find any way to enable/disable DTMF tones. If somebody knows how I can do this, or has another suggestion for fixing the problem, please let me know. I'm using version 4.2.0.152 of Skype. Thanks, Don

    Read the article

  • Installing Windows XP sp3 into USB 2.0 WD 320Gb Hard Drive

    - by NetKabuki
    I have an HP laptop with 3 USB 2.0 ports. I also have a clean 320Gb WD USB 2.0 drive. The HP can boot from USB (BIOS option). I used the install disk (XP SP3 bootable) and, after a few stutters, was actually able to load up a partition on the 320Gb drive with Windows. But I cannot consistently get the system to boot off the USB drive. I am able to drop Ubuntu on that same WD drive and boot up. I can even get GRUB 2 on Ubuntu to recognize the Windows OS. But booting the Windows OS is an impossible task. What can I do differently - if anything?

    Read the article

  • Everything Windows Explorer does hangs up t some time or another

    - by William Barnes
    Yes, this is a duplicate of something I have tried to ask in the past; I think I posted it wrong, since my first question was deleted. OK: I ran SCANNOW and it came back with no problems. I booted in Safe Mode and Windows Explorer worked OK, though I am not sure what that proves. I also ran Autoruns and disabled everything that was suggested, but it still dies. I read someplace that it could be a service program, but I can't find that link anymore - something about shutting off all non-Windows SPs and then adding them back until it quits. The problem is that sometimes it works, but most of the time it does not, which makes saving or retrieving anything impossible. Save As hangs, my Garmin map updater hangs, etc. I was having this problem before installing Garmin, so I am reasonably sure it is not the cause. HELP PLEASE - I don't have many brain cells left and can't afford to lose more.

    Read the article
