Search Results

Search found 27092 results on 1084 pages for 'project houston'.

Page 139/1084 | < Previous Page | 135 136 137 138 139 140 141 142 143 144 145 146  | Next Page >

  • When Your Boss Doesn't Want you to Succeed

    - by Phil Factor
    You're working hard to get an application finished, sometimes programming long into the evenings and eating sandwiches at your desk instead of taking a lunch break. Then one day you glance up at the IT manager, serene in his mysterious round of meetings, and think 'Does he actually care whether this project succeeds or not?'. The question may seem absurd. Of course the project must succeed. The truth, as always, is far more complex. Your manager may even be doing his best to make sure you don't succeed. Why? There have always been rich pickings for the unscrupulous in IT. In extreme cases, where administrators struggle with scarcely-comprehended technical issues, huge sums of money can be lost and gained without any perceptible results. In very few cases can fraud be proven: most of the time, the intricacies of the 'game' are such that one can do little more than harbor suspicion. Where does over-enthusiastic salesmanship end and fraud begin? The business of Information Technology provides rich opportunities for white-collar crime. The poor developer has his, or her, hands full with the task of wrestling with the sheer complexity of building an application. He, or she, has no time to follow the chicanery of the management that is directing affairs. Most likely, the developers wouldn't even suspect that their company management had ulterior motives. I'll illustrate what I mean with an entirely fictional, hypothetical example.

    The Opportunist and the Aged

    Charities often do good, unexciting work that is funded by the income from a bequest that dates back maybe hundreds of years. In our example, it isn't exciting work, for it involves the welfare of elderly people who have fallen on hard times. Volunteers visit, giving a smile and a chat, and check that the old folk are all right; they are also able to spend a little money at their discretion to ameliorate any pressing needs. The money is made to work very hard, and the charity averts a great deal of suffering and eases the burden on the state.

    Daisy hears the garden gate creak as Mrs Rainer comes up the path. She looks forward to her twice-weekly visit from the nice lady from the trust. She always asks, 'Is everything all right, Love?' Cheeky, but nice. She likes her cheery manner. She seems interested in hearing her memories, and talking about her far-away family. She helps her with those chores in the house that she can't manage, and once even paid to fill the back-shed with coke, the other year. Nice, Mrs Rainer is, she thinks as she goes to open the door.

    The trustees are getting on in years themselves, and worry about the long-term future of the charity: is it relevant to modern society? Is it likely to attract a new generation of workers to take it on? They are instantly attracted by the arrival on the board of a smartly dressed university lecturer with the ear of the present Government. Alain 'Stalin' Jones is earnest, persuasive and energetic. The trustees welcome him to the board and quickly forgive his humorless political-correctness. He talks of 'diversity', 'relevance', 'social change', 'equality' and 'communities', but his eye is on that huge bequest. Alain first came to notice as a Trotskyite union official, who insinuated himself into one of the duller Trades Unions and turned it, through his passionate leadership, into a radical, headline-grabbing organization.
    Middle age, and the rise of European federal socialism, had brought him quiet prosperity and charcoal suits, an ear in the current government, and a wide influence as a member of various Quangos (government bodies staffed by well-paid unelected courtiers). He was employed as a 'consultant' by several organizations that relied on government contracts. After gaining the confidence of the trustees, and showing a surprising knowledge of mundane processes and the regulatory framework of charities, Alain launches his plan. The trust will expand its work by means of a bold IT initiative that will coordinate the interventions of several 'caring agencies' and provide emergency cover, a special website so anxious relatives can see how their elderly charges are doing, and a vastly more efficient way of coordinating the work of the volunteer carers. It will also provide a special-purpose site that gives 'social networking' facilities, rather like Facebook, to the few elderly folk on the lists with access to the internet. The trustees perk up. Their own experience of the internet is restricted to the occasional scanning of railway timetables, but they can see that it is 'relevant'.

    In his next report to the other trustees, Alain proudly announces that all this glamorous and exciting technology can be paid for by a grant from the government. He admits darkly that he has influence. True to his word, the government promises a grant of a size that is an order of magnitude greater than any budget the trustees had ever handled. There is the understandable proviso that the company that actually does the IT work must be one of the government's preferred suppliers, and that the work must be tendered under EU competition rules. The only company that tenders, a multinational IT company with a long track record of government work, quotes ten million pounds for the work. A trustee questions the figure, as it seems enormous for the reasonably trivial internet facilities being built, but the IT salesmen dazzle them with presentations and three-letter acronyms until they subside into quiescent acceptance. After all, they can't stay locked into twentieth-century practices, can they? The work is put in hand with a large project team, in a splendid glass building near west London. The trustees see rooms of programmers working diligently at screens, who talk with enthusiasm about the project.

    Paul, the project manager, looked through his resource schedule with growing unease. His initial excitement at being given his first major project hadn't lasted. He'd been allocated a lackluster team of developers whose skills didn't seem right, and he was allowed only a couple of contractors to make good the deficit. Strangely, the presentation he'd given to his management, where he'd saved time and resources with an off-the-shelf solution to a great deal of the development work and a sound, conservative architecture, hadn't gone down nearly as well as he'd hoped. He almost got the feeling they wanted a more radical and ambitious solution.

    The project starts slipping its dates. The costs build rapidly. Certain uncomfortable extra charges appear, such as the £600-a-day charge by the 'Business Manager' appointed to act as a point of liaison between the charity and the IT company. When he appeared, his face permanently split by a 'Mr Sincerity' smile, they'd thought he was provided at the cost of the IT company.
    Derek, the DBA, didn't have to go to the server room quite as much as he did, but it got him away from the poisonous despair of the development group. Wave after wave of events had conspired to delay the project. Why the management had imposed hideous extra bureaucracy to cover ISO 9000 and 9001:2008 accreditation, just as the project was struggling to get back on schedule, was beyond belief. Then the business manager kept coming back with endless changes in scope, sorrowfully saying that the trustees were very insistent, though hopelessly out of touch with the reality of the technical challenges.

    Suddenly, the costs mount to the point of consuming the government grant in its entirety. The project remains tantalizingly just out of reach. Alain Jones gives an emotional rallying speech at the trustees' review meeting, urging them not to lose their nerve. Sadly, the trustees dip into the accumulated capital of the trust, the seed-corn of all their revenues, in order to save the IT project. A few months later it is all over. The IT project is never delivered, even though it had seemed so incredibly close. With the trust's capital all gone, the activities it funded have to be terminated and the trust becomes just a shell. There aren't even the funds to mount a legal challenge against the IT company, even had the trust's solicitor advised such a foolish thing. Alain leaves as suddenly as he had arrived, only to pop up a few months later, bronzed and rested, at another charity. The IT workers who were permanent employees are dispersed to other projects, and the contractors leave for other contracts. Within months the entire project is but a vague memory. One or two developers remain puzzled that their managers had been so obstructive when they should have welcomed progress toward completion of the project, but they put it down to incompetence and testosterone. Few suspected that the managers were actively preventing the project from being finished.

    The relationships between the IT consultancy and the government of the day are intricate, and made more complex by Private Finance Initiatives and political patronage. The losers in this case were the taxpayers, the beneficiaries of the trust, and perhaps the soul of the original benefactor of the trust, whose bid to give his name some immortality had been scuppered by smooth-talking white-collar political apparatchiks. Even now, nobody is certain whether a crime was ever committed. The perfect heist, I guess. Where's the victim?

    "I hear that Daisy's cottage is up for sale. She's had to go into a care home. She didn't want to at all, but there is nobody to keep an eye on her since she had that minor stroke a while back. A charity used to help out, but the 'social' evidently don't have the funding for community care. Yes, her old cat was put down. There was a good clearout, and now the house is all scrubbed and cleared ready for sale. The skip was full of old photos and letters, memories. No room in her new 'home'."

    Read the article

  • How to Automate your Database Documentation

    - by Jonathan Hickford
    In my previous post, “Automating Deployments with SQL Compare command line”, I looked at how teams can automate the deployment and post-deployment validation of SQL Server databases using the command line versions of Red Gate tools. In this post I’m looking at another use for the command line tools, namely using them to generate up-to-date documentation with every database change. There are many reasons why up-to-date documentation is valuable: for example, when somebody new has to work on or administer a database for the first time, or when a new database comes into service. Having database documentation reduces the risk of making incorrect decisions when making changes. Documentation is also very useful to business intelligence analysts when writing reports, for example in SSRS. There are a couple of great examples on this site talking about why up-to-date documentation is valuable: Database Documentation – Lands of Trolls: Why and How? and Database Documentation Using SQL Doc. The short answer is that it can save you time and reduce risk when you need it most! SQL Doc is a fast, simple tool that automatically generates database documentation. It can create documents as HTML, Word or PDF files. The documentation contains information about object definitions and dependencies, along with any other information you want to associate with each object. The SQL Doc GUI, which is included in Red Gate’s SQL Developer Bundle and SQL Toolbelt, allows you to add additional notes to objects and customise which objects are shown in the docs. These settings can be saved as a .sqldoc project file. The SQL Doc command line can use this project file to automatically update the documentation every time the database is changed, ensuring that the documentation is always up to date. The simplest way to keep documentation up to date is probably to use a scheduled task to run a script every day. However, if you have a source controlled database, or are using a Continuous Integration (CI) server or a build server, it may make more sense to use that instead. If you’re using SQL Source Control or SSDT Database Projects to help version control your database, you can automatically update the documentation after each change is made to the source control repository that contains your database. To get this automation in place, you can use the functionality of a Continuous Integration (CI) server, which can trigger commands to run when a source control repository has changed. A CI server will also capture and save the documentation that is created as an artifact, so you can always find the exact documentation for a specific version of the database. This forms an always up-to-date data dictionary. If you don’t already have a CI server in place there are several you can use, such as the free open source Jenkins or the free starter editions of TeamCity. I won’t cover setting these up in this article, but there is information about using CI servers for automating database tasks on the Red Gate Database Delivery webpage. You may be interested in Red Gate’s SQL CI utility (part of the SQL Automation Pack), which is an easy way to update a database with the latest changes from source control. The PowerShell example below shows how to create the documentation from a database. That database might be your integration database or a shared development database that is always up to date with the latest changes.
    $serverName = "server\instance"
    $databaseName = "databaseName" # If you want to document multiple databases use a comma separated list
    $userName = "username"
    $password = "password"

    # Path to SQLDoc.exe
    $SQLDocPath = "C:\Program Files (x86)\Red Gate\SQL Doc 3\SQLDoc.exe"

    $arguments = @(
        "/server:$($serverName)",
        "/database:$($databaseName)",
        "/username:$($userName)",
        "/password:$($password)",
        "/filetype:html",
        "/outputfolder:.",
        # "/project:$args[0]", # If you already have a .sqldoc project file you can pass it as an argument to this script. Values in the project will be overridden with any options set on the command line
        "/name:$databaseName Report",
        "/copyrightauthor:$([Environment]::UserName)"
    )

    write-host $arguments
    & $SQLDocPath $arguments

    There are several options you can set on the command line to vary how your documentation is created. For example, you can document multiple databases or exclude certain types of objects. In the example above, we set the name of the report to match the database name, and use the current Windows user as the documentation author. For more examples of how you can customise the report from the command line, please see the SQL Doc command line documentation. If you already have a .sqldoc project file, or wish to further customise the report by including or excluding specific objects, you can use this project on the command line. Any settings you specify on the command line will override the defaults in the project. For details of what you can customise in the project, please see the SQL Doc project documentation. In the example above, the line to use a project is commented out, but you can uncomment this line and then pass a path to a .sqldoc project file as an argument to this script.

    Conclusion

    Keeping documentation about your databases up to date is very easy to set up using SQL Doc and PowerShell. By using a CI server to run this process you can trigger the documentation to be generated on every change to a source controlled database, and keep historic documentation available. If you are considering more advanced database automation, e.g. database unit testing, change script generation, deploying to large numbers of targets and backup/verification, please email me at [email protected] for further script samples or if you have any questions.
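
    A brief usage note on the scheduled-task option mentioned above: one way to run a documentation script daily is the built-in Windows schtasks utility. The task name and script path below are hypothetical placeholders, a sketch rather than anything from the original post:

        schtasks /Create /TN "UpdateSqlDocs" /TR "powershell.exe -File C:\Scripts\UpdateSqlDocs.ps1" /SC DAILY /ST 02:00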

    Read the article

  • VirtualHost and 403 Forbidden problem with Apache

    - by Joaquín L. Robles
    Hi people, I have exactly the same problem as in 403 Forbidden Error when accessing enabled virtual host, but I tried the provided solution and am still receiving 403 Forbidden. My site is called project, and its configuration file in /etc/apache2/sites-available is the following:

    <VirtualHost *:80>
        ServerAdmin webmaster@reweb
        ServerName www.online.project.com
        ErrorLog /logs/project-errors.log
        DocumentRoot /home/joarobles/Zend/workspaces/DefaultWorkspace7/online.project.com.ar/public
        <Directory /home/joarobles/Zend/workspaces/DefaultWorkspace7/online.project.com.ar>
            Order Deny,Allow
            Allow from all
            Options Indexes
        </Directory>
    </VirtualHost>

    I tried to change the owner of /home/joarobles/Zend/workspaces/DefaultWorkspace7/online.cobico.com.ar to www-data:www-data recursively, but I'm still getting this error in project-errors.log:

    [Mon Jan 31 01:26:01 2011] [crit] [client 127.0.0.1] (13)Permission denied: /home/joarobles/.htaccess pcfg_openfile: unable to check htaccess file, ensure it is readable

    Any idea? Thanks!
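
    For context, that (13)Permission denied message on /home/joarobles/.htaccess usually means the Apache worker (www-data) cannot traverse one of the parent directories, not that the file itself is unreadable. A commonly suggested check, offered here as an assumption rather than something from the original post, is to inspect the permissions along the path and open up the execute (traverse) bit where it is missing:

        namei -l /home/joarobles/Zend/workspaces/DefaultWorkspace7/online.project.com.ar/public
        chmod o+x /home/joarobles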

    Read the article

  • eclipse-cdt Problems

    - by Bary12
    I have got a weird problem. I installed Eclipse for C/C++ development using the following:

    sudo apt-get install eclipse eclipse-cdt g++

    After I opened Eclipse, I tried to make a new project, but I didn't get an option to choose between project types (C project / C++ project / Java project), just the option "project". This project did not generate a src folder, and I didn't see an option to create a src folder either. How do I fix that? I tried installing Eclipse manually with the download from their site, but the program didn't even launch.

    Read the article

  • XNA Moddable Game - Architecture Design and Reflection

    - by David K
    I've decided to embark on an XNA moddable game project of a simple rogue style. For the purposes of this question, I'm not going to be using a scripting engine, but rather allow modders to directly compile assemblies that are loaded by the game at run time. I know about the security problems this may raise. So in order to expose the moddable content, I have gone about creating a generic project in XNA called MyModel. This contains a number of interfaces that all inherit from IPlugin, such as IGameSystem, IRenderingSystem, IHud, IInputSystem, etc. Then I've created another project called MyRogueModel. This references the MyModel project, and holds interfaces such as IMonster, IPlayer, IDungeonGenerator, IInventorySystem: more rogue-specific interfaces, but again, all interfaces in this project inherit from IPlugin. Then finally, I've created another project called MyRogueGame, which references both the MyModel and MyRogueModel projects. This project will be the game that you run and play. Here I have put the actual implementations of the Monster, DungeonGenerator, InputSystem and RenderingSystem classes. This project will also scan the mods directory during run time and load any IPlugins it finds using reflection, overriding anything it finds from the default. For example, if it finds a new implementation of the DungeonGenerator it will use that one instead. Now my question is: in order to get this far, I have effectively 2 projects that contain nothing but interfaces... which seems a little... strange? For people to create mods for the game, I would give them both the MyModel and MyRogueModel assemblies, which they would reference. I'm not sure whether this is the right way to do it, but my reasoning goes as follows: If I write 1 input system, I can use it in any game I write. If I create 3 rogue-like games, and a modder writes 1 rendering system, that modder could use the rendering system for all 3 games, because it all comes from the MyModel project. I come from a more web-based C# role, so having empty interface projects doesn't seem wrong, it's just something I haven't done before. Before I embark on something that might be crazy, I'd just like to know whether this is a foolish idea and whether there's a better (or established) design principle I should be following?
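
    As an illustration of the run-time scanning step described above, here is a minimal C# sketch. IPlugin is the interface from the question; ModLoader, the folder layout, and the absence of error handling are assumptions made for the example, not part of the original post:

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Linq;
        using System.Reflection;

        public interface IPlugin { }

        public static class ModLoader
        {
            // Scan a mods directory for assemblies, then instantiate every
            // concrete type implementing IPlugin so the game can let these
            // override its default implementations.
            public static List<IPlugin> LoadPlugins(string modsDirectory)
            {
                var plugins = new List<IPlugin>();
                foreach (string file in Directory.GetFiles(modsDirectory, "*.dll"))
                {
                    Assembly assembly = Assembly.LoadFrom(file);
                    IEnumerable<Type> pluginTypes = assembly.GetTypes()
                        .Where(t => typeof(IPlugin).IsAssignableFrom(t)
                                    && !t.IsInterface && !t.IsAbstract);
                    foreach (Type type in pluginTypes)
                    {
                        plugins.Add((IPlugin)Activator.CreateInstance(type));
                    }
                }
                return plugins;
            }
        }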

    Read the article

  • Software Architecture and MEF composition location

    - by Leonardo
    Introduction

    My software (a bunch of web APIs) consists of 4 projects: Core, FrontWebApi, Library and Administration. Library is a code library project that consists of only interfaces and enumerations. All my classes in other projects inherit from at least one interface, and this interface is in the Library. Generally speaking, my interfaces define either Entities, Repositories or Controllers. This project references no other project or any special dlls... just the regular .Net stuff... Core is a class-library project holding the concrete implementations of Entities and Repositories. In some cases I have more than 1 implementation for a Repository (e.g. one for Azure Table Storage and one for regular SQL). This project handles the intelligence (business rules mostly) and persistence, and it references only the Library. FrontWebApi is an ASP.NET MVC 4 WebApi project that implements the controller interfaces to handle web requests (from a mobile native app)... It references the Core and the Library. Administration is a code-library project that represents an "optional module", meaning: if it is present, it provides extra features (such as Access Control Lists) to the application, but if it's not, no problem. Administration also references only the Library, implementing concrete classes for a few interfaces such as "IAccessControlEntry"... I intend to make this available with a "setup" that will create any required database table or anything like that. But it is important to notice that the Core has no reference to this project...

    Development

    Now, in order to have decoupled code I decided to use IoC, and because this is a small project, I decided to do it using MEF, especially because of its advertised "composition" capabilities. I arranged all the imports/exports and constructors and everything, but something is not quite perfect in my "mental visualisation":

    Main Question

    Where should I "compose" the objects? I mean: technically, the only place where access to the real implementations is required is in the Repositories, because entity instances are necessary in order to retrieve data from wherever it lives. The repositories could also provide a public "GetCleanInstanceOf()", right? Then all other places would be just fine working with the interfaces instead of concrete classes...

    Secondary Question

    Should "Administration" implement the concrete object for "IAccessControlGeneralRepository", or should the Core?
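
    To make the main question concrete, here is a minimal sketch of one common answer: compose exactly once, at the application entry point (FrontWebApi's startup in this architecture). The MEF API calls are real; IRepository and CoreMarkerType are hypothetical names standing in for the question's Library interfaces and for any type living in the Core assembly:

        using System.Collections.Generic;
        using System.ComponentModel.Composition;
        using System.ComponentModel.Composition.Hosting;

        // Hypothetical stand-ins so the sketch compiles on its own.
        // Real implementations in Core would carry [Export(typeof(IRepository))].
        public interface IRepository { }
        public class CoreMarkerType { }

        public class CompositionRoot
        {
            // Concrete repositories exported by Core (and, if deployed, the
            // optional Administration module) are imported here; everything
            // downstream sees only the Library interfaces.
            [ImportMany]
            public IEnumerable<IRepository> Repositories { get; set; }

            public void Compose()
            {
                // Core is always referenced; DirectoryCatalog additionally
                // picks up optional modules dropped into the bin folder.
                var catalog = new AggregateCatalog(
                    new AssemblyCatalog(typeof(CoreMarkerType).Assembly),
                    new DirectoryCatalog("."));
                var container = new CompositionContainer(catalog);
                container.ComposeParts(this);
            }
        }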

    Read the article

  • Difference between ~/folder and /home/username/folder when creating a path in /etc/environment

    - by r0xx4nne
    I had an executable script on my Ubuntu machine located in the ~/project/ directory, and I tried to add that path to /etc/environment. So, I edited the PATH to this:

    PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:~/project/"

    Then I logged out and back in, opened the terminal as su, and ran the command to execute my script in that folder, but the result was command not found. Then I changed the path in /etc/environment to:

    PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/home/r0xx4nne/project/"

    and voilà, it works. Now I can run the executable script inside ~/project/ without fail under su. My question is: what's the difference between ~/project and /home/r0xx4nne/project when it comes to creating a path in /etc/environment? Why does it behave like this? I'm a newbie and I just want to know more. Thanks for any reply.
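
    For context, an explanatory note that is not from the original post: /etc/environment is parsed by pam_env, not by a shell, and tilde expansion is purely a shell feature. The ~ therefore ends up in PATH as a literal character, and command lookup finds no directory literally named "~/project/", which is why only the spelled-out /home/r0xx4nne/project entry works. A quick way to see this from a shell:

        echo "$PATH"   # shows the literal, unexpanded ~ entry taken from /etc/environment
        echo ~         # tilde expansion happens here, in the shell: prints /home/r0xx4nne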

    Read the article

  • Migrating an Existing ASP.NET App to run on Windows Azure

    - by kaleidoscope
    Converting an existing ASP.NET application to Windows Azure:

    Method 1:
    1. Add a Windows Azure Cloud Service to the existing solution.
    2. Add a WebRole Project to the solution and select the existing ASP.NET project in it.

    Method 2: The other option is to create a new Cloud Service project and add the ASP.NET project (which you want to deploy on Azure) to it using:
    1. Solution | Add | Existing Project
    2. Add | Web Role Project in solution

    Converting a SQL Server database to SQL Azure:
    Step 1: Migrate the <Existing Application>.MDF
    Step 2: Migrate the ASP.NET providers
    Step 3: Change the connection strings

    More details can be found at http://blogs.msdn.com/jnak/archive/2010/02/08/migrating-an-existing-asp-net-app-to-run-on-windows-azure.aspx

    Ritesh, D
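
    As a sketch of what Step 3 typically involved in that era (the server, database and credential values below are placeholders, not from the original post), a connection string pointed at SQL Azure generally took this shape:

        Server=tcp:yourserver.database.windows.net;Database=yourdb;User ID=youruser@yourserver;Password=yourpassword;Trusted_Connection=False;Encrypt=True;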

    Read the article

  • How to organise projects with dependencies on BitBucket?

    - by Timwi
    Both Mercurial and BitBucket make one fundamental assumption: 1 repo = 1 project. If I have a project that has a dependency (a library) which is shared by many projects, this assumption gets in the way. Now it is no longer possible to have a separate BitBucket page for each project while still being able to commit atomic revisions to multiple projects. If I put all the projects into one repo, they all become one “project” on BitBucket. If I put them in separate repos, it is no longer possible to know which version of the library project was in use at revision X of a dependent project. How is this situation normally solved on BitBucket, or is there explicitly no support for this common scenario?
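
    One commonly cited approach, offered here as a suggestion rather than something stated in the original question, is Mercurial subrepositories: the dependent project's repo lists each dependency in a .hgsub file, and Mercurial records the exact changeset of every subrepo in .hgsubstate at each commit, so revision X of the dependent project pins precisely which version of the library it used. A hypothetical .hgsub entry:

        lib/shared = https://bitbucket.org/youruser/shared-library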

    Read the article

  • How do you organize your projects?

    - by Sergio
    Do you have any particular style of organizing projects? For example, currently I'm creating a project for a couple of schools here in Bolivia, and this is how I organized it:

    TutoMentor (Solution)
        TutoMentor.UI (WinForms project)
        TutoMentor.Data (Class library project)

    How exactly do you organize your projects? Do you have an example of something you organized and are proud of? Can you share a screenshot of the Solution pane? In the UI area of my application, I'm having trouble deciding on a good schema to organize different forms and where they belong.

    Edit: What about organizing different forms in the .UI project? Where/how should I group different forms? Putting them all at the root level of the project is a bad idea.

    Read the article

  • Deploy from NetBeans IDE by Twisting an External Dial

    - by Geertjan
    Via this code in a NetBeans module, i.e., a registered NetBeans ModuleInstall class, you can twist the Tinkerforge Rotary Poti Bricklet to deploy the current application in the IDE:

    import com.tinkerforge.BrickMaster;
    import com.tinkerforge.BrickletLCD20x4;
    import com.tinkerforge.BrickletRotaryPoti;
    import com.tinkerforge.IPConnection;
    import javax.swing.Action;
    import javax.swing.JMenuItem;
    import org.netbeans.api.project.Project;
    import org.netbeans.api.project.ProjectUtils;
    import org.openide.awt.Actions;
    import org.openide.modules.ModuleInstall;
    import org.openide.util.Utilities;

    public class Installer extends ModuleInstall {

        private static final String HOST = "localhost";
        private static final int PORT = 4223;
        private static final String MASTERBRICKUID = "abc";
        private static final String LCDUID = "abc";
        private static final String ROTIUID = "abc";
        private static IPConnection ipc;
        private static BrickMaster master = new BrickMaster(MASTERBRICKUID);
        private static BrickletLCD20x4 lcd = new BrickletLCD20x4(LCDUID);
        private static BrickletRotaryPoti poti = new BrickletRotaryPoti(ROTIUID);

        @Override
        public void restored() {
            try {
                ipc = new IPConnection(HOST, PORT);
                ipc.addDevice(master);
                ipc.addDevice(lcd);
                ipc.addDevice(poti);
                poti.setPositionCallbackPeriod(50);
                poti.addListener(new BrickletRotaryPoti.PositionListener() {
                    @Override
                    public void position(final short position) {
                        lcd.backlightOn();
                        lcd.clearDisplay();
                        final Action runAction = Actions.forID("Project",
                                "org.netbeans.modules.project.ui.RunMainProject");
                        // The action must be invoked from a menu item or toolbar button,
                        // see line 147 in org.netbeans.modules.project.ui.actions.LookupSensitiveAction:
                        JMenuItem jmi = new JMenuItem(runAction);
                        // When position is 100 (range is -150 to 150), deploy the app
                        // and print info about the project to the LCD display:
                        if (position == 100) {
                            jmi.doClick();
                            Project p = Utilities.actionsGlobalContext().lookup(Project.class);
                            lcd.writeLine((short) 0, (short) 0, "Deployed:");
                            lcd.writeLine((short) 1, (short) 0,
                                    ProjectUtils.getInformation(p).getDisplayName());
                        } else {
                            lcd.writeLine((short) 0, (short) 0, "Position: " + position);
                        }
                    }
                });
            } catch (Exception e) {
            }
        }
    }

    Read the article

  • MSM Merge Modules in Visual Studio 2013 [on hold]

    - by theGreenCabbage
    Could someone please let me know where I might find resources for creating MSM files? While I am able to create MSI files using InstallShield, it seems that Visual Studio no longer supports Merge Module Projects, judging by the link below and the screenshot of my version of Visual Studio 2013 - http://msdn.microsoft.com/en-us/library/z6z02ts5(v=vs.80).aspx

    To create a new merge module project:
    1. On the File menu, point to Add, then click New Project.
    2. In the resulting Add New Project dialog box, in the Project types pane, open the Other Project Types node and select Setup and Deployment Projects.
    3. In the Templates pane, choose Merge Module Project.

    Read the article

  • XNA stopped compiling my model x files

    - by HuseyinUslu
    So I've a 3D game project I'm working on, and I'm using 2 model files (SkyDome.x and AimedBlock.x). Until now everything was all good: my model files compiled fine and I was able to use them within my game. With the latest changes (which I don't know what caused it, really), XNA stopped compiling my model files and instead only outputs these files:

    AimedBlock.xnb - 1 kb
    SkyDome.xnb - 1 kb
    SkyDomeTexture.xnb - 1389 kb
    SkyDomeTexture_0.xnb - 419 kb

    So I created a test XNA game project, moved all my assets to the new solution's content project, and tried compiling them, and saw that they're all good:

    AimedBlock.xnb - 2 kb
    SkyDome.xnb - 13 kb
    SkyDomeTexture.xnb - 4097 kb
    SkyDomeTexture_0.xnb - 683 kb

    So I guess my main project sucks there, but I couldn't come up with a solution. I even tried overwriting my game's content project with the new game's content project (which was all okay), but it didn't work. Has anybody had similar issues?

    Read the article

  • Do you feel that you, as a programmer, make a difference?

    - by gablin
    When I graduate from uni, my desire is to land a job where I feel that what I do as a programmer makes a difference and contributes to the project. My code, no matter how small, is useful to the project, is being used by the project, and takes it forward. My work matters, and thus I feel that I make a difference. In contrast, one of my fears is that my work just doesn't matter: either it is just meaningless to the project but you're told to do it anyway, or your code is useful but not used in the project, or you feel that the project as a whole is just pointless, for whatever reason. Is this something that you've experienced, or are experiencing? Do you feel that you, as a programmer, make a difference, or do you feel that what you do just doesn't matter?

    Read the article

  • Just Released: Oracle Instantis EnterpriseTrack 8.5

    - by Melissa Centurio Lopes
    Instantis EnterpriseTrack has been successfully integrated into the Oracle development process, and the first release under Oracle is generally available. This release is a significant expansion of solution capabilities in resource management, project demand management, and project execution. It also includes customer-requested product enhancements, reporting enhancements, performance optimizations, and user experience improvements. Key enhancements include:

    - New Resource Calendar functionality provides more precise capacity visibility for resource planning and increases project plan reliability.
    - Support for Activity Labor Cost Capitalization allows project teams to easily mark any WBS activity as CapEx or OpEx with Labor Expense Type, and to enforce proper classification of Labor Expense Type for activities by setting defaults through activity templates or at the project level.
    - New Variable Resource Rates functionality allows project stakeholders to specify resource rates accurately over time and account for wage revisions.

    Instantis EnterpriseTrack cloud and on-premise solutions provide a top-down approach to managing, tracking and reporting on enterprise strategies, projects, portfolios, processes, resources, and financials. Upgrade now, or visit the Instantis EnterpriseTrack site to learn more.

    Read the article

  • TextMate - completion using an external file or file contained in project?

    - by Neil Baldwin
    Does anyone know how to get TextMate to search an external file (or even the files contained in a TextMate "project") with which to perform word completion? I'm coding some stuff on the C64 (using TextMate to write the code) and I have an external file containing labels for all of the hardware registers/KERNAL routines, e.g. VIC2InteruptStatus = $D019. It would be really handy to be able to type, say, 'VIC2I', then press the key for word completion and have TextMate find matches in the external library file, rather than how I'm having to do it at the moment: opening the library file and copy-pasting the register names into my code.

    Read the article

  • The project estimates the installation of external and internal surveillance. [closed]

    - by Zhasulan Berdybekov
    Our project estimates the installation of external and internal surveillance. Here are our objects, listed as: number of cameras - the object - distance to the Situation Centre:

    11 - New Alphabet - 1.5 km
    11 - New Alphabet - 1 km
    19 - New Alphabet - 800 m
    19 - New Alphabet - 1 km
    35 - The building - 200 m
    35 - The building - 100 m
    18 - The building - 100 m
    22 - Outside video surveillance - 50 m to 1 km

    Please tell us how many DVRs are needed, and where to put them: on the object or at the Situation Centre? How do we bring the information to the Situation Centre? What cables are needed? Your advice and comments, please. Thank you for your efforts!

    Read the article

  • Time to stop using “Execute Package Task” – a way to execute package in SSIS catalog taking advantage of the new project deployment model, and the logging and reporting feature

    - by Kevin Shyr
    I set out to find a way to dynamically call packages in SSIS 2012. The following are 2 excellent blogs I found; I used them heavily. The code below has some additions for parameter types and message types, but was essentially derived entirely from the blogs.

    http://sqlblog.com/blogs/jamie_thomson/archive/2011/07/16/ssis-logging-in-denali.aspx
    http://www.ssistalk.com/2012/07/24/quick-tip-run-ssis-2012-packages-synchronously-and-other-execution-options/

    The code: Every package will be called by a PackageController package. The PackageController is initialized with some information on which package to run and what information to pass in. The following is the stored procedure called from the "Execute SQL Task". The highlights of the stored procedure:

    - It takes in a package name, a project name, and a folder name (the folder in the SSIS project deployment to the SSIS catalog).
    - It sets the package variables of the upcoming package execution.
    - It executes the package in the SSIS catalog.
    - It gets the status of the execution. Also, if errors exist, it gets the error messages' message_id values and stores them in the management database.
    - It returns a value to the "Execute SQL Task" to manage failure properly.

    CREATE PROCEDURE [AUDIT].[LaunchPackageExecutionInSSISCatalog]
        @PackageName NVARCHAR(255)
        , @ProjectFolder NVARCHAR(255)
        , @ProjectName NVARCHAR(255)
        , @AuditKey INT
        , @DisableNotification BIT
        , @PackageExecutionLogID INT
    AS
    BEGIN TRY
        DECLARE @execution_id BIGINT = 0;

        -- Create a package execution
        EXEC [SSISDB].[catalog].[create_execution]
            @package_name = @PackageName,
            @execution_id = @execution_id OUTPUT,
            @folder_name = @ProjectFolder,
            @project_name = @ProjectName,
            @use32bitruntime = False;

        UPDATE [AUDIT].[PackageInstanceExecutionLog] WITH(ROWLOCK)
        SET [SSISCatalogExecutionID] = @execution_id
        WHERE [PackageInstanceExecutionLogID] = @PackageExecutionLogID;

        -- Set the execution synchronized so that the result can be checked at the end
        EXEC [SSISDB].[catalog].[set_execution_parameter_value]
            @execution_id,
            @object_type = 50,
            @parameter_name = N'SYNCHRONIZED',
            @parameter_value = 1; -- true

        /********************************************************
            Section: setting parameters
                Source table: SSISDB.internal.object_parameters
                object_type list:
                    20: project level variables
                    30: package level variables
                    50: execution parameter
        ********************************************************/
        EXEC [SSISDB].[catalog].[set_execution_parameter_value]
            @execution_id,
            @object_type = 30,
            @parameter_name = N'FromParent_AuditKey',
            @parameter_value = @AuditKey;

        EXEC [SSISDB].[catalog].[set_execution_parameter_value]
            @execution_id,
            @object_type = 30,
            @parameter_name = N'FromParent_DisableNotification',
            @parameter_value = @DisableNotification;

        EXEC [SSISDB].[catalog].[set_execution_parameter_value]
            @execution_id,
            @object_type = 30,
            @parameter_name = N'FromParent_PackageInstanceExecutionID',
            @parameter_value = @PackageExecutionLogID;
        /********************************************************
            Section: setting variables END
        ********************************************************/

        /* This section is carried over from the example code;
           I don't see a reason to change it yet. */
        -- Set our package parameters
        EXEC [SSISDB].[catalog].[set_execution_parameter_value]
            @execution_id,
            @object_type = 50,
            @parameter_name = N'DUMP_ON_EVENT',
            @parameter_value = 1; -- true

        EXEC [SSISDB].[catalog].[set_execution_parameter_value]
            @execution_id,
            @object_type = 50,
            @parameter_name = N'DUMP_EVENT_CODE',
            @parameter_value = N'0x80040E4D;0x80004005';

        EXEC [SSISDB].[catalog].[set_execution_parameter_value]
            @execution_id,
            @object_type = 50,
            @parameter_name = N'LOGGING_LEVEL',
            @parameter_value = 1; -- Basic

        EXEC [SSISDB].[catalog].[set_execution_parameter_value]
            @execution_id,
            @object_type = 50,
            @parameter_name = N'DUMP_ON_ERROR',
            @parameter_value = 1; -- true

        /********************************************************
            Section: EXECUTING
        ********************************************************/
        EXEC [SSISDB].[catalog].[start_execution]
            @execution_id;
        /********************************************************
            Section: EXECUTING END
        ********************************************************/

        /********************************************************
            Section: checking execution result
                Source table: [SSISDB].[catalog].[executions]
                status:
                    1: created
                    2: running
                    3: cancelled
                    4: failed
                    5: pending
                    6: ended unexpectedly
                    7: succeeded
                    8: stopping
                    9: completed
        ********************************************************/
        IF EXISTS(SELECT TOP 1 1
                  FROM [SSISDB].[catalog].[executions] WITH(NOLOCK)
                  WHERE [execution_id] = @execution_id
                        AND [status] NOT IN (2, 7, 9))
        BEGIN
            /********************************************************
                Section: logging error messages
                    Source table: [SSISDB].[internal].[operation_messages]
                    message type:
                        10: OnPreValidate
                        20: OnPostValidate
                        30: OnPreExecute
                        40: OnPostExecute
                        60: OnProgress
                        70: OnInformation
                        90: Diagnostic
                        110: OnWarning
                        120: OnError
                        130: Failure
                        140: DiagnosticEx
                        200: Custom events
                        400: OnPipeline
                    message source type:
                        10: Messages logged by the entry APIs (e.g. T-SQL, CLR stored procedures)
                        20: Messages logged by the external process used to run the package (ISServerExec)
                        30: Messages logged by the package-level objects
                        40: Messages logged by tasks in the control flow
                        50: Messages logged by containers (For, ForEach, Sequence) in the control flow
                        60: Messages logged by the Data Flow Task
            ********************************************************/
            INSERT INTO AUDIT.PackageInstanceExecutionOperationErrorLink
                SELECT @PackageExecutionLogID
                       , [operation_message_id]
                FROM [SSISDB].[internal].[operation_messages] WITH(NOLOCK)
                WHERE operation_id = @execution_id
                      AND message_type IN (120, 130);

            EXEC [AUDIT].[FailPackageInstanceExecution] @PackageExecutionLogID, 'SSISDB Internal operation_messages found';

            GOTO ReturnTrueAsErrorFlag
            /********************************************************
                Section: checking messages END
            ********************************************************/

            /* This part is not really working, so now using rowcount to pass status
            --DECLARE @PackageErrorMessage NVARCHAR(4000)
            --SET @PackageErrorMessage = @PackageName + ' failed with executionID: ' + CONVERT(VARCHAR(20), @execution_id)

            --RAISERROR (@PackageErrorMessage -- Message text.
            --     , 18 -- Severity,
            --     , 1 -- State,
            --     , N'check table AUDIT.PackageInstanceExecutionErrorMessages' -- First argument.
            --     );
            */
        END
        ELSE BEGIN
            GOTO ReturnFalseAsErrorFlagToSignalSuccess
        END
        /********************************************************
            Section: checking execution result END
        ********************************************************/
    END TRY
    BEGIN CATCH
        DECLARE @SSISCatalogCallError NVARCHAR(MAX)
        SELECT @SSISCatalogCallError = ERROR_MESSAGE()

        EXEC [AUDIT].[FailPackageInstanceExecution] @PackageExecutionLogID, @SSISCatalogCallError

        GOTO ReturnTrueAsErrorFlag
    END CATCH;

    /********************************************************
        Section: end result
    ********************************************************/
    ReturnTrueAsErrorFlag:
        SELECT CONVERT(BIT, 1) AS PackageExecutionErrorExists
        RETURN  -- prevent falling through into the success result set below
    ReturnFalseAsErrorFlagToSignalSuccess:
        SELECT CONVERT(BIT, 0) AS PackageExecutionErrorExists
    GO
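
    For illustration, here is a hypothetical call to the stored procedure above, as the PackageController's "Execute SQL Task" would issue it. All argument values are placeholders, not from the original post:

        EXEC [AUDIT].[LaunchPackageExecutionInSSISCatalog]
            @PackageName = N'LoadCustomers.dtsx',
            @ProjectFolder = N'ETL',
            @ProjectName = N'WarehouseLoads',
            @AuditKey = 42,
            @DisableNotification = 0,
            @PackageExecutionLogID = 1001;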

    Read the article

  • How can I get TFS 2010 to build each project to a separate directory?

    - by Jonathan Schuster
    In our project, we'd like to have our TFS build put each project into its own folder under the drop folder, instead of dropping all of the files into one flat structure. To illustrate, we'd like to see something like this:

    DropFolder/
        Foo/
            foo.exe
        Bar/
            bar.dll
        Baz/
            baz.dll

    This is basically the same question as was asked here, but now that we're using workflow-based builds, those solutions don't seem to work. The solution using the CustomizableOutDir property looked like it would work best for us, but I can't get that property to be recognized. I customized our workflow to pass it in to MSBuild as a command line argument (/p:CustomizableOutDir=true), but it seems MSBuild just ignores it and puts the output into the OutDir given by the workflow. I looked at the build logs, and I can see that the CustomizableOutDir and OutDir properties are both getting set in the command line args to MSBuild. I still need OutDir to be passed in so that I can copy my files to TeamBuildOutDir at the end. Any idea why my CustomizableOutDir parameter isn't getting recognized, or if there's a better way to achieve this?

    Read the article

  • First TDD, Simple 2-tier C# Project - what do I unit test?

    - by Joel
    This is probably a stupid question, but my googling isn't finding a satisfactory answer. I'm starting a small project in C#, with just a business layer and a data access layer - strangely, the UI will come later, and I have very little (read: no) concept of / control over what it will look like. I would like to try TDD for this project. I'm using Visual Studio 2008 (soon to be 2010), I have ReSharper 5, and NUnit. Again, I want to do Test-Driven Development, but not necessarily the entire XP system. My question is: when and where do I write the first unit test? Do I only test logic before I write it, or do I test everything? It seems counter-productive to test things that have no reason to fail (auto-properties, empty constructors)... but it seems like the "No new code without a failing test" maxim requires this. Links or references are fine (but preferably to online resources, not books - I would like to get started ASAP). Thanks in advance for any guidance!
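
    Since the question mentions NUnit, here is a minimal sketch of what a first failing test might look like. InvoiceCalculator and its members are hypothetical illustrations, not from the original question; in TDD order the test is written first, and the small implementation shown alongside it is what you would add immediately afterwards to turn the bar green:

        using System.Collections.Generic;
        using System.Linq;
        using NUnit.Framework;

        // Written second, only to satisfy the test below.
        public class InvoiceCalculator
        {
            private readonly List<decimal> _amounts = new List<decimal>();
            public void AddLineItem(decimal amount) { _amounts.Add(amount); }
            public decimal Total { get { return _amounts.Sum(); } }
        }

        [TestFixture]
        public class InvoiceCalculatorTests
        {
            [Test]
            public void Total_SumsLineItemAmounts()
            {
                // Red first: this test drives the creation of InvoiceCalculator
                // in the business layer. Auto-properties and empty constructors
                // get exercised incidentally; they don't need dedicated tests.
                var calculator = new InvoiceCalculator();
                calculator.AddLineItem(10m);
                calculator.AddLineItem(5m);
                Assert.AreEqual(15m, calculator.Total);
            }
        }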

    Read the article

  • Should/could I use Ruby on Rails 3 for my next project?

    - by fayer
    I'm new to Ruby and Ruby on Rails. I'm on a new project and would really like to code in Ruby; of course, you have to use a framework like RoR for web development. So my question is whether I should/could use Ruby on Rails 3 for this new project that I will start on 2-3 weeks from now. Are there tutorials for it? Is it stable enough? Well tested? Because one thing about entering something totally new is that you need tutorials and books that teach you how to use it. I guess there aren't a lot of sources for this information, so maybe it's better for me to use the latest stable version? What do you think? Thanks

    Read the article

  • When is a PHP project too small for a framework?

    - by Jonathan Nicol
    I'm about to start on a small, static website project: no database or CMS required. Basically, a brochure website. I used the CodeIgniter framework recently to develop a full-blown web application, and I'm wondering if it appropriate to also use CI for smaller, simpler sites. Typically for a static brochure site I would write regular PHP pages with a few includes thrown in to save on repetition (i.e. HTML with a sprinking of PHP), but this time around I'm wondering if my new friend CodeIgniter might be able to streamline the development process. Is it sensible to consider a framework for such a simple project, or is it overkill? I'm worried that I might be the proverbial carpenter whose only tool is a hammer, and sees every problem as a nail!

    Read the article

  • How do I force Eclipse to rebuild if files in another project change (any change)?

    - by James Moore
    I've got an Eclipse (Galileo) project (called ProguardBuilder) that runs Proguard over a set of class files in other projects and produces a jar file. I'd like to have the ProguardBuilder project get rebuilt any time any class file in the other projects changes. AutoBuild doesn't do that; presumably it's smart enough to recognize and ignore any changes that don't affect anything externally visible. My problem is that I don't care whether or not the change is visible, since I need to completely rebuild ProguardBuilder any time the class files it depends on change at all. How do I tell Eclipse to do this sort of rebuild?

    Read the article

  • Grafting Scala 2.8 into a Netbeans NBAndroid Project...What steps am I missing?

    - by Michael Kohout
    Hi All; Due to Apple's recent T+C hijinks, I've become interested in developing for Android. Anyway, I'm trying to get a mixed-language Android 2.1 project going in NetBeans 6.8 (with the NBAndroid 0.10 plugin), the two languages being Java and Scala (2.8 head build). To give you a basic idea of what the app does right now, it's just a simple "Hello World" app. To get this to build, I've modified the project's build.xml file:

    -injars ${scala-library}(!META-INF/MANIFEST.MF,!library.properties)
    -outjars "${build.classes.dir}/classes.min.jar"
    -libraryjars "${file.reference.android.jar}"
    -dontwarn
    -dontoptimize
    -dontobfuscate
    -keep public class * extends android.app.Activity
    -keep public class scala.xml.include.sax.Main**

    I've gotten the project so that it'll build, but it errors on startup in my Android emulator (inside the emulator, Android tells me my application has stopped unexpectedly). So my questions are: Does anyone see what I may be doing wrong? And is there any way to get access to the logs that the emulator must create?

    Thanks
    Mike Kohout
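
    On the logs question: the emulator's runtime log, including the stack trace behind the "stopped unexpectedly" dialog, can be viewed with the Android SDK's standard adb tool (this pointer is an editorial addition, not from the original post):

        adb logcat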

    Read the article

< Previous Page | 135 136 137 138 139 140 141 142 143 144 145 146  | Next Page >