Search Results

Search found 31269 results on 1251 pages for 'process management'.

Page 23/1251 | < Previous Page | 19 20 21 22 23 24 25 26 27 28 29 30  | Next Page >

  • Web-based HPC cluster node management

    - by Skuja
    Hello, I am working on my school diploma thesis. The main goal is to create a web-based application where logged-in users can see which nodes are free and which are busy, turn them on and off, see what processes they are running, and so on. I figured I could do something like this: write a cron-style daemon that runs every 30 seconds or so, pings each node to find out whether it is up or down, and writes the results to a file. My web app (which I will write in PHP) could then read that info. Would this be a good solution? How would you suggest I do it? And finally, are there any existing solutions (not necessarily web-based) for managing cluster nodes?
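
    A minimal sketch of the polling daemon described above, assuming the node hostnames live in a plain-text file and the status is written out as JSON for the PHP front end to read (all file names and paths here are hypothetical):

        #!/usr/bin/env python
        # Poll each node with the system ping utility and dump the result to a
        # JSON file the PHP front end can read. Run it from cron every minute,
        # or loop with time.sleep(30) for 30-second polling.
        import json
        import subprocess
        import time

        NODES_FILE = "/etc/cluster/nodes.txt"      # one hostname per line (hypothetical path)
        STATUS_FILE = "/var/www/data/status.json"  # read by the PHP app (hypothetical path)

        def node_is_up(host):
            # One ping, two-second timeout; return code 0 means the node answered.
            result = subprocess.run(["ping", "-c", "1", "-W", "2", host],
                                    stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
            return result.returncode == 0

        def main():
            with open(NODES_FILE) as f:
                nodes = [line.strip() for line in f if line.strip()]
            status = {host: ("up" if node_is_up(host) else "down") for host in nodes}
            status["checked_at"] = int(time.time())
            with open(STATUS_FILE, "w") as f:
                json.dump(status, f)

        if __name__ == "__main__":
            main()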

    Read the article

  • When a Python process is killed on OS X, why doesn't it kill the child processes?

    - by Hugh
    I found myself getting very confused a while back by some behaviour changes when moving Python scripts from Linux over to OS X. On Linux, if a Python script has called os.system() and the calling process is killed, the called process is killed at the same time. On OS X, however, if the main process is killed, anything it launched is left behind. Is there something in OS X or Python where I can change this behaviour? This is causing problems on our render farm, where processes can be killed from the management GUI, but the top-level process is really just a wrapper. While the render farm management might think the process has gone and the machine is free for another task, the actual processor-intensive task is still running, which can lead to huge blockages. I know I could write more logic to catch the kill signal and pass it on to the child processes, but I was hoping this is something that could be enabled at a lower level.
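
    If catching the signal in the wrapper turns out to be the only option, a minimal sketch of that approach looks like this (assuming the wrapper itself starts the render task; the command name is made up, and subprocess stands in for os.system so the wrapper knows the child's process group). Note that SIGKILL cannot be intercepted, so this only helps if the GUI sends SIGTERM or SIGINT:

        import os
        import signal
        import subprocess
        import sys

        # Start the child in its own process group so one signal can reach it
        # and anything it spawns in turn.
        child = subprocess.Popen(["render_task", "--scene", "shot01"],  # hypothetical command
                                 preexec_fn=os.setsid)

        def forward_and_exit(signum, frame):
            # Pass the termination signal on to the child's whole process group.
            try:
                os.killpg(child.pid, signum)
            except OSError:
                pass
            sys.exit(1)

        signal.signal(signal.SIGTERM, forward_and_exit)
        signal.signal(signal.SIGINT, forward_and_exit)

        child.wait()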

    Read the article

  • Prevent the "System" process from locking my files in a shared folder

    - by Kamarey
    I have an application that creates files to be processed by SQL bulk insert. The files are created in a shared folder on another server and then picked up from there by SQL Server. The problem is that sometimes SQL returns an error saying the file is locked by another process and can't be accessed. The process that locks these files is the "System" process. It looks like it locks the files because they are in a shared folder, but I'm not sure. Using software to unlock the files manually is not an option, as the whole bulk process is automated. The question is: why does the "System" process lock these files, and is there a way to prevent this?
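
    One common mitigation worth trying, independent of why the System process holds the lock, is to write each file under a temporary name and rename it only once it is complete, so the bulk reader never sees a half-written file. A rough sketch of that idea, with a hypothetical share path:

        import csv
        import os
        import tempfile

        DROP_DIR = r"\\fileserver\bulk_drop"   # hypothetical shared folder

        def write_bulk_file(rows, final_name):
            # Write into a temp file in the same folder, then rename atomically;
            # the bulk reader never picks up a file that is still being written.
            fd, tmp_path = tempfile.mkstemp(dir=DROP_DIR, suffix=".tmp")
            with os.fdopen(fd, "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerows(rows)
            os.replace(tmp_path, os.path.join(DROP_DIR, final_name))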

    Read the article

  • Stored Procedures In Source Control - Automate Build/Deployment Process

    - by Alex
    My company provides a large .NET service-oriented solution. The services layer interacts with a T-SQL back end consisting of hundreds of tables and stored procedures. Our C# code is in version control (SVN) but our stored procedures and schema are not. After much lobbying of expedient upper management, I was allowed to review our (non-existent) build/deployment process to accomplish the following goals:

    1. Place schema and stored procedures under source control.
    2. Automate the build/deployment process.

    I would like to proceed per the accepted answer's strategy in this post, but I have additional questions:

    1. I would like to use Hudson as my build server. Is this a reasonable choice for a C#/SQL solution? What better alternatives should I explore?
    2. Assuming I have all triggers, stored procedures, schema, etc. under source control, and that they are scripted to individual files, how do I generate a build script which takes into account dependencies/references between these items? (SQL Server does this automatically, but it generates one giant script.)
    3. What does the workflow of performing an update at the client look like? i.e. I have to keep existing table data. How do I roll back schema changes?
    4. I am the only programmer. Several other pseudo-technical staff like to make changes directly inside SQL Management Studio. Is it realistic to expect others to adhere to this solution -- how can I enforce this?

    Thank you in advance for your help.
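
    For the dependency question, one low-tech approach that works with individually scripted objects is to keep an explicit manifest listing the files in apply order and have a small deploy script feed them to sqlcmd one by one. A rough sketch of that idea, with the server, database and file names invented for illustration:

        import subprocess

        SERVER = "devsql01"            # hypothetical server
        DATABASE = "AppDb"             # hypothetical database
        MANIFEST = "deploy_order.txt"  # one .sql file per line, in dependency order

        def deploy():
            with open(MANIFEST) as f:
                scripts = [line.strip() for line in f
                           if line.strip() and not line.startswith("#")]
            for script in scripts:
                print("Applying", script)
                # -b makes sqlcmd return a non-zero exit code on SQL errors.
                subprocess.run(["sqlcmd", "-S", SERVER, "-d", DATABASE, "-b", "-i", script],
                               check=True)

        if __name__ == "__main__":
            deploy()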

    Read the article

  • Window Management for Mac OS X

    - by Paolo Maffei
    OK, I feel dumb. I've put many hours into this and found nothing yet. When I was using Windows I had a little tool called WinSplit Revolution. It let you divide your screen into "virtual monitors" of whatever number and size you chose. You set up the division once, and every time WinSplit opens, the monitor is automatically split into those virtual monitors. Screenshots: http://www.google.com/images?hl=en&q=winsplit%20revolution&um=1&ie=UTF-8&source=og&sa=N&tab=wi&biw=1045&bih=499 I'm now using a 30" display that I want almost always divided into 4 equal-size virtual monitors (plus my 13" MacBook Pro, which makes 5 virtual monitors at 1280x800). Now that I've switched to Mac OS X, I can't find anything that does just this efficiently. I tried Divvy, but I found no way to divide my screen into arbitrary virtual monitors; it takes a couple of clicks to select a 3x3 space on a 9x9 grid. Before I start coding something like this, can you tell me if you already know of software that does window management this way?

    Read the article

  • Sharing disk volumes across OpenVZ guests to reduce Package Management Overhead

    - by andyortlieb
    Is it feasible to create a single "master" OpenVZ guest that would only be used for package management, and use something like mount --bind on several other OpenVZ guests to, in effect, trick them into using the environment installed by the master guest? The point of this would be that users can maintain their own containers and yet stay in sync with the master development environment, so they'll always have the latest & greatest requirements without worrying too much about system administration. If they need to install their own packages, they could put them in /opt or /usr/local (or set a path to their home directory). To rephrase: I would like several (developers', for example) OpenVZ guests whose /bin, /usr (and so on) actually refer to the same disk location as that of a master OpenVZ guest, which can be started up to install and update the common packages shared by this whole group of guests. For what it's worth, we're running Debian 6. Edit: I have tried mounting (bind, and read-only) /bin, /lib, /sbin and /usr in this fashion, and it refuses to start the containers, stating that files are already mounted or otherwise in use:

        Starting container ...
        vzquota : (error) Quota on syscall for id 1102: Device or resource busy
        vzquota : (error) Possible reasons:
        vzquota : (error)   - Container's root is already mounted
        vzquota : (error)   - there are opened files inside Container's private area
        vzquota : (error)   - your current working directory is inside Container's
        vzquota : (error)     private area
        vzquota : (error) Currently used file(s):
        /var/lib/vz/private/1102/sbin
        /var/lib/vz/private/1102/usr
        /var/lib/vz/private/1102/lib
        /var/lib/vz/private/1102/bin
        vzquota on failed [3]

    If I unmount these four volumes and start the guest, and then mount them after the guest has started, the guest never sees them mounted.

    Read the article

  • Configuration management in support of scientific computing

    - by Sharpie
    For the past few years I have been involved with developing and maintaining a system for forecasting near-shore waves. Our team has just received a significant grant for further development, and as a result we are taking the opportunity to refactor many components of the old system. We will also be receiving a new server to run the model, so I am taking this opportunity to reconsider how we set up the system. Basically, the steps that need to happen are:

    1. Some standard packages and libraries, such as compilers and databases, need to be downloaded and installed.
    2. Some custom scientific models need to be downloaded and compiled from source, as they are not commonly provided as packages.
    3. New users need to be created to manage the databases and run the models.
    4. A suite of scripts that manage model-database interaction needs to be checked out from source code control and installed.
    5. Crontabs need to be set up to run the scripts at regular intervals in order to generate forecasts.

    I have been pondering applying tools such as Puppet, Capistrano or Fabric to automate the above steps. It seems perfectly possible to implement most of the above functionality, but there are a couple of use cases I am wondering about:

    1. During my preliminary research, I have found few examples and little discussion of how to use these systems to abstract and automate the process of building custom components from source.
    2. We may have to deploy on machines that are isolated from the Internet, i.e. all configuration and setup files will have to come in on a USB key that can be inserted into a terminal that can connect to the server that will run the models.

    I see this as an opportunity to learn a new tool that will help me automate my workflow, but I am unsure which tool I should start with. If any member of the community could suggest a tool that would support the above workflow and the issues specific to scientific computing, I would be very grateful. Our production server will be running Linux, but support for OS X would be a bonus, as it would allow the development team to set up test installations outside of VirtualBox.
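
    For the build-from-source concern, most of these tools ultimately just run the familiar download / extract / configure / make sequence, so even a plain Python wrapper can serve as a first step before committing to Puppet or Fabric. A rough sketch, with the model name, URL and install prefix invented for illustration (on an air-gapped machine the tarball would simply be copied from the USB key instead of downloaded):

        import subprocess
        import urllib.request

        # Hypothetical custom model tarball and install locations.
        MODEL_URL = "http://example.org/wavemodel-2.1.tar.gz"
        TARBALL = "/tmp/wavemodel-2.1.tar.gz"
        SRC_DIR = "/tmp/wavemodel-2.1"
        PREFIX = "/opt/wavemodel/2.1"

        def run(cmd, cwd=None):
            # Echo each command and stop on the first failure.
            print("+", " ".join(cmd))
            subprocess.run(cmd, cwd=cwd, check=True)

        def build_model():
            urllib.request.urlretrieve(MODEL_URL, TARBALL)
            run(["tar", "xzf", TARBALL, "-C", "/tmp"])
            run(["./configure", "--prefix=" + PREFIX], cwd=SRC_DIR)
            run(["make", "-j4"], cwd=SRC_DIR)
            run(["make", "install"], cwd=SRC_DIR)

        if __name__ == "__main__":
            build_model()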

    Read the article

  • Application for time and project management

    - by user10826
    I want to improve the way I organize my projects, tasks and schedule. What I do now is:

    - Keep an Excel sheet with the names of the most important tasks/projects; I look at it at the beginning of each day and decide which ones I will focus on.
    - In iCal I write down events for each day, or for a concrete time slot (13:00 to 14:00). Each day I set up the tasks I want to accomplish and allocate hours to them.
    - I use Things (Cultured Code) to keep info about tasks and projects that are not very important and not yet time-allocated (GTD name = someday).
    - I use Mail on the Mac and create folders, named after the different projects, for the mails I want to process.
    - I save the main info for each project in FreeMind maps.

    My system works well at the moment, but it is pretty complicated to use. I want to make it better, and I am looking for something with these requirements:

    - Must be 100% offline accessible.
    - Should use as few programs/resources as possible; ideally just one program should be able to manage all my info.
    - I can use the GTD methodology mixed with priorities, and I can turn each task into an event allocated on my calendar.
    - I can have different daily/weekly, etc. views on a calendar to see the "big picture".
    - Must run on Mac OS X Leopard.
    - Price does not matter; I will pay for this.

    So, based on your experience, can you recommend something like this? Thanks

    Read the article

  • Five Point Partners Reviews Oracle Utilities Mobile Workforce Management 2.0

    - by caroline.yu
    Oracle recently gave Warren B. Causey and Bart Thielbar of Five Point Partners' Research and Analysis Division a one-hour briefing on Oracle Utilities Mobile Workforce Management 2.0. Based on that briefing, Warren and Bart provided an evaluation of the new software. The review notes that this is the first major rewrite of the mobile system. Oracle Utilities has made numerous updates to the software's structure, architecture and functionality that should position Oracle Utilities Mobile Workforce Management 2.0 well for the current utility market. Additionally, the reviewers noted that one of the most significant improvements in the new version of Oracle Utilities Mobile Workforce Management is that it has moved to the same Java technical stack as other Oracle Utilities products. Utilities can deploy the software in multiple environments, including Linux, Unix and Windows. This will simplify integration with existing Oracle products, as well as with other systems, thus potentially lowering the cost of installation and ownership for utilities. Overall, Warren and Bart note that Oracle Utilities now has an impressive, state-of-the-art mobile workforce management system that utilities can readily deploy in a bundle with other Oracle solutions, or use as a stand-alone system with relatively easy integration with other utility systems. They state that Oracle Utilities Mobile Workforce Management 2.0 should significantly strengthen Oracle's competitive position in the mobile workforce management solution space. To take a look at the full review, click here.

    Read the article

  • Management Reporter Installation – Lessons Learned

    - by Ryan McBee
    After successfully completing several installations of Management Reporter this year, I wanted to share a few lessons learned that should help you. First, you will want to make sure that you install Management Reporter under a domain account as opposed to a local system or network service account. Management Reporter gives you the option to install under those accounts, but it is a best practice to use a domain account. After installing Management Reporter, you will want to make sure that Directory Browsing is enabled within the IIS server for your site, or you will have problems when you go to use Management Reporter. By default it is disabled in Server 2008 R2, and you will need to change the setting under the Actions pane. Lastly, you will want to make sure that SQL Server is running under a domain account. I have had multiple situations where reports were stuck in the Queued status rather than the Processing status in Management Reporter. After reviewing resolution 5 of KB 2298248, it was determined that running SQL Server under a domain account is the way to go.

    Read the article

  • How to hide/show a process using C#?

    - by aF
    Hello, while my program is executing I want to hide/minimize the Microsoft Speech Recognition application, and at the end I want to show/maximize it again using C#. This process is not started by me, so I can't control it through its ProcessStartInfo. I've tried user32.dll methods such as ShowWindow and AnimateWindow, and with all of them I have the same problem: I can hide the window (although I have to call the method twice with the SW_HIDE option), but when I call it with the SW_SHOW flag, it simply doesn't show. How can I show/maximize the window after hiding the process? Thanks in advance! Here are some pieces of the code, now implemented to use SetWindowPlacement:

        [DllImport("user32.dll")]
        [return: MarshalAs(UnmanagedType.Bool)]
        public static extern bool GetWindowPlacement(IntPtr hWnd, ref WINDOWPLACEMENT lpwndpl);

        [DllImport("user32.dll", SetLastError = true)]
        [return: MarshalAs(UnmanagedType.Bool)]
        static extern bool SetWindowPlacement(IntPtr hWnd, [In] ref WINDOWPLACEMENT lpwndpl);

        [DllImport("user32.dll")]
        public static extern Boolean ShowWindowAsync(IntPtr hWnd, Int32 nCmdShow);

        [DllImport("user32.dll")]
        public static extern Boolean SetForegroundWindow(IntPtr hWnd);

        [DllImport("user32.dll")]
        public static extern Boolean ShowWindow(IntPtr hWnd, Int32 nCmdShow);

        [DllImport("user32.dll")]
        public static extern Boolean AnimateWindow(IntPtr hWnd, uint dwTime, uint dwFlags);

        [DllImport("dwmapi.dll")]
        public static extern int DwmSetWindowAttribute(IntPtr hwnd, uint dwAttribute, IntPtr pvAttribute, IntPtr lol);

        // Definitions for the different window placement constants
        const UInt32 SW_HIDE = 0;
        const UInt32 SW_SHOWNORMAL = 1;
        const UInt32 SW_NORMAL = 1;
        const UInt32 SW_SHOWMINIMIZED = 2;
        const UInt32 SW_SHOWMAXIMIZED = 3;
        const UInt32 SW_MAXIMIZE = 3;
        const UInt32 SW_SHOWNOACTIVATE = 4;
        const UInt32 SW_SHOW = 5;
        const UInt32 SW_MINIMIZE = 6;
        const UInt32 SW_SHOWMINNOACTIVE = 7;
        const UInt32 SW_SHOWNA = 8;
        const UInt32 SW_RESTORE = 9;

        public sealed class AnimateWindowFlags
        {
            public const int AW_HOR_POSITIVE = 0x00000001;
            public const int AW_HOR_NEGATIVE = 0x00000002;
            public const int AW_VER_POSITIVE = 0x00000004;
            public const int AW_VER_NEGATIVE = 0x00000008;
            public const int AW_CENTER = 0x00000010;
            public const int AW_HIDE = 0x00010000;
            public const int AW_ACTIVATE = 0x00020000;
            public const int AW_SLIDE = 0x00040000;
            public const int AW_BLEND = 0x00080000;
        }

        public struct WINDOWPLACEMENT
        {
            public int length;
            public int flags;
            public int showCmd;
            public System.Drawing.Point ptMinPosition;
            public System.Drawing.Point ptMaxPosition;
            public System.Drawing.Rectangle rcNormalPosition;
        }

        // this works
        param = new WINDOWPLACEMENT();
        param.length = Marshal.SizeOf(typeof(WINDOWPLACEMENT));
        param.showCmd = (int)SW_HIDE;
        lol = SetWindowPlacement(theprocess.MainWindowHandle, ref param);

        // this doesn't work
        WINDOWPLACEMENT param = new WINDOWPLACEMENT();
        param.length = Marshal.SizeOf(typeof(WINDOWPLACEMENT));
        param.showCmd = (int)SW_SHOW;
        lol = GetWindowPlacement(theprocess.MainWindowHandle, ref param);

    Read the article

  • How to use an out-of-process COM server without its tlb file

    - by Dbger
    This is about a Windows COM component.

    - Server.exe: a 32-bit out-of-process COM server
    - CLSID_Application: the GUID of a COM object in Server.exe
    - Client.exe: a 64-bit client application which uses Server.exe in a registry-free way

    As we know, an exe can't be used as a registry-free COM component. To mimic such behavior, I start the Server.exe process myself by providing the exact path:

        CreateProcess("Server.exe");
        IClassFactory* pFactory = CoGetClassObject(CLSID_Application);
        pFactory->CreateInstance(ppAppObject);

    It works if I have Server.tlb registered, but after unregistering Server.tlb it fails to create ppAppObject, even though I embed a manifest into both Server.exe and Client.exe:

        <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
          <file name="Server.tlb">
            <typelib tlbid="{DAC4A4C9-F84C-4F05-A7DC-E152869499F5}" version="1.0" helpdir=""></typelib>
          </file>
          <comInterfaceExternalProxyStub name="IApplication" iid="{D74208EA-71C2-471D-8681-9760B8ECE599}"
              tlbid="{DAC4A4C9-F84C-4F05-A7DC-E152869499F5}"
              proxyStubClsid32="{00020424-0000-0000-C000-000000000046}"></comInterfaceExternalProxyStub>
        </assembly>

    Do you have any idea what is going on? Edit: it turns out that it really does work if I specify the tlbid for the interfaces and embed the manifest into both exes.

    Read the article

  • Reusing Windows Picture and Fax Viewer process to load a new image from FileSystemWatcher

    - by Cory Larson
    So for an idea for my birthday party I'm setting up a photo booth. I've got software to remotely control the camera and all that, but I need to write a little application to monitor the folder where the pictures get saved and display them. Here's what I've got so far. The issue is that I don't want to launch a new Windows Photo Viewer process every time the FileSystemWatcher sees a new file; I just want to load the latest image into the current instance of Windows Photo Viewer (or start a new one if one isn't running).

        class Program
        {
            static void Main(string[] args)
            {
                new Program().StartWatching();
            }

            public void StartWatching()
            {
                FileSystemWatcher incoming = new FileSystemWatcher();
                incoming.Path = @"G:\TempPhotos\";
                incoming.NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.FileName;
                incoming.Filter = "*.jpg";
                incoming.Created += new FileSystemEventHandler(ShowImage);
                incoming.EnableRaisingEvents = true;

                Console.WriteLine("Press \'q\' to quit.");
                while (Console.Read() != 'q') ;
            }

            private void ShowImage(object source, FileSystemEventArgs e)
            {
                string s1 = Environment.ExpandEnvironmentVariables("%windir%\\system32\\rundll32.exe ");
                string s2 = Environment.ExpandEnvironmentVariables("%windir%\\system32\\shimgvw.dll,ImageView_Fullscreen " + e.FullPath);
                Process.Start(s1, s2);
                Console.WriteLine("{0} : Image \"{1}\" at {2:t}.", e.ChangeType, e.FullPath, DateTime.Now);
            }
        }

    If you don't have a tried and true solution, a simple push in the right direction would be just as valuable. And FYI, this will be running on a 64-bit Windows 7 machine. Thanks!

    Read the article

  • wxPython GUI and multiprocessing - how to send data back from the long-running process

    - by wxpydon
    Hello everyone, I'm trying to run a time-consuming task from a wxPython GUI. The basic idea is to start the long-running task from the GUI (by pressing a button) and then update a static text on the dialog from it. First I tried threading (http://wiki.wxpython.org/LongRunningTasks and many other resources) and sending the messages back using the Publisher class. It didn't go so well; after a message or two, the GUI seems to freeze. Now I want to achieve this with multiprocessing. I have this method inside my 'GUI' class:

        def do_update(self, e):
            self.txt_updatemsg.SetLabel("Don't stop this \n")
            ...
            pub = Publisher()  # i tried also calling directly from dbob object;
                               # Publisher() is a singleton so this must be useless?
            pub.subscribe(self.__update_txt_message, ('updatedlg', 'message'))
            dbob = dbutils.DBUtils()  # DBUtils is the class with 'long time' tasks
            dbob.publisher = pub
            p = Process(target=self.do_update_process, args=(dbob,))
            p.start()
            while p.is_alive():
                wx.Yield()

        def do_update_process(self, dbob):
            dbob.do_update()

    __update_txt_message is a simple function that sets the static text on the dialog. The question is: how can I send some text messages back from this Process? (Just simple text, that's all I need.) Thanks guys!
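
    One pattern commonly used for this, sketched here on the assumption that the worker only needs to send plain strings and that the Phoenix-style wxPython API is available, is to hand the child process a multiprocessing.Queue and poll it from the GUI with a wx.Timer, so all widget updates happen on the main thread:

        import multiprocessing
        import wx

        def worker(queue):
            # Runs in the child process; only plain picklable data goes on the queue.
            for step in range(5):
                queue.put("finished step %d" % step)
            queue.put(None)  # sentinel: tell the GUI we are done

        class UpdateDialog(wx.Dialog):
            def __init__(self, parent):
                super().__init__(parent, title="Update")
                self.txt = wx.StaticText(self, label="idle")
                self.queue = multiprocessing.Queue()
                self.timer = wx.Timer(self)
                self.Bind(wx.EVT_TIMER, self.on_timer, self.timer)

            def start_work(self):
                # On Windows, remember the usual if __name__ == "__main__" guard
                # around the code that creates the process.
                self.proc = multiprocessing.Process(target=worker, args=(self.queue,))
                self.proc.start()
                self.timer.Start(100)  # poll the queue every 100 ms

            def on_timer(self, event):
                while not self.queue.empty():
                    msg = self.queue.get()
                    if msg is None:
                        self.timer.Stop()
                        self.proc.join()
                        self.txt.SetLabel("done")
                        return
                    self.txt.SetLabel(msg)  # safe: we are on the main GUI thread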

    Read the article

  • super light software development process

    - by Walty
    Hi, most of the development projects I have been involved in so far have had teams of a SINGLE member, or occasionally two. We used Python + Django for the major development; the development process is actually very fast, and we do have code reviews, design pattern discussions, and constant refactoring. Though the team size is small, I do think there are some development processes / best practices that could be enforced. For example, using SVN is definitely better than regular copy backups. I have read some articles and books about Agile, XP and continuous integration. I think they are nice, but still too heavy for this case (a team of 1 or 2, and fast coding). For example, IMHO, with good design patterns and iterative development + refactoring, TDD might be overkill, or at least the overhead does not outweigh the advantages, and the same goes for pair programming. Automated testing is a nice idea, but it does not seem technically feasible for every project. Our current practices are: SVN + milestones + code review. I wonder if there are development processes / best practices specifically targeted at such super-light teams? Thanks.

    Read the article

  • How to allocate memory in another process on Windows Mobile

    - by Serafeim
    I'd like to read the contents of another process's ListView control on Windows Mobile. To do this, I need a pointer to some free memory in that process so I can put the values there (and then read them from my own process). On normal Windows or Win32 this can be done with the VirtualAllocEx function. However, this function is not supported on Windows Mobile! Can you recommend a way to allocate that memory?

    Read the article

  • Variable modification in a child process

    - by teaLeef
    I am working on Bryant and O'Hallaron's Computer Systems: A Programmer's Perspective. Exercise 8.16 asks for the output of a program like the following (I changed it because they use a header file you can download from their website):

        #include <stdio.h>
        #include <stdlib.h>
        #include <sys/types.h>
        #include <sys/wait.h>
        #include <errno.h>
        #include <unistd.h>
        #include <string.h>

        int counter = 1;

        int main()
        {
            if (fork() == 0) {
                counter--;
                exit(0);
            }
            else {
                wait(NULL);
                printf("counter = %d\n", ++counter);
            }
            exit(0);
        }

    I answered "counter = 1" because the parent process waits for its child to terminate and then increments counter, but the child first decrements it. However, when I tested the program, I found that the correct answer was "counter = 2". Is the variable "counter" different in the child and in the parent process? If not, then why is the answer 2?
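
    The answer hinges on the fact that fork() gives the child its own copy of the parent's address space, so the child's counter-- never touches the parent's counter. The same behaviour can be demonstrated in a short Python analogue (os.fork, so Unix-like systems only), included here as an editorial illustration:

        import os

        counter = 1

        pid = os.fork()
        if pid == 0:
            # Child: decrements its own copy of counter, then exits.
            counter -= 1
            os._exit(0)
        else:
            # Parent: waits for the child, then increments its own copy.
            os.waitpid(pid, 0)
            counter += 1
            print("counter =", counter)  # prints 2, just like the C version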

    Read the article

  • Memory management in Objective-C - returning objects from methods

    - by geeth
    Hi, please clarify how to deal with objects returned from methods. Below, I get employee details from the GeEmployeetData method, which autoreleases its result.

    1. Do I have to retain the returned object in the Process method?
    2. Can I release *emp in the Process method?

        - (void)Process {
            Employee *emp = [self GeEmployeetData];
        }

        + (Employee *)GeEmployeetData {
            Employee *emp = [[Employee alloc] init];
            // fill entity
            return [emp autorelease];
        }

    Read the article

  • Understanding Process Scheduling in Oracle Solaris

    - by rickramsey
    The process scheduler in the Oracle Solaris kernel allocates CPU resources to processes. By default, the scheduler tries to give every process relatively equal access to the available CPUs. However, you might want to specify that certain processes be given more resources than others. That's where classes come in. A process class defines a scheduling policy for a set of processes. These three resources will help you understand and manage process classes:

    - Blog: Overview of Process Scheduling Classes in the Oracle Solaris Kernel, by Brian Bream. Timesharing, interactive, fair-share scheduler, fixed priority, system, and real time. What are these? Scheduling classes in the Solaris kernel. Brian Bream describes them and how the kernel manages them through context switching.
    - Blog: Process Scheduling at the Thread Level, by Brian Bream. The Fair Share Scheduler allows you to dispatch processes not just to a particular CPU, but to CPU threads. Brian Bream explains how to use it and provides examples.
    - Docs: Overview of the Fair Share Scheduler, by the Oracle Solaris Documentation Team. This official Oracle Solaris documentation set provides the nitty-gritty details for setting up classes and managing your processes. It covers: Introduction to the Scheduler, CPU Share Definition, CPU Shares and Process State, CPU Share Versus Utilization, CPU Share Examples, FSS Configuration, FSS and Processor Sets, Combining FSS With Other Scheduling Classes, Setting the Scheduling Class for the System, Scheduling Class on a System with Zones Installed, and Commands Used With FSS.

    -Rick

    Read the article

  • How To Attach Visual Studio 2010 To IIS Process Running On Windows 7

    - by Gopinath
    Debugging an ASP.NET application hosted on IIS 7 running on Windows 7 using Visual Studio 2010 is a bit different from debugging applications hosted on IIS running on Windows XP. The key differences are that Visual Studio 2010 demands administrator mode, and IIS runs under the process name w3wp.exe instead of aspnet_wp.exe. Here are the detailed steps to attach the Visual Studio 2010 debugger to IIS 7 on Windows 7:

    1. Launch Visual Studio 2010 in administrator mode (right-click the Visual Studio icon and choose Run as Administrator).
    2. Open the source code of the site you want to debug.
    3. Go to Tools -> Attach to Process; this opens the Attach to Process dialog.
    4. In the Attach to Process dialog box, check the option Show processes in all sessions.
    5. Search for the process w3wp.exe and click the Attach button.
    6. Accept the warning messages.

    That's it, you are done. Visual Studio is now attached to IIS for debugging. Happy coding. This article, How To Attach Visual Studio 2010 To IIS Process Running On Windows 7, was originally published at Tech Dreams.

    Read the article

  • Benefits and features of different requirements-management systems and tools available?

    - by Gnark
    I am looking for a good comparison of the different professional requirements-management tools available. I am especially interested in the features offered by the different software solutions. In addition to the "obvious" features, I am looking for a professional requirements management system that supports:

    - multi-lingual use
    - customizable generation of documentation & history (graphs)
    - search features (e.g. full text for comments), ordering, priorities
    - version history
    - bi-directional traceability of changes, artefacts, requirements, changes in requirements, etc.

    Any kind of integration with V-Model XT would be a really nice-to-have feature. Besides that, I'd like to hear any personal recommendations and/or experiences with different requirements management systems. Any input is highly appreciated. Content consulted: a similar question, a reqm tool with V-Model (nice, but too old), a paper (PDF), Tools Journal.

    Read the article

  • How to convince management to deal with technical debt?

    - by Desolate Planet
    This is a question that I often ask myself when working with developers. I've worked at four companies so far, and I've noticed a lack of attention to keeping code clean and dealing with technical debt that hinders future progress in a software application. For example, the first company I worked for had written a database from scratch rather than use something like MySQL, and that created hell for the team when refactoring or extending the app. I've always tried to be honest and clear with my manager when he discusses projections, but management doesn't seem interested in fixing what's already there, and it's horrible to see the impact that has on team morale and on their attitude towards others. What are your thoughts on the best way to tackle this problem? What I've seen is people packing up and leaving, and the company becomes a revolving door with developers coming in and out and making the code worse. How do you communicate this to management to get them interested in sorting out technical debt?

    Read the article
