Search Results

Search found 23423 results on 937 pages for 'desktop management'.


  • I've inherited 200K lines of spaghetti code -- what now?

    - by kmote
    I hope this isn't too general a question; I could really use some seasoned advice. I am newly employed as the sole "SW Engineer" in a fairly small shop of scientists who have spent the last 10-20 years cobbling together a vast code base. (It was written in a virtually obsolete language: G2 -- think Pascal with graphics). The program itself is a physical model of a complex chemical processing plant; the team that wrote it has incredibly deep domain knowledge but little or no formal training in programming fundamentals. They've recently learned some hard lessons about the consequences of non-existent configuration management. Their maintenance efforts are also greatly hampered by the vast accumulation of undocumented "sludge" in the code itself. I will spare you the "politics" of the situation (there's always politics!), but suffice it to say, there is no consensus about what is needed for the path ahead. They have asked me to begin presenting to the team some of the principles of modern software development. They want me to introduce some of the industry-standard practices and strategies regarding coding conventions, lifecycle management, high-level design patterns, and source control. Frankly, it's a fairly daunting task and I'm not sure where to begin. Initially, I'm inclined to tutor them in some of the central concepts of The Pragmatic Programmer, or Fowler's Refactoring ("Code Smells", etc). I also hope to introduce a number of Agile methodologies. But ultimately, to be effective, I think I'm going to need to home in on 5-7 core fundamentals; in other words, what are the most important principles or practices that they can realistically start implementing that will give them the most "bang for the buck". So that's my question: What would you include in your list of the most effective strategies to help straighten out the spaghetti (and prevent it in the future)?

    Read the article

  • Project Management Helps AmeriCares Deliver International Aid

    - by Sylvie MacKenzie, PMP
    Excerpt from Profit (Oracle), by Alison Weiss. Handle with Care: Sound project management helps AmeriCares bring international aid to those in need. The stakes are always high for AmeriCares. On a mission to restore health and save lives during times of disaster, the nonprofit international relief and humanitarian aid organization delivers donated medicines, medical supplies, and humanitarian aid to people in the U.S. and around the globe. Founded in 1982 with the express mission of responding as quickly and efficiently as possible to help people in need, the Stamford, Connecticut-based AmeriCares has delivered more than US$10.5 billion in aid to 147 countries over the past three decades. “It’s critically important to us that we steward all the donations and that the medical supplies and medicines get to people as quickly as possible with no loss,” says Kate Sears, senior vice president for finance and technology at AmeriCares. “Whether we’re shipping IV solutions to victims of cholera in Haiti or antibiotics to Somali famine victims, we need to get the medicines there sooner because it means more people will be helped and lives improved or even saved.” Ten years ago, the tracking systems used by AmeriCares associates were paper-based. In recent years, staff started using spreadsheets, but the tracking processes were not standardized between teams. “Every team was tracking completely different information,” says Megan McDermott, senior associate, Sub-Saharan Africa partnerships, at AmeriCares. “It was just a few key things. For example, we tracked the date a shipment was supposed to arrive and the date we got reports from our partner that a hospital received aid on their end.” While the data was accurate, much detail was being lost in the process. AmeriCares management knew it could do a better job of tracking this enterprise data and in 2011 took a significant step by implementing Oracle’s Primavera P6 Professional Project Management. “It’s a comprehensive solution that has helped us improve the monitoring and controlling processes. It has allowed us to do our distribution better,” says Sears. In addition, the implementation effort has been a change agent, helping AmeriCares leadership rethink project management across the entire organization. Initially, much of the focus was on standardizing processes, but staff members also learned the importance of thinking proactively to prevent possible problems and evaluating results to determine if goals and objectives are truly being met. Such data about process efficiency and overall results is critical not only to AmeriCares staff but also to the donors supporting the organization’s life-saving missions. Efficiency Saves Lives One of AmeriCares’ core operations is to gather product donations from the private sector, establish where the most-urgent needs are, and solicit monetary support to send the aid via ocean cargo or airlift to welfare- and health-oriented nongovernmental organizations, hospitals, health networks, and government ministries based in areas in need. In 2011 alone, AmeriCares sent more than 3,500 shipments to 95 countries in response to both ongoing humanitarian needs and more than two dozen emergencies, including deadly tornadoes and storms in the U.S. and the devastating tsunami in Japan. When it comes to nonprofits in general, donors want to know that the charitable organizations they support are using funds wisely. 
Typically, nonprofits are evaluated by donors in terms of efficiency, an area where AmeriCares has an excellent reputation: 98 percent of expenses go directly to supporting programs and less than 2 percent represent administrative and fundraising costs. Donors, however, should look at more than simple efficiency, says Peter York, senior partner and chief research and learning officer at TCC Group, a nonprofit consultancy headquartered in New York, New York. They should also look at whether organizations have the systems in place to sustain their missions and continue to thrive. An expert on nonprofit organizational management, York has spent years studying sustainable charitable organizations. He defines them as nonprofits that are able to achieve the ongoing financial support to stay relevant and continue doing core mission work. In his analysis of well over 2,500 larger nonprofits, York has found that many are not sustaining, and are actually scaling back in size. “One of the biggest challenges of nonprofit sustainability is the general public’s perception that every dollar donated has to go only to the delivery of service,” says York. “What our data shows is that there are some fundamental capacities that have to be there in order for organizations to sustain and grow.” York’s research highlights the importance of data-driven leadership at successful nonprofits. “You’ve got to have the tools, the systems, and the technologies to get objective information on what you do, the people you serve, and the results you’re achieving,” says York. “If leaders don’t have the knowledge and the data, they can’t make the strategic decisions about programs to take organizations to the next level.” Historically, AmeriCares associates have used time-tested and cost-effective strategies to ship and then track supplies from donation to delivery to their destinations in designated time frames. When disaster strikes, AmeriCares ships by air and generally pulls out all the stops to deliver the most urgently needed aid within the first few days and weeks. Then, as situations stabilize, AmeriCares turns to delivering sea containers for the postemergency and ongoing aid so often needed over the long term. According to McDermott, getting a shipment out the door is fairly complicated, requiring as many as five different AmeriCares teams collaborating together. The entire process can take months—from when products are received in the warehouse and deciding which recipients to allocate supplies to, to getting customs and governmental approvals in place, actually shipping products, and finally ensuring that the products are received in-country. Delivering that aid is no small affair. “Our volume exceeds half a billion dollars a year worth of donated medicines and medical supplies, so it’s a sizable logistical operation to bring these products in and get them out to the right place quickly to have the most impact,” says Sears. “We really pride ourselves on our controls and efficiencies.” Adding to that complexity is the fact that the longer it takes to deliver aid, the more dire the human need can be. Any time AmeriCares associates can shave off the complicated aid delivery process can translate into lives saved. “It’s really being able to track information consistently that will help us to see where are the bottlenecks and where can we work on improving our processes,” says McDermott. 
Setting a Standard Productivity and information management improvements were key objectives for AmeriCares when staff began the process of implementing Oracle’s Primavera solution. But before configuring the software, the staff needed to take the time to analyze the systems already in place. According to Greg Loop, manager of database systems at AmeriCares, the organization received guidance from several consultants, including Rich D’Addario, consulting project manager in the Primavera Global Business Unit at Oracle, who was instrumental in shepherding the critical requirements-gathering phase. D’Addario encouraged staff to begin documenting shipping processes by considering the order in which activities occur and which ones are dependent on others to get accomplished. This exercise helped everyone realize that to be more efficient, they needed to keep track of shipments in a more standard way. “The staff didn’t recognize formal project management methodology,” says D’Addario. “But they did understand what the most important things are and that if they go wrong, an entire project can go off course.” Before, if a boatload of supplies was being sent to Haiti and there was a problem somewhere, a lot of time was taken up finding out where the problem was—because staff was not tracking things in a standard way. As a result, even more time was needed to find possible solutions to the problem and alert recipients that the aid might be delayed. “For everyone to put on the project manager hat and standardize the way every single thing is done means that now the whole organization is on the same page as to what needs to occur from the time a hurricane hits Haiti and when a boat pulls in to unload supplies,” says D’Addario. With so much care taken to put a process foundation firmly in place, configuring the Primavera solution was actually quite simple. Specific templates were set up for different types of shipments, and dashboards were implemented to provide executives with clear overviews of every project in the system. AmeriCares’ Loop reports that system planning, refining, and testing, followed by writing up documentation and training, took approximately four months. The system went live in spring 2011 at AmeriCares’ Connecticut headquarters. While the nonprofit has an international presence, with warehouses in Europe and offices in Haiti, India, Japan, and Sri Lanka, most donated medicines come from U.S. entities and are shipped from the U.S. out to the rest of the world. In addition, all shipments are tracked from the U.S. office. AmeriCares doesn’t expect the Primavera system to take months off the shipping time, especially for sea containers. However, any time saved is still important because it will allow aid to be delivered to people more quickly at a lower overall cost. “If we can trim a day or two here or there, that can translate into lives that we’re saving, especially in emergency situations,” says Sears. A Cultural Change Beyond the measurable benefits that come with IT-driven process improvement, AmeriCares management is seeing a change in culture as a result of the Primavera project. One change has been treating every shipment of aid as a project, and everyone involved with facilitating shipments as a project manager. “This is a revolutionary concept for us,” says McDermott. 
“Before, we were used to thinking we were doing logistics—getting a container from point A to point B without looking at it as one project and really understanding what it meant to manage it.” AmeriCares staff is also happy to report that collaboration within the organization is much more efficient. When someone creates a shipment in the Primavera system, the same shared template is used, which means anyone can log in to the system to see the status of a shipment. Knowledgeable staff can access a shipment project to help troubleshoot a problem. Management can easily check the status of projects across the organization. “Dashboards are really useful,” says McDermott. “Instead of going into the details of each project, you can just see the high-level real-time information at a glance.” The new system is helping team members focus on proactively managing shipments rather than simply reacting when problems occur. For example, when a container is shipped, documents must be included for customs clearance. Now, the shipping template has built-in reminders to prompt team members to ask for copies of these documents from freight forwarders and to follow up with partners to discover if a shipment is on time. In the past, staff may not have worked on securing these documents until they’d been notified a shipment had arrived in-country. Another benefit of capturing and adopting best practices within the Primavera system is that staff training is easier. “Capturing the processes in documented steps and milestones allows us to teach new staff members how to do their jobs faster,” says Sears. “It provides them with the knowledge of their predecessors so they don’t have to keep reinventing the wheel.” With the Primavera system already generating positive results, management is eager to take advantage of advanced capabilities. Loop is working on integrating the company’s proprietary inventory management system with the Primavera system so that when logistics or warehousing operators input data, the information will automatically go into the Primavera system. In the past, this information had to be manually keyed into spreadsheets, often leading to errors. Mining Historical Data Another feature on the horizon for AmeriCares is utilizing Primavera P6 Professional Project Management reporting capabilities. As the system begins to include more historical data, management soon will be able to draw on this information to conduct analysis that has not been possible before and create customized reports. For example, at the beginning of the shipment process, staff will be able to use historical data to more accurately estimate how long the approval process should take for a particular country. This could help ensure that food and medicine with limited shelf lives do not get stuck in customs or used beyond their expiration dates. The historical data in the Primavera system will also help AmeriCares with better planning year to year. The nonprofit’s staff has always put together a plan at the beginning of the year, but this has been very challenging simply because it is impossible to predict disasters. Now, management will be able to look at historical data and see trends and statistics as they set current objectives and prepare for future need. In addition, this historical data will provide AmeriCares management with the ability to review year-end data and compare actual project results with goals set at the beginning of the year—to see if desired outcomes were achieved and if there are areas that need improvement. 
It’s this type of information that is so valuable to donors. And, according to York, project management software can play a critical role in generating the data to help nonprofits sustain and grow. “It is important to invest in systems to help replicate, expand, and deliver services,” says York. “Project management software can help because it encourages nonprofits to examine program or service changes and how to manage moving forward.” Sears believes that AmeriCares donors will support the return on investment the organization will achieve with the Primavera solution. “It won’t be financial returns, but rather how many more people we can help for a given dollar or how much more quickly we can respond to a need,” says Sears. “I think donors are receptive to such arguments.” And for AmeriCares, it is all about the future and increasing results. The project management environment currently may be quite simple, but IT staff plans to expand the complexity and functionality as the organization grows in its knowledge of project management and the goals it wants to achieve. “As we use the system over time, we’ll continue to refine our best practices and accumulate more data,” says Sears. “It will advance our ability to make better data-driven decisions.”

    Read the article

  • Desktop Applications Versus Web Applications

    Up until the advent of the internet, programmers really only developed one type of application used by end-users. This type of application was called a desktop application. As the name implies, these applications ran strictly on a desktop computer and were limited by the resources available to that computer. Initially, this type of application did not need resources outside the scope of the computer on which it was installed. The problem with this type of application is that if multiple end-users need to access the same desktop application, then the application must be installed on each end-user’s computer. In this age of software development, security was not as big a concern as it is today with other types of applications. This is primarily due to the fact that an end-user must have access to the computer where the software is installed in order to access the application. In addition, developers could also password-protect the application in case an unauthorized end-user was able to gain access to the computer. With the birth of the internet, a second form of application emerged because developers were trying to solve inherent issues with the preexisting desktop application. One of the solutions to overcome some of the shortcomings of desktop applications is the web application. Web applications are hosted on a centralized server, and clients only need network access and a web browser in order to access the application. Because a web application can be installed on a remote server, it removes the need for individual installations of the same application on each end-user’s computer. The main benefit of an application being hosted on a server is increased accessibility, because nothing has to be installed on a desktop computer for an end-user to be able to access the application. In addition, web applications are much easier to maintain because any change to the application is applied on the server and is inherently applied to any end-user trying to use the application. This removes the time needed to install and maintain individual installations of a desktop application. However, with the increased accessibility there are additional costs incurred compared to a desktop application, because of the additional cost and maintenance of a server hosting the application. Typically, after a desktop application is purchased there are no additional recurring fees associated with the application. When developing a web-based application there are additional considerations that must be addressed compared to a desktop application. The added benefit of increased accessibility also adds a new failure point when trying to gain access to an application: an end-user now must have network connectivity in order to access the application. This issue is not a concern for desktop applications because their resources are typically bound to the computer on which they run. Since the availability of an application is increased with the use of the client-server model in a web-based application, additional security concerns now come into play. As stated before, a desktop application is bound by the end-user’s access to the computer on which the application is installed. This is not the case with web-based applications because they could potentially be accessed from anywhere with the proper internet/network connection. Additional security steps are required to ensure the integrity of the application and its data. 
    Examples of these steps include, but are not limited to, the following:
    - Restricted/Password Areas – This form of security is used when specific information can only be accessed by end-users based on a set of accessibility rules.
    - IP Restrictions – This form of security is used when only specific locations need to access an application. It is applied from within the web server or a firewall.
    - Network Restrictions (Firewalls) – This form of security is used to contain access to an application within a specific subset of a network.
    - Data Encryption – This form of security is used to transform personally identifiable information into something unreadable so that it can be stored for future use.
    - Encrypted Protocols (HTTPS) – This form of security is used to prevent others from reading messages being sent between applications over a network.
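    As a concrete illustration of the data-encryption step above, here is a minimal Python sketch using the third-party cryptography library's Fernet recipe for symmetric encryption. The field name and value are hypothetical, and a real web application would also need secure key storage; this is a sketch of the idea, not a complete design.

        # Minimal sketch: encrypt a piece of personally identifiable information
        # before storing it, and decrypt it later. Assumes `pip install cryptography`.
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()        # in practice, load this from a secure key store
        cipher = Fernet(key)

        ssn_plaintext = b"123-45-6789"                   # hypothetical PII value
        ssn_encrypted = cipher.encrypt(ssn_plaintext)    # unreadable token, safe to persist
        print(cipher.decrypt(ssn_encrypted))             # b'123-45-6789'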

    Read the article

  • Centralized Windows/Mac Patch Management that is easy to use

    - by BiggsTRC
    I'm looking for advice on which patch management solutions you would recommend based upon your experience. I'm also looking for which ones you would not recommend based upon your experience. We have a mixed network of Windows and Mac clients. Our central servers are all Windows servers, although I have considered putting in a Mac server to better handle our Mac clients. The issue we are facing currently is that we need to maintain the patches on all of our third-party applications. Right now we use WSUS, which handles patching of Windows and some Microsoft products, but that is about it. I need something to cover the other applications, specifically things like Adobe products (Reader, Flash, Dreamweaver, etc.). Our network isn't that big (maybe 200 clients) and I don't have a person to dedicate just to patching and maintaining a patch management solution. Thus very large and complicated solutions like System Center are most likely out. I have recently been looking at Dell's Kace K1000 solution (http://www.kace.com/products/systems-management-appliance/). It seems simple and it provides a lot of tools in one package that I would like/need as well. I like the fact that it is self-contained in an appliance and that it is designed for situations like mine. However, I'm not sure if this is the best solution. I've also looked a bit at Shavlik's Netchk solution (http://www.shavlik.com/netchk-protect.aspx), but I don't need an anti-virus product. However, it looks like they might have a very good patch database. My question is this: What are your thoughts on these two products? Are there better products out there? Are there issues that I'm not considering? I want something that is very good at patching a broad range of products, that is simple to use, that takes a minimal amount of management (like WSUS), and that (hopefully) works with Mac and Windows.

    Read the article

  • Remote Desktop disconnects after reaching "Estimating connection quality..."

    - by Sam Pearson
    I'm connecting to a Windows 8 machine from a Windows 7 machine. When I try to RDP in to the machine, it prompts me for my credentials, then zooms through the process of connecting until it reaches "Estimating connection quality." After a few seconds, it disconnects without giving any message whatsoever and returns me to the Remote Desktop Connection connect window. No error message, no popups, nothing. It just silently fails to connect after reaching "Estimating connection quality." How do I solve this issue?

    Read the article

  • "this network location can't be included because it is not indexed" on Windows 2008R2 Remote Desktop

    - by crgnz
    I'm setting up a new terminal server for our users on Win2008R2 (I guess I should call it Remote Desktop Services now!). When I try to change the location of "Documents" (by removing the default Documents library and adding a new one) to use the file server, i.e. \\fileserver\username\Documents, I get the message: "This network location can't be included because it is not indexed" I certainly don't want to make folders available offline, and in fact, I have set the GPO to prohibit offline folders on the terminal servers. What is the best practice for document libraries on terminal servers and network file shares?

    Read the article

  • Cannot access local resource (C drive) on remote desktop

    - by Robert Massa
    I've recently upgraded my client PC to Windows 7, and ever since I can't get local resource sharing for Remote Desktop to work. I'm connecting to a 2003 server which isn't in my current domain. All my optical and virtual drives are being shared, but the C drive stays hidden. I checked the options, and I do indicate that I want to share my C drive. Is there any permission I should change for this to work? The server is configured correctly, because when connecting from an XP client this problem doesn't occur. I've tried accessing the share directly by opening the \\tsclient\c path, but this doesn't work either; \\tsclient only shows the other drives. Copy and paste doesn't seem to work either (I tried restarting rdpclip to no avail); I get "Cannot copy file File.dat, the device is not connected."

    Read the article

  • Bring a Touch of the Wild West to Your Desktop with the Rango Theme for Windows 7

    - by Asian Angel
    Rango the chameleon has his hands full when he becomes the new sheriff in an Old West town called Dirt. Now you can bring his adventures to your desktop with this new theme from Microsoft. The theme comes with seven wallpapers featuring Rango, his new friends, and others he meets along the way. Download the Rango Windows 7 Theme [Windows 7 Personalization Gallery]

    Read the article

  • Problem closing MDI child window in Terminal Services/Remote Desktop Connection 7.0

    - by Justin Love
    I have one user whose computer just got updated to the 7.0 Remote Desktop Connection. Concurrently, she has started having a problem closing the MDI child windows in an old FoxPro application running on the remote server. We have two different servers, both 2003, running the same application, one locally and one at a remote office. Only the remote office server is giving trouble. It works fine for me, even when logging into her TS account. No other users have complained. The other day the same user experienced an error message (path not found, for a path showing a localization placeholder) when starting the RDC, which was fixed by a reboot. I suspect she may have had RDC running during the 7.0 upgrade.

    Read the article

  • Restrict VPN user to Remote Desktop only with Sonicwall

    - by Matt
    Basically, I want this user to only be able to log onto the VPN in order to use Remote Desktop to access HIS machine. Not surf the internet or do anything like that, but just use the programs on his machine that he doesn't have at home. We use a SonicWall NSA 220 with their regular VPN client. I can create a user for him, but when I create an access rule it applies to all VPN users. How can I make something like that apply to only ONE user?

    Read the article

  • Remote Desktop Client Crashes following domain join

    - by Roberto Charlie Ciarleglio
    I recently joined my laptop to our Windows domain, and now the Remote Desktop client crashes when I try to connect to any machine. It works if I run it as administrator, but not ordinarily. The domain join migrated my local profile to the domain profile, which I think is where the problem lies. I'm guessing it's a permissions thing, as I had a similar problem with Dropbox and had to delete reg keys and reinstall. I can't figure out how to fix this problem, though. The event viewer shows this:
        Faulting application name: mstsc.exe, version: 6.1.7601.17514, time stamp: 0x4ce7ab44
        Faulting module name: FACredProv2.dll, version: 2.4.95.1, time stamp: 0x4bb8d766
        Exception code: 0xc0000005
        Fault offset: 0x00000000000025b2
        Faulting process id: 0xb24
        Faulting application start time: 0x01cd43fbd3a81fba
        Faulting application path: C:\Windows\System32\mstsc.exe
        Faulting module path: C:\Windows\System32\FACredProv2.dll
        Report Id: 154ee55a-afef-11e1-a443-b8ac6f704c5d
    Any help would be appreciated!
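    A hedged starting point for narrowing this down: the faulting module FACredProv2.dll is a third-party credential provider, and credential providers are registered under a standard Windows registry key (HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Authentication\Credential Providers - general Windows knowledge, not something stated in the question). The Python sketch below, using the standard winreg module, only lists what is registered so you can confirm which entry points at that DLL before deciding whether to update or remove the product that installed it.

        import winreg

        # List registered credential providers (GUID subkey and default value),
        # to help spot the third-party provider implicated in the mstsc.exe crash.
        PATH = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Authentication\Credential Providers"

        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, PATH) as root:
            subkey_count = winreg.QueryInfoKey(root)[0]     # number of provider subkeys
            for i in range(subkey_count):
                guid = winreg.EnumKey(root, i)
                with winreg.OpenKey(root, guid) as sub:
                    try:
                        name, _ = winreg.QueryValueEx(sub, "")   # default value = provider name
                    except OSError:
                        name = "(no default value)"
                print(guid, "->", name)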

    Read the article

  • SQL SERVER – Another lesser known feature of SQL Server Management Studio 2012 – Guest Post by Balmukund Lakhani

    - by Pinal Dave
    This is a fantastic blog post from my dear friend Balmukund ( blog | twitter | facebook ). He presented a fantastic session at our last UG and there were lots of requests from attendees that he blog about it. Well, here is the blog post about that very popular UG session. Let us read the entire blog post in the voice of Balmukund himself. In one of my previous guest blogs on SQL Authority, I wrote about the "Additional Connection Parameter" tab of the login screen in SQL Server Management Studio (a.k.a. SSMS). Along similar lines, this blog is going to show a lesser-known new feature of the main login screen ("Connect to Server") of SSMS 2012. You might have seen the screen below countless times, and you might wonder what there is to blog about in such a simple screen. Well, continue reading and you will get the answer. Many times, DBAs have to log in to a production server from a non-regular machine, maybe a developer's workstation. You log in to SQL, do your work, and close Management Studio. Do you know that your server name is saved by Management Studio? Of course, it's a very useful feature, because you may not like to type the server name/IP address every time. Whatever servers you have connected to are stored by Management Studio. But sometimes it's annoying! What would you do if you want SQL Server Management Studio to forget "all" the servers listed in the Server name drop-down? To do that, you need to know how and where the list is stored. You can use one of my favorite tools from Sysinternals, called Process Monitor (also known as ProcMon), and easily figure out that it is stored in a file under your Windows user profile. Below is the file for SQL 2008 R2 Management Studio:
    %appdata%\Microsoft\Microsoft SQL Server\100\Tools\Shell\SqlStudio.bin
    For SQL Server 2012, here is what we can see in ProcMon, so the path is:
    %appdata%\Microsoft\Microsoft SQL Server\110\Tools\Shell\SqlStudio.bin
    So far, you might wonder, where is the new feature? I have been asked by many users how to delete entries from the SSMS "Connect to Server" server name list. Well, unofficially, you can directly delete the file which we found via ProcMon. Note that deleting the file to get rid of the server list is not officially supported by Microsoft. A better way to achieve this is provided in SSMS 2012. To delete servers from the list, highlight the name you want to delete (via keyboard or mouse) and then press the Delete key. Multi-select is not possible; it has to be done one by one. We can delete as many entries as we want. I have deleted a few from the first screenshot taken, and here is the modified version. This is not available in SQL 2008 R2 and its previous versions. This came from feedback given to the SQL Server product group. Hope you have learned something new today! Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology
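    For the SQL 2008 R2 case, where the in-place delete described above is not available, here is a minimal Python sketch of the unofficial file-deletion approach from the post. The path is the one found with ProcMon above; as noted, deleting the file is not officially supported, and SSMS should be closed first.

        import os

        # Unofficial approach for SSMS 2008 R2: remove SqlStudio.bin so the
        # "Connect to Server" server-name list is forgotten. SSMS recreates
        # the file on its next launch. Close SSMS before running this.
        path = os.path.expandvars(
            r"%APPDATA%\Microsoft\Microsoft SQL Server\100\Tools\Shell\SqlStudio.bin"
        )
        if os.path.isfile(path):
            os.remove(path)
            print("Deleted:", path)
        else:
            print("Nothing to delete at:", path)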

    Read the article

  • Remote Desktop or Streaming Software/Services that Supports Gaming

    - by Griffin
    I've simply been amazed by the quality and speed of OnLive, as this technology has the potential of making hardware requirements irrelevant to the average user. However, at the moment OnLive is only for remotely controlling video games, and not desktops or other devices in general. I'm in pursuit of software or services that can accomplish this as well as OnLive does. I need:
    - viewer (client) program portability (able to run on a USB stick)
    - DirectX, OpenGL / full-screen game compatibility on the server side
    - gaming-acceptable color/scaling quality and responsiveness
    I have a very powerful desktop at home and I want to be able to access this raw power from any other computer that I stick my USB into (in the same way OnLive gives gamers use of their powerful servers). What software/services have most of the above? NOTE: please specify what features your suggestion doesn't have.

    Read the article

  • Getting started with terminal services (remote desktop services) and thin clients

    - by therulebookman
    I've got a Windows Server 2008 R2 box and I want to make it a RDS server and connect with thin clients. I don't want to do VDI with hyper-v, as this box is already virtualized. RDS is installed and I've installed an RDS CAL. I've browsed the Technet articles, but navigating technet is worthless. Can anyone point me to a concise get-me-started guide to terminal services/remote desktop services? If I can just get aimed in the right direction I can probably figure it out myself. Thanks.

    Read the article

  • Remote Desktop connection repeatedly aborting

    - by DerKlaus
    I connect to my workplace computer using Remote Desktop. After 1-2 minutes the application freezes, to tell me after one more minute that the connection was aborted. It then reconnects. Everything works again for 1-2 minutes. Then the process repeats. Probably forever. My coworkers do not experience such problems when connecting to the workplace. My workplace computer: Windows 7 32bit. My home computer: Windows 7 64bit, connected to the internet via a WLAN router with integrated ADSL modem (Linksys WAG200G). Things I already tried to fix the problem:
    - disabled the Windows firewall
    - disabled the other firewall
    - reduced the MTU
    - upgraded the firmware on the router
    - configured port-forwarding to forward all packets to my home computer
    The problem remains unchanged. What could be the cause of the connection aborts? What else can I try to fix the connection? Thanks in advance.

    Read the article

  • Outlook 2010 corrupts all Office attachments on a Server 2008R2 Remote Desktop Server

    - by Zhadu
    I have a rather annoying problem with a client's new Remote Desktop server. The problem is that any and all Office attachments (tested with Word and Excel documents) sent to the users via email cannot be opened, due to the file seemingly being corrupted/damaged. I have determined that it is a local problem on the server, as the attachments work fine on my own PC. I also believe the problem is isolated to Outlook, as the users can open already-stored Office files without any issues. What are your thoughts on this? Extra information: The server is running Microsoft Server 2008 R2. Office is version 2010. The server is handling the roles of AD and RDS - the client only has one server, hence the breach of best practice. There is currently no AV software on the server. I have tried running a repair as well as reinstalling Office, but the error is still there.

    Read the article

  • Windows Server 2008 R2 64bit Screen Frozen and Remote Desktop Freezes but Server Continues Working

    - by Jacques
    I've asked this question a couple of times but I don't seem to be getting any real answers. We have an SBS (Windows Server 2008 R2) server, and suddenly the screen has started freezing. Going into the system via Remote Desktop worked once or twice (since the problem started), but now the RDP screen freezes once it gets just past the Welcome screen. The server itself is running: SQL is working, Exchange is working, file share is fine. It's just the UI that isn't working. We've tried hard resetting, and that works for a short while before the problem comes back. Where do we begin to resolve this issue? Thanks, Jacques

    Read the article

  • Remote Desktop settings not being applied for user

    - by Anthony K
    We have a number of Windows 2003 servers for which we have Remote Desktop enabled. Each user has their profile edited so that they can only connect for 2 hours maximum and have 30 minutes idle time, after which they are disconnected and the session closed. On one server, however, the administrator account does not have the maximum session limit working. We can stay connected for days if we want. Originally this was how it was set up, and we later changed the profile for all users so that there are limits. We have rebooted the server a couple of times since, and the Management Console shows the limits. If we are idle for too long we are disconnected. Other users are having all the limits observed. Any suggestions?

    Read the article

  • gray dotted box outlining desktop icons on windows 7

    - by Max
    I occasionally get this problem where, for some reason, small dotted gray boxes appear around my desktop icons. It always goes away after I restart, but I'm just curious what it is. I don't know of anything in particular I do to cause it, but it only outlines icons I click, and it only does it to one icon at a time. The picture below is the box on the Recycle Bin. If I click a different icon it'll happen to that one instead. I'm using Windows 7. Thank you.

    Read the article

  • how to start LXDE session automatically after tightvncserver starts to make me able see desktop when connecting to the host via vncclient?

    - by Oleksandr Dudchenko
    I have a system equipped with an Intel Celeron 1.1 GHz (Socket 370) processor and 384 MB of RAM on an Intel D815EGEW motherboard, which supports the wake-on-LAN function. I want to use this PC for sharing the Internet connection with the local network. This PC is also a DHCP+DNS server as well as the router/gateway. Based on the above, I decided to install Lubuntu, as it is a lightweight system. I installed Lubuntu 10.04.4 LTS from the alternate ISO. The system has no auto login. The system boots and has acceptable performance. The host PC has 4 onboard network adapters:
    - eth0 – ethernet controller used for local network connections; has the static address 10.0.0.1
    - eth1 – ethernet controller not used and not configured so far; I plan to connect a printer here later on
    - eth2 – ethernet controller used to connect to the Internet, which we plan to share with the local network
    - wlan0 – wireless controller used in the role of access point for the local network; has the address 10.0.0.2
    We want to control our gateway remotely, so we need to be able to power it on remotely. To allow this I've done the following things:
        $ cd /etc/init.d/
    Made a new file with the command:
        $ sudo vim wakeonlanconfig
    Wrote the following lines to the newly created file, then saved and closed it:
        #!/bin/bash
        ethtool -s eth0 wol g
        ethtool -s eth2 wol g
        exit
    Made the abovementioned file executable:
        $ sudo chmod a+x wakeonlanconfig
    Then included it in the autostart sequence during boot:
        $ sudo update-rc.d -f wakeonlanconfig defaults
    After a system reboot we will be able to power on the system remotely. Then we need the ability to connect remotely to the host via SSH and VNC, so I installed the following packages with the following commands:
        $ sudo apt-get update
        $ sudo apt-get install openssh-server tightvncserver
    Added the ssh daemon to the autostart sequence during boot:
        $ sudo update-rc.d -f ssh defaults
    Powered off the host PC:
        $ sudo halt
    Then I went to a remote place, sent a magic packet, and powered the host up. The system started, and I connected to the host via PuTTY from a remote Windows system, then logged in and ran the command to start the VNC server:
        $ tightvncserver -geometry 800x600 -depth 16 :2
    The VNC server started successfully and I got a message like the following:
        New 'X' desktop is gateway:2
        Starting applications specified in /home/dolv/.vnc/xstartup
        Log file is /home/dolv/.vnc/gateway:2.log
    Using the UltraVNC Viewer program under Windows I connected to the host's VNC server, entered the password and... saw only a mouse cursor in the form of a cross on a grey background of 800x600 dots, no desktop. Here is my .vnc/xstartup file:
        #!/bin/sh
        xrdb $HOME/.Xresources
        xsetroot -solid grey
        #x-terminal-emulator -geometry 80x24+10+10 -ls -title "$VNCDESKTOP Desktop" &
        #x-window-manager &
        # Fix to make GNOME work
        export XKL_XMODMAP_DISABLE=1
        /etc/X11/Xsession
    The question: What do I have to change, and where, to make an LXDE session start automatically after tightvncserver starts?

    Read the article

  • Differences between Remote Desktop and Terminal services

    - by Uwe
    What is the difference between Remote Desktop and Terminal services? We run a windows 2008 R2 server. There are several administrators who need to access this server. Windows 2008 allows only two concurrent sessions with different users. So I thought of installing terminal services. But I wonder what will happen to the server if I do so? What will be installed additionally? Will there be more features, ports, issues with the server?

    Read the article

  • How to render RSS feed in a desktop RSS reader?

    - by Thiago Moraes
    Consider a feed like this one: http://feeds.feedburner.com/codinghorror It has the entire content inside the description tag of the feed, so you don't need to access the website to read the post. Now I have the problem of creating an interface for a feed like this in a desktop client. What's the best way to render the text in a pleasant way for the user? My first thought was to parse the entire HTML as if I were a web browser, but that looks really hard to do in a satisfying way. Are there any better (faster) alternatives? Rephrasing: how does a desktop RSS client such as FeedDemon parse the input to display it nicely? Does it have a web browser inside it?
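    One way to see what a desktop reader has to work with: the description element already contains HTML, so a client typically extracts it with a feed-parsing library and hands the markup to an embedded browser or rich-HTML widget rather than re-implementing rendering itself. Below is a minimal Python sketch using the third-party feedparser library; the choice of library is an assumption for illustration, not what any particular desktop client actually uses.

        import feedparser

        # Fetch the feed and pull out the HTML body of the newest entry.
        # A desktop client would pass this HTML to an embedded browser control
        # instead of rendering it by hand.
        feed = feedparser.parse("http://feeds.feedburner.com/codinghorror")

        entry = feed.entries[0]
        html_body = entry.get("summary", "")   # feedparser maps the RSS description here

        print(entry.title)
        print(html_body[:200], "...")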

    Read the article

  • Could it be sane to use Windows Server 2012 as desktop

    - by nCdy
    What about using it on the desktop? I've got a strong enough PC, with an Intel Core i7 and 8 GB of RAM, so what should I think about: why not? I was looking for major differences compared to Windows 8 and found few - for example, the new file system: can it affect me? In my usual day I need development tools like Visual Studio, virtualization tools, and some games. So far I can't find anything that must stop me; everything I need can (seemingly) work there. Tell me why I must not do it, or whether it is sane to do.

    Read the article

  • How to Configure a vm on the same machine to do remote desktop [closed]

    - by Varun K
    I want to achieve the following (note: I'd like to get this done first of all with Win7 as both the host and VM OS):
    1. Install a Windows 7/XP/Windows 8 VM on a Windows 7/Windows 8 host machine.
    2. Configure it so that I can connect to it via remote desktop.
    This is because I use screen reader software, and audio output directly from VMs is not highly responsive. My software has a feature where it can connect to its copy on the remote machine (during an RDP session) and then start receiving the text description, which it translates into audio on the client (the host, in this case) machine. I want to know:
    1. Which VM software can let me do this - VMware, MS Virtual PC, or VirtualBox?
    2. If it is possible with every VM software, could you give an example of how to do this with any one of these 3?
    Specifically, I know how to install Windows on a VM (in both VMware and Virtual PC), but I don't really know how to configure a network such that I can remote into that VM from the host OS. Hope this clarifies what I'm trying to achieve.

    Read the article

  • My Music Folder Appearing On Desktop

    - by Michael
    Hello, I use Windows 7 and have recently been encountering a problem with the My Music folder. I keep all my music on an external hard drive and use a third-party program such as VLC to play albums. I also have a large list of single MP3s that I play in WinAmp. Neither of these causes any problem, but whenever I play a song in the bundled Windows Media Player a new My Music shortcut appears on the desktop. WMP is my default player; I've tried changing it to other players, but whenever I use WMP it still creates this shortcut. The My Music folder is empty, as all music is stored on the external hard drive, and I've set WMP to collate, store and retain no information about anything, so I don't think it's anything to do with playlist history. Any ideas what might be happening? Any help in solving this annoyance is appreciated.

    Read the article
