Search Results

Search found 71826 results on 2874 pages for 'master data services'.

  • Running Services.msc as a different User C#

    - by Simon Mark Smith
    I have a requirement to create a simple Windows Forms application that allows an admin user to manage the services on remote servers. We don't want to give the admins the usernames and passwords for the servers, so these will be encrypted and stored in a database. My question is whether it is possible to spawn a Services.msc window while impersonating one of the users stored in the database. I have looked at the ProcessStartInfo class, but because Services.msc is not an executable it does not seem to want to launch it. Any ideas on a simple way of doing the actual impersonation and loading of Services.msc - say, off a button click? Thanks
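
    A minimal sketch of one way to approach this, assuming the password has already been decrypted from the database: Services.msc is an MMC snap-in rather than an executable, so start mmc.exe with the snap-in as an argument and hand the alternate credentials to ProcessStartInfo. The server, domain and account names below are placeholders.

      using System;
      using System.Diagnostics;
      using System.IO;
      using System.Security;

      class ServicesLauncher
      {
          static void LaunchServicesConsole(string user, string domain, string decryptedPassword)
          {
              // ProcessStartInfo.Password requires a SecureString.
              var password = new SecureString();
              foreach (char c in decryptedPassword)
                  password.AppendChar(c);

              string system32 = Environment.GetFolderPath(Environment.SpecialFolder.System);

              var psi = new ProcessStartInfo
              {
                  // Launch mmc.exe and pass it the Services snap-in; /computer= points it
                  // at a remote machine (placeholder name).
                  FileName = Path.Combine(system32, "mmc.exe"),
                  Arguments = "\"" + Path.Combine(system32, "services.msc") + "\" /computer=REMOTESERVER01",
                  UserName = user,
                  Domain = domain,
                  Password = password,
                  UseShellExecute = false   // required when supplying credentials
              };

              Process.Start(psi);
          }
      }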

  • .NET ServiceInstaller takes too much time to uninstall services

    - by rodnower
    We have a Setup Project written in C# in Visual Studio 2008 that installs and uninstalls services with the ServiceInstaller class. Installing the services does not take long, but when I uninstall with the following code the process takes several seconds per service (and we have many services):

      ServiceInstaller si = new ServiceInstaller();
      string path = string.Format("/assemblypath={0}", strServiceExecutablePath);
      string[] cmdline = { path };
      InstallContext context = new InstallContext(string.Empty, cmdline);
      si.Context = context;
      si.ServiceName = strServiceName;
      si.Uninstall(null);

    Does anyone know why? I also have some related questions. What is the difference between uninstalling a service with "InstallUtil /u exePath" and with "sc delete serviceName"? And why, when I delete a service's key from the registry under CurrentControlSet\services, do I still see the service in services.msc, but with "<Failed to read description. Error code: 2>" as its description? From where do I need to delete the service manually to remove it completely? Thanks in advance.

  • Folders in SQL Server Data Tools

    - by jamiet
    Recently I have begun a new project in which I am using SQL Server Data Tools (SSDT) and SQL Server Integration Services (SSIS) 2012. Although I used SSDT and SSIS fairly extensively while SQL Server 2012 was in beta, I usually find that you don't learn about the capabilities and quirks of new products until you use them on a real project, hence I am hoping I'm going to have a lot of experiences to share on my blog over the coming few weeks. In this first such blog post I want to talk about file and folder organisation in SSDT.

    The predecessor to SSDT is Visual Studio Database Projects. When one created a new Visual Studio Database Project, a folder structure was provided with "Schema Objects" and "Scripts" in the root and a series of subfolders for each schema. Apparently a few customers were not too happy with the tool arbitrarily creating lots of folders in Solution Explorer, and hence SSDT has gone in completely the opposite direction; now no folders are created and new objects get created in the root - it is at your discretion where they get moved to.

    After using SSDT for a few weeks I can safely say that I preferred the older way, because I never used Solution Explorer to navigate my schema objects anyway, so it didn't bother me how many folders it created. Having said that, the thought of a single long list of files in Solution Explorer without any folders makes me shudder, so on this project I have been manually creating folders in which to organise files, and I have tried to mimic the old way as much as possible by creating two folders in the root: one for all schema objects and another for pre/post-deployment scripts.

    This works fine until different developers start to build their own different subfolder structures; if you are OCD-inclined like me this is going to grate on you eventually, and hence you are going to want to move stuff around so that you have consistent folder structures for each schema and (if you have multiple databases) each project. Moreover, new files get created with a filename of the object name plus ".sql", and often people like to have an extra identifier in the filename to indicate the object type.

    The overall point is this - files and folders in your solution are going to change. Some version control systems (VCSs) don't take kindly to files being moved around or renamed, because they treat the renamed or moved file simply as a new file, and when they do that you lose the revision history which, to my mind, is one of the key benefits of using a VCS in the first place. On this project we have been using Team Foundation Server (TFS), and while it pains me to say it (as I am no great fan of TFS's version control system) it has proved invaluable when dealing with the SSDT problems that I outlined above, because it is integrated right into the Visual Studio IDE. Thus the advice from this blog post is: if you are using SSDT, consider using a Visual-Studio-integrated VCS that can easily handle file renames and file moves.

    I suspect that fans of other VCSs will counter by saying that their VCS weapon of choice can handle renames and file moves quite satisfactorily, and if that's the case... great... let me know about them in the comments. This blog post is not an attempt to make people use one particular VCS, only to make people aware of this issue that might arise when using SSDT. More to come in the coming few weeks! @jamiet

  • Reporting Services Returning HTTP 401 Unauthorized

    - by Chris Arnold
    I have just ported an existing ASP.NET application to a new web server (Windows Server 2008 R2 and SQL Server 2008). It is successfully running on 4 other servers of varying O/S (which I also set up). My ASP.NET app calls into the Reporting Services web service (ReportExecution2005.asmx) to generate a report and save it as a PDF on the file system. I consistently receive "System.Net.WebException - The request failed with HTTP status 401: Unauthorized." In UTTER desperation I have performed the following:
      - Granted all users complete access to SSRS via the Reports web page.
      - Granted all users 'Full control' to %ProgramFiles%\Microsoft SQL Server\MSRS10.MSSQLSERVER
    I am not a network/server specialist, but I'm the only one who can deal with this and it's driving me batty. Help!
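
    One thing worth checking, shown here as a hedged sketch rather than a confirmed fix: a 401 from ReportExecution2005.asmx usually means the web service call is not presenting credentials that SSRS accepts, so being explicit about the identity the proxy uses can help. ReportExecutionService below stands for a proxy generated from ReportExecution2005.asmx (for example via "Add Web Reference"); the URL and account are placeholders.

      using System.Net;

      class ReportClientFactory
      {
          static ReportExecutionService CreateProxy()
          {
              var rs = new ReportExecutionService();
              rs.Url = "http://myserver/ReportServer/ReportExecution2005.asmx"; // placeholder URL

              // Send the identity the ASP.NET application is running under...
              rs.Credentials = CredentialCache.DefaultCredentials;

              // ...or, if that identity has no rights on the report server, supply an
              // explicit account (placeholder user, password and domain):
              // rs.Credentials = new NetworkCredential("reportUser", "p@ssw0rd", "MYDOMAIN");

              return rs;
          }
      }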

  • rsAccessDenied - SQL server 2008 reporting services

    - by rboorgapally
    I am running SQL Server 2008 Developer Edition on Windows Vista Home Premium. I created a Reporting Services project that builds successfully in BIDS. When I try to deploy it, it gives the following error: "Error rsAccessDenied: The permissions granted to user 'COMP\MYSELF' are insufficient for performing this operation." The MYSELF account is the only account on the system and it has administrator rights. The Reporting Services service is running under the LocalSystem service account. If I log into Report Manager with the MYSELF account, I cannot see the Site Settings tab, and without the Site Settings tab I cannot add or change the roles for the MYSELF account. In summary, please help me open Report Manager in the browser with the Site Settings link available so that I can change the role of the user account.

  • Terminal Services - MS Access Frequently "Not Responding"

    - by jonfhancock
    Exposition: We use a program built in MS Access that I serve via Terminal Services. I just installed a new TS server with a quad-core 2.6 GHz Xeon, 8 GB RAM, and 4 SATA drives in a RAID 0, and installed Server 2008 R2 (64-bit, obviously). Its only role is TS. The problem: With just a few sessions (under 10), I start getting frequent "Not Responding" messages in each session. When it happens, the users aren't doing anything particularly taxing, just form navigation and simple insert queries. I can live with some stalls, but it is visually jarring in WS08 because the screen goes gray and it presents a dialog offering to wait or close, with some other options. Questions: Any suggestions for improving performance and reducing hangs? Is it possible to disable the dialog (always wait) and the screen graying?

  • Routing to a Terminal Services Cluster

    - by Dave
    I am trying to connect to a load-balanced Windows 2008 R2 cluster using Remote Desktop Services. I have no trouble connecting to the servers' IP addresses (.253.16 and .253.17) or the cluster address (.253.20) from inside the subnet (.253). The trouble is when I try to connect from the other subnet (.251). I can remote to the other, non-clustered servers (.253.12 and .253.15) inside the .253 subnet from the .251 subnet without an issue, and I receive a ping reply from the cluster and the other servers when I am on the .251 subnet. But when I try to connect via Remote Desktop it times out, though only to the IPs on the cluster (.20, .17, .16). The ASA 5510 handling the routing reports this message in the log: "Deny TCP (no connection) from 192.168.251.2/4283 to 192.168.253.16/3389 flag FIN PSH ACK". Here is a picture if it helps: http://dl.dropbox.com/u/4217864/terminal%20server.jpg Thanks for any help

  • Reporting Services 2008 R2 export to PDF embedded fonts not shown

    - by Gabriel Guimarães
    I have installed a font on the server that hosts Reporting Services 2008 R2, restarted the SSRS service, and successfully deployed a report that uses the custom font. I can see the report using the font on the web and when I export it to Excel; in the PDF export, however, the font is not used. If I open File - Properties - Fonts in the PDF, I am presented with a list of the fonts in the document. There is one entry, with an 'a' icon, for Helvetica-BoldOblique (Type: Type 1, Encoding: Ansi, Actual Font: Arial-BoldItalicMT, Actual Font Type: TrueType). The second entry, with a double-T icon, is the font I'm using, with an "(Embedded Subset)" suffix; it is a TrueType font and ANSI encoded. However, the text is not using this embedded font. If I select the text and copy it into a Word document (I have the font installed) I can see the text in the font, but not in the PDF. What's wrong here?

  • How to best migrate one Windows 2008 R2 / SharePoint / Exchange / Terminal Services (All-in-one) int

    - by MadBoy
    My client has one machine with Windows 2008 R2 and everything on it. By everything I mean AD, DNS, SharePoint 2010 Standard, Exchange 2010 Standard, Terminal Services, Office 2010 and a bunch of additional apps. Everything runs on two i7 CPUs and 36 GB of RAM for 7 people in total. I've decided that we should virtualize it, split things into 4 VMs, and keep only Hyper-V on the host to run all the machines. What problems should I expect? What advice can you give? My plan is that once I move everything to VMs, I will move the VMs to a safe place and format the host, as it has a lot of really bad things happening on it. But this also means that everything will be wiped from the current setup, so I have to be sure that Exchange etc. will still work once the host is wiped. MadBoy

  • MOSS 2007 Sharepoint Shared Services AD Import SQL/Search Error prevents user import

    - by TrevJen
    When attempting to import new AD users (Shared Services Administration > Shared Service > User Profiles and Properties), I receive an error at the top of the User Profiles and Properties page: "An error has occurred while accessing the SQL Server database or the Office SharePoint Server Search service. If this is the first time you have seen this message, try again later. If this problem persists, contact your administrator." I have tried the following:
      - Rebooted the server
      - Checked service account permissions and passwords
      - Checked the MIPSCHEDULEDJOB table to ensure all 6 required entries are there

  • Windows Server 2003 Terminal Services error

    - by Adrian S
    I have a Win2K3 machine which I want to access via Remote Desktop. When I try Remote Desktop from another machine, the session just ends as soon as it attempts to connect. I never see the log-on screen or anything; it just returns to the Remote Desktop Connection dialog. I have checked the service on the target machine and it's up and running, so how can I determine the error? Is there any way to reinstall Terminal Services and see if that fixes it?

  • Multiple Reporting Services databases in one instance?

    - by Tedd Hansen
    Is it possible to have multiple Reporting Services databases in one MSSQL instance? I have an MSSQL 2008 R2 instance with RS set to SharePoint Integrated mode. This RS is in use and can't be changed. I do, however, need an RS instance in native mode for the TFS installation to be able to use. Am I required to set up a new MSSQL instance? Bonus question: if so, is that permitted under the MS licensing scheme or is it an additional cost?

  • Windows services not starting automatically?

    - by Jeff Atwood
    We've had some nasty time sync problems on our Windows Server 2008 R2 servers lately. I traced this back to something very simple: the Windows Time Service was not started! The time can't possibly sync via NTP when the time service isn't running... The Windows Time Service was set to start "automatically" in the services control panel, which I double and triple checked. I also checked the event logs and I didn't see any service failures or anything like that. In fact, it looked a heck of a lot like the Windows Time Service never started up automatically after the weekly Windows Updates were installed and the servers were rebooted. (this is set to happen every Saturday at 7 PM.) The minute I started the Time Service, the time synced fine. So, then, the question: why would a service set to start "Automatically" ... not be started automatically? That seems sort of crazy to me.

  • Services of virtual machines

    - by RredCat
    I am looking for a way to set up a dev machine in the cloud, so that I can have access for development from different places. I don't want to mess around with syncing branches and so on. It would be Lubuntu with 1 GB or 512 MB of memory, and I want a way to set up my own image. What I have found so far:
      - Amazon EC2
      - Azure Virtual Machines
    I am pretty sure there are more services like this, and I hope to find one suited to this purpose. I have experience working this way and I like it. Unfortunately, that was a server belonging to a certain company, used for that company's projects, so I can't use it for my private purposes. Could anybody suggest anything?

  • Oracle Communications Data Model

    - by jean-pierre.dijcks
    I've mentioned OCDM in previous posts, but found the following podcast on the topic (see the end of the post) and figured it is worthwhile to spread the news some more. ORetailDM and OCommunicationsDM are the two data models currently available from Oracle. Both are intended to capture:
      - Business best practices and industry knowledge
      - Pre-built advanced analytics intended to predict future events before they happen (like the Churn model shown below)
      - Oracle technology best practices to ensure optimal performance of the model
    All of this typically comes with a reduced time to implementation, or as the marketing slogan goes, reduced time to value. Here are the links: Podcast on OCDM; OTN pages for OCDM and ORDM

  • Importing data from text file to specific columns using BULK INSERT

    - by Dinesh Asanka
    Bulk insert is much faster than other techniques such as SSIS. However, when you are using bulk insert you can't insert into specific columns: if, for example, there are five columns in a table, you need five values for each record in the text file you are importing. This is a problem when you are expecting default values to be inserted into the table. Let us say you have a table in which ID, Status and CreatedDate are filled in automatically, so your text file only carries FirstName and LastName values, as below:

      Dinesh,Asanka
      Saman,Liyanage
      Ruwan,Silva
      Susantha,Bathige
      Jude,Peires
      Sanjeewa,Jayawickrama

    If you run bulk insert against this table directly, you will get an error: "Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 1 (ID)." To avoid this, create a view exposing only the columns you expect to fill and run bulk insert against the view. If you check the table afterwards, you will see the values from the text file together with the default values.
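
    A minimal sketch of the view technique described above, driven from C# for illustration; the database, table, view and file names are placeholder assumptions, and the same two statements can of course be run directly as T-SQL.

      using System.Data.SqlClient;

      class BulkInsertViaView
      {
          static void Main()
          {
              // Placeholder connection string and import file path.
              const string connectionString = "Server=.;Database=SampleDb;Integrated Security=true";

              // A view exposing only FirstName and LastName lets BULK INSERT skip
              // ID, Status and CreatedDate, so their defaults are applied.
              const string createView = @"
                  CREATE VIEW dbo.vw_PersonImport AS
                  SELECT FirstName, LastName FROM dbo.Person;";

              const string bulkInsert = @"
                  BULK INSERT dbo.vw_PersonImport
                  FROM 'C:\import\people.txt'
                  WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');";

              using (var connection = new SqlConnection(connectionString))
              {
                  connection.Open();
                  using (var cmd = new SqlCommand(createView, connection))
                      cmd.ExecuteNonQuery();
                  using (var cmd = new SqlCommand(bulkInsert, connection))
                      cmd.ExecuteNonQuery();
              }
          }
      }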

  • Javascript: Safely upload a client data file

    - by Jeffrey Sweeney
    I'm (still) working on a template-based XML editing program. It's a GUI-based XML editor that only allows users to add certain tags and attributes based on the requirements. You can see the current version here for an idea. Now, I'd like to allow users to upload their own data templates, but I'm concerned about potential XSS attacks. Currently, the template file is in JavaScript object literal notation, which unsurprisingly is a security nightmare if the user can upload their own. I was thinking of using XML instead, but is there an even better alternative?

  • Reusing CRUD methods in the data access layer when they are updated too quickly

    - by ValidfroM
    I agree that we should put CRUD methods in a data access layer. However, in my current project I have some issues. It is a legacy system, and there are quite a lot of CRUD methods in some concrete manager classes. People, including me, tend to just add new methods rather than reuse the existing ones, because:
      - We don't know whether an existing method is what we need.
      - Even if we have the source code, do we really need to read other people's code before making a decision?
      - The code is updated too quickly; we don't have time to get familiar with the DAO API.
    Back to the question: how do you solve this in your projects? If we say "reuse", the code really needs to be reusable, rather than reuse being just an excuse.

  • Get aggregated view of data for entire website with Google Analytics

    - by crmpicco
    I have a website (www.ayrshireminis.com) which has three main sections under different directories:
      - /forum
      - /galleries
      - /contact
    I would like to have an aggregated view of the data for the whole website, but also one for each section. What is the recommended approach for doing this? I believe I can create a web property that includes a profile for the entire website plus duplicated, filtered profiles, each with an include filter for one section. This is my gut instinct, but I'd like to know if there is another (better) way to do it. Maybe by having one account that includes a profile for the whole site and another profile with an include filter for the individual sections?

  • Design: How to model / where to store relational data between classes

    - by Walker
    I'm trying to figure out the best design here, and I can see multiple approaches, but none that seems "right." There are three relevant classes here: Base, TradingPost, and Resource. Each Base has a TradingPost which can offer various Resources depending on the Base's tech level. Where is the right place to store the minimum tech level a base must possess to offer any given resource? A database seems like overkill. Putting it in each subclass of Resource seems wrong--that's not an intrinsic property of the Resource. Do I have a mediating class, and if so, how does it work? It's important that I not be duplicating code; that I have one place where I set the required tech level for a given item. Essentially, where does this data belong? P.S. Feel free to change the title; I struggled to come up with one that fits.
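
    A minimal sketch of what the "mediating class" option could look like: a single catalog that owns the resource-to-minimum-tech-level mapping, so the rule lives in exactly one place and neither Base nor TradingPost duplicates it. All names and values here are illustrative, not taken from the original design.

      using System;
      using System.Collections.Generic;
      using System.Linq;

      enum ResourceType { Food, Fuel, Alloys, Electronics }

      // The mediating piece: one place that says which tech level unlocks which resource.
      static class TradeCatalog
      {
          static readonly Dictionary<ResourceType, int> MinTechLevel = new Dictionary<ResourceType, int>
          {
              { ResourceType.Food, 1 },
              { ResourceType.Fuel, 2 },
              { ResourceType.Alloys, 3 },
              { ResourceType.Electronics, 5 },
          };

          public static IEnumerable<ResourceType> OfferedAt(int baseTechLevel) =>
              MinTechLevel.Where(kv => kv.Value <= baseTechLevel).Select(kv => kv.Key);
      }

      class TradingPost
      {
          readonly int _baseTechLevel;
          public TradingPost(int baseTechLevel) { _baseTechLevel = baseTechLevel; }

          // The trading post asks the catalog rather than storing tech levels itself.
          public IEnumerable<ResourceType> AvailableResources => TradeCatalog.OfferedAt(_baseTechLevel);
      }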

  • Still no structured data detected in Google Webmaster Tools [on hold]

    - by user6211
    Can you give me some suggestions about what's wrong with my structured data? Google still cannot read it. It looks like this:

      <div class="identity">
        <div itemscope itemtype="http://schema.org/LocalBusiness">
          <a itemprop="url" href="http://MYDOMAIN.co.uk/"><div itemprop="name"><strong>MY_COMPANY</strong></div></a>
          <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
            <span itemprop="streetAddress">MY_ADDRESS</span>,
            <span itemprop="addressLocality">London</span>,
            <span itemprop="postalCode">SE5 MY_XYZ</span>,
            <span itemprop="addressCountry">UK</span>
          </div>
        </div>
      </div>

  • Consolidating hotel data from various booking sites with different IDs or references

    - by Victor
    In one of my projects I have data for hotels, and several booking sites are able to book each hotel. For example: Hotel A - Booking (ID = 4002), Expedia (ID = 123), Priceline (ID = 147). Each of the three booking engines uses its own ID to refer to Hotel A, so I would need to check manually and make the right reference to the hotel. If I have 100,000 hotels, do I have to do 300,000 manual checks (for 3 booking sites)? They might provide APIs, and then I could cross-check the name, address or latitude/longitude, but if those differ even a little I might map the reference to the wrong hotel. I'm sure there are better ways to do this. There are many travel sites out there that compare hotel prices across many booking sites - how do they make sure they are checking the right hotel on each site? Does anyone have any experience with this?
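
    A rough sketch of the cross-checking idea mentioned above: match candidates by geographic distance first, then by a normalized-name similarity, and flag anything below the thresholds for manual review. The thresholds, field names and the crude bigram similarity are illustrative assumptions, not a proven recipe.

      using System;
      using System.Collections.Generic;
      using System.Linq;

      class HotelRecord
      {
          public string Name;
          public double Lat;
          public double Lon;
      }

      static class HotelMatcher
      {
          // Treat two records as the same hotel if they sit within ~200 m of each other
          // and their normalized names are very similar. Both thresholds are guesses
          // to be tuned against real data.
          public static bool LikelySame(HotelRecord a, HotelRecord b) =>
              DistanceMeters(a.Lat, a.Lon, b.Lat, b.Lon) < 200 &&
              NameSimilarity(Normalize(a.Name), Normalize(b.Name)) > 0.85;

          static string Normalize(string s) =>
              new string(s.ToLowerInvariant().Where(char.IsLetterOrDigit).ToArray());

          // Haversine distance in meters.
          static double DistanceMeters(double lat1, double lon1, double lat2, double lon2)
          {
              double R = 6371000, dLat = Rad(lat2 - lat1), dLon = Rad(lon2 - lon1);
              double h = Math.Sin(dLat / 2) * Math.Sin(dLat / 2) +
                         Math.Cos(Rad(lat1)) * Math.Cos(Rad(lat2)) * Math.Sin(dLon / 2) * Math.Sin(dLon / 2);
              return 2 * R * Math.Asin(Math.Sqrt(h));
          }

          static double Rad(double deg) => deg * Math.PI / 180.0;

          // Crude similarity: shared character-bigram ratio; a real system would use
          // something like Levenshtein or Jaro-Winkler distance instead.
          static double NameSimilarity(string a, string b)
          {
              if (a.Length < 2 || b.Length < 2) return a == b ? 1.0 : 0.0;
              var bigramsA = Enumerable.Range(0, a.Length - 1).Select(i => a.Substring(i, 2)).ToList();
              var bigramsB = Enumerable.Range(0, b.Length - 1).Select(i => b.Substring(i, 2)).ToList();
              int shared = bigramsA.Intersect(bigramsB).Count();
              return 2.0 * shared / (bigramsA.Count + bigramsB.Count);
          }
      }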

  • Mount external HD ubuntu 12.10

    - by Luigi Tiburzi
    Although this is a much-discussed topic, I'm unable to find an answer that fits my needs. I had a 12.04 installation of Ubuntu and decided to install 12.10. I copied (using GParted) the partition containing my system to an external HD that also holds a Windows partition. Then I installed the newest Ubuntu version, and now I want to get some files back (for example my .emacs) from that partition, but when I try to mount it, it is not found as sdb, and if I mount it from /dev/usb/hddev0 I get no output, only a blinking cursor - no errors, no output. I even tried to mount it as an NTFS disk, but the result was the same. It's as if the HD cannot be detected. How can I access the data on that disk? Could I get the files from the GParted terminal instead of the Ubuntu one? Thanks

  • disk not accessible

    - by user107044
    I formatted my hard drive yesterday and it was working well even after the formatting. But when I restarted my system, it shows that space is allocated to my files, yet they are inaccessible. I have even tried to unhide the files and folders, in case they somehow got hidden, but nothing works. The hard drive appears empty, but its properties say it still contains the data: http://imgur.com/ObjTE In the image, the directory shows only 1 file of size 4.8 KB, but the space used on the drive is 11.6 GB. Please suggest a solution.

  • Using Ubuntu to recover data from a crashed Windows install

    - by user289391
    I was using Windows on my laptop when suddenly the blue screen of death appeared; the laptop then restarted and displayed this:

      Intel UNDI, PXE-2.1 (build 083)
      Copyright (C) 1997-200 Intel Corporation
      This product is covered by one or more of the following patents:
      US5,307459, US5,434,872, US5732,094, US6579,884, US6115,776 and US6,327,625
      Realtek PCIe FE Family Controller Series v120 (01/26/10)
      PXE-M0F: Exiting PXE ROM.
      reboot failed

    I have Ubuntu on an external disk, so I have now booted into that. Two questions: Any theories on what happened? And how can I use Ubuntu to recover my data from the Windows install?
