Search Results

Search found 69987 results on 2800 pages for 'wcf data services'.

Page 94/2800

  • Create Sum of calculated rows in Microsoft Reporting Services

    - by kd7iwp
    This seems like it should be simple, but I can't find anything yet. In Reporting Services I have a table with up to six rows, all of which have calculated values and dynamic visibility. I would like to sum these rows. Basically, I have a number of invoice items and want to produce a total. I can't change anything on the DB side since my stored procedures are used elsewhere in the system. Each row pulls data from a different dataset as well, so I can't sum over a single dataset. Can I sum all the rows with a table footer, similar to totaling a number of rows in Excel? It seems very redundant to repeat the visibility expression from each row in my footer row just to calculate the sum.

    Read the article

  • Building highly scalable web services

    - by christopher-mccann
    My team and I are in the middle of developing an application which needs to be able to handle pretty heavy traffic. Not Facebook-level, but in the future I would like to be able to scale to that without massive code rewrites. My thought was to modularise everything into separate services with their own interfaces. So, for example, messaging would have a messaging interface that might have send() and getMessages() as methods, and the PHP web app would simply query this interface through SOAP or cURL or something like that. The messaging application could then be any kind of application (a Java application, Python, or whatever is suitable for that particular functionality), with its own separate database shard. Is this a good approach?

    Read the article

  • Silverlight DataGrid doesn't show any data with an anonymous query (RIA Services)

    - by user289082
    Hi all! I have an anonymous LINQ query that I bind to a DataGrid. When I debug, the data comes back just fine, but it doesn't show in the DataGrid. I suspect that the request to RIA Services isn't completed before I bind the result to the DataGrid. I could use the LoadOperation<T> Completed event, but that only works with defined entities, so how can I do that? For reference, here is the previous post: http://stackoverflow.com/questions/2403903/linq-query-null-reference-exception Here is the query:

        var bPermisos = from b in ruc.Permisos
                        where b.IdUsuario == SelCu.Id
                        select new
                        {
                            Id = b.Id,
                            IdUsuario = b.IdUsuario,
                            IdPerfil = b.IdPerfil,
                            Estatus = b.Estatus,
                            Perfil = b.Cat_Perfil.Nombre,
                            Sis = b.Cat_Perfil.Cat_Sistema.Nombre
                        };

    I'm a total newbie, sorry if this is a very simple question. Thanks!!
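    A minimal sketch of that callback pattern (the query method GetPermisosQuery, the entity type Permiso, and myDataGrid are hypothetical names, not from the post): load the entities first, then build the anonymous projection once the operation completes.

        // Hypothetical query method and entity type on the DomainContext 'ruc'
        LoadOperation<Permiso> op = ruc.Load(ruc.GetPermisosQuery());
        op.Completed += (s, e) =>
        {
            var bPermisos = from b in op.Entities
                            where b.IdUsuario == SelCu.Id
                            select new { b.Id, b.IdUsuario, b.IdPerfil, b.Estatus };
            myDataGrid.ItemsSource = bPermisos.ToList();
        };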

    Read the article

  • RIA Services - Two entity models share an entity name

    - by Alex
    I have two entity models hooked up to two different databases. However, the two databases both have a table named 'brand', for example. As such, there is a naming conflict in my models. Now, I've been able to add a namespace to each model via Custom Tool Namespace in the model's properties, but the generated code in my Silverlight project will try to use both namespaces and come up with this:

        Imports MyProject.ModelA
        Imports MyProject.ModelB

        Public ReadOnly Property brands() As EntitySet(Of brand)
            Get
                Return MyBase.EntityContainer.GetEntitySet(Of brand)
            End Get
        End Property

    giving me this exception: "Error 1 'brand' is ambiguous, imported from the namespaces or types 'MyProject.ModelA, MyProject.ModelB'." Has anyone had experience with naming conflicts like this using RIA Services? How did you solve it?

    Read the article

  • RIA Services loading foreign keys with Linq-to-SQL

    - by Stephan
    I have a database that consists of five tables: Course, Category, Location, CourseCategories, and CourseLocations. The last two tables just contain pairs of foreign keys. A Course has many-to-many relationships with both Category and Location. I am trying to load the data into a Silverlight app using RIA Services; my DB model is Linq-to-SQL. I have tried adding the [Include] attribute to the metadata classes, and I have added DataLoadOptions so all the tables should load when you ask for a Course. However, on the client side I never get back any entries in the CourseCategories and CourseLocations properties. What else needs to be done to get the foreign-key relationships to survive serialization?
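    For reference, a hedged sketch of the two pieces described above (type and property names follow the question; the LinqToSqlDomainService shape comes from the RIA Services Toolkit and is an assumption about this setup):

        // Buddy/metadata class: [Include] marks the association for the client
        [MetadataType(typeof(Course.CourseMetadata))]
        public partial class Course
        {
            internal sealed class CourseMetadata
            {
                [Include]
                public EntitySet<CourseCategories> CourseCategories;

                [Include]
                public EntitySet<CourseLocations> CourseLocations;
            }
        }

        // Domain service query: LoadWith tells Linq-to-SQL to fetch the join rows
        public IQueryable<Course> GetCourses()
        {
            var options = new DataLoadOptions();
            options.LoadWith<Course>(c => c.CourseCategories);
            options.LoadWith<Course>(c => c.CourseLocations);
            this.DataContext.LoadOptions = options;
            return this.DataContext.Courses;
        }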

    Read the article

  • Reporting Services Textbox Format Question

    - by coson
    Good day, I am creating a report with Reporting Services and am using several text boxes horizontally aligned next to each other. I would like to add periods so that the report looks like:

        1234 Robert Jones................... (234) 921-4922
        1235 Jennifer Wilson................ (919) 582-2914

    Is it possible to right-pad the text box with periods, or would I need to roll out some code to accomplish the above effect? I tried doing this and the results looked like:

        1234 Robert Jones............... (234) 921-4922
        1235 Jennifer Wilson.......... (919) 582-2914

    Is something like this even possible? TIA, coson

    Read the article

  • Using Report (Reporting Services) parameter values in ASP.NET page

    - by noup
    I have a report (Reporting Services) integrated into an ASP.NET page that shows dropdownlists for selecting report parameter values. The dropdownlists are populated using direct database selects, though I see the report RDL files do contain the parameter values and datasets as defined in the report designer. Is it possible to obtain the report parameters' "available values" in ASP.NET to populate the dropdownlists? This would avoid some code duplication. Update: If the parameter doesn't use a query for its available values, the following works:

        foreach (ValidValue value in this.ReportViewerControl.ServerReport.GetParameters()["myParameter"].ValidValues)
        {
            this.DropDownListControl.Items.Add(new ListItem(value.Label, value.Value));
        }

    Still haven't found a way to access report datasets, though...

    Read the article

  • Windows Services -- High availability scenarios and design approach

    - by Vadi
    Let's say I have a standalone Windows service running on a Windows server machine. How do I make sure it is highly available? 1) What design-level guidelines can you propose? 2) How do I make it highly available in a primary/secondary arrangement, e.g. with the clustering solutions currently available on the market? 3) How do I deal with cross-cutting concerns in any fail-over scenario? If there is anything else you can think of, please add it here. Note: The question is only related to Windows and Windows services, please try to obey this rule :)

    Read the article

  • Easy way for Crystal Reports to MS SQL Server Reporting Services conversion

    - by scoob
    Is there a way to easily convert Crystal Reports reports to Reporting Services RDL format? We have quite a few reports that will need conversion soon. I know about the manual process (which is basically rebuilding all your reports from scratch in SSRS), but my searches pointed to a few possibilities for automatic conversion "acceleration" from several consulting firms (as described on http://www.microsoft.com/sql/technologies/reporting/partners/crystal-migration.mspx). Do any of you have any relevant experiences or recommendations regarding this particular issue? Are there any tools around that I do not know about?

    Read the article

  • Best Practice for creating Web Services

    - by Holograham
    To preface: I am new to web development. I am looking at creating a core set of RESTful web services around a valuable document library of sorts (initial CRUD abilities). In doing so, I am theoretically creating a perfectly reusable and scalable back end to be used by unanticipated applications in the future. My question centers on the best practice for doing this. My initial requirement has me also creating a unique front end. Should I make the front end and back end completely separate projects to enhance reusability, even though it would increase overhead? I am looking at using a GWT, Restlet, and JEE technology stack, if that influences the setup at all.

    Read the article

  • Looking for a list of free data APIs and web services

    - by darren
    I'm wondering if anybody has come across a comprehensive list of free sources of data (as a web API) or web services. I'm looking to start a new project to tinker with in my spare time, and I am wondering what interesting data is available to play with. It seems like many API services, such as last.fm or Google search, either don't exist any more or are no longer free. Possible examples of what I am looking for: information about a given IP address; mapping APIs; information about books, movies, and music; information about places, businesses, and attractions; meteorological, financial, or other scientific data; shopping and products. I would appreciate any suggestions you may have about interesting data freely available through the web. Thanks

    Read the article

  • Beginner Design pattern question (Web Services involved)

    - by zombie
    Hi all! I am a noob to the web services world. I need to develop a login validator module and expose it as a service. I want it to be transport-independent, i.e. I should have the option of exposing it as a SOAP service or a REST service in the future. What pattern should I follow? Sorry if I am unclear in my requirements; I can clarify as needed. Thanks!! Edit: I am using Eclipse as an IDE and the Jersey libraries. I am not using any framework, simply the MVC pattern. I find a lot of differences between the SOAP and REST approaches, so I want my methods to be implementation-independent, i.e. I should easily be able to call my method through a SOAP or REST service as needed. What should I do for maximum flexibility?
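    One common shape for this, sketched here in C# for brevity (the pattern carries over directly to Java/Jersey; every name below is made up), is to keep the validation logic behind a plain interface and treat each transport as a thin adapter:

        // Core module: no SOAP or REST types in here
        public interface ILoginValidator
        {
            bool Validate(string userName, string password);
        }

        public sealed class LoginValidator : ILoginValidator
        {
            public bool Validate(string userName, string password)
            {
                // placeholder check; the real credential lookup goes here
                return !string.IsNullOrEmpty(userName) && !string.IsNullOrEmpty(password);
            }
        }

        // A REST (or SOAP) endpoint becomes a thin adapter over the same
        // interface, so exposing a second protocol later means writing
        // another adapter, not touching the core.
        public sealed class LoginRestAdapter
        {
            private readonly ILoginValidator validator;

            public LoginRestAdapter(ILoginValidator validator)
            {
                this.validator = validator;
            }

            public int Handle(string userName, string password)
            {
                return validator.Validate(userName, password) ? 200 : 401; // HTTP status codes
            }
        }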

    Read the article

  • SSAS Reporting Services - Set specific language / translation

    - by Chris
    Hi all, in the data warehouse there's a default language for the measures, and I added a translation with German captions. In a Visual Studio Report Server project, when creating a query on my German OS, the cube and its measures are displayed in German. When dragging measures into the MDX query window, the default measure name is used. That's what I want and expect, since when writing MDX queries I would like to use the default measure names. But when executing the query, the column created for each measure is translated to German again. This results in German column names within my dataset, which I don't want; I'd like to have the English column names. I already tried changing the connection string to:

        Data Source=server;Initial Catalog=DataWarehouse;LocaleIdentifier=1033

    But that doesn't help, I still see the German translations. Does anyone know how to set a specific translation?

    Read the article

  • SQL Analysis Services - Dimension attributes with a "many" cardinality

    - by MonkeyBrother
    I am creating a cube with the following tables:

        Customer     (CustomerID, Name)
        Customer Rep (CustomerID, RepID)
        Rep          (RepID, Name)

    The important thing here is that there is a many-to-many relationship between Reps and Customers. I want to be able to ask the question "How much in sales for customers working with rep 'A'?" In the data source view I set up the relationships between both CustomerID columns and both RepID columns. I set up the Rep attribute in the dimension builder, and when I try to build the cube I get this error: "Errors in the high-level relationship engine. The 'Rep' table that is required for a join cannot be reached based on the relationships in the data source view."

    Read the article

  • Reporting Services URL parameter problems

    - by GxG
    I have a URL to the location on the server where the report can be found. The report works just fine if I manually refresh it. I tried using rc:ClearSession=TRUE and I also tried sending a random parameter, but the report is still not being refreshed. Any ideas? The main scenario:

        1. User enters the page (with a grid view)
        2. User clicks on Export
        3. User sees the report
        4. User deletes an entry from the grid view
        5. User clicks on Export again
        6. User sees the exact same report

    P.S.: The report query returns the data that should be displayed, but the rendered report shows the previous data.
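    One thing worth double-checking (noted as an assumption, since only the rc: spelling appears above): the session-clearing option in Reporting Services URL access uses the rs: prefix rather than rc:, for example:

        http://server/ReportServer?/Folder/MyReport&rs:Command=Render&rs:ClearSession=true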

    Read the article

  • Entity Framework 4, WCF & Lazy Loading Tip

    - by Dane Morgridge
    If you are doing any work with Entity Framework and custom WCF services, in EFv1 everything works great. As soon as you jump to EFv4, you may find yourself getting odd errors that you can't seem to catch. The problem almost always has something to do with the new lazy loading feature in Entity Framework 4. Entity Framework 1 didn't have lazy loading, so this problem didn't surface. Assume I have a Person entity and an Address entity where there is a one-to-many relationship between Person and Address (Person has many Addresses). In Entity Framework 1 (or in EFv4 with lazy loading turned off), I would have to load the Address data by hand using either the Include or Load method:

        var people = context.People.Include("Addresses");

    or

        person.Addresses.Load();

    With lazy loading, the Address data is fetched the first time the Person.Addresses collection is accessed:

        var people = context.People.ToList();

        // only Person data is currently in memory

        foreach (var person in people)
        {
            // EF determines that no Address data has been loaded and lazy loads
            int count = person.Addresses.Count();
        }

    Lazy loading has the useful (and sometimes not so useful) feature of fetching data when it is requested. It can make your life easier, or it can make it a big pain. So what does this have to do with WCF? One word: serialization. When you need to pass data over the wire with WCF, the data contract is serialized into either XML or binary depending on the binding you are using. Well, if I am using lazy loading, the Person entity gets serialized, and during that process the Addresses collection is accessed. When that happens, the Address data is lazy loaded. Then each Address is serialized, its Person property is accessed and serialized in turn, and then the Addresses collection is accessed again. The second time through, lazy loading doesn't kick in, but you can see the infinite loop caused by this process. This is a problem with any serialization, but I personally hit it using WCF. The fix is to simply turn off lazy loading. This can be done per call by using the context options:

        context.ContextOptions.LazyLoadingEnabled = false;

    Turning lazy loading off will now allow your classes to be serialized properly. Note that this applies if you are using the standard Entity Framework classes. If you are using POCO, you will have to do something slightly different. With POCO, the Entity Framework will by default create proxy classes that allow things like lazy loading to work with POCO: the proxy is a full Entity Framework object that sits between the context and the POCO object. When using POCO with WCF (or any serialization), just turning off lazy loading doesn't cut it; you have to turn off proxy creation to ensure that your classes will serialize properly:

        context.ContextOptions.ProxyCreationEnabled = false;

    The nice thing is that you can do this on a call-by-call basis. If you use a new context for each set of operations (which you should), then you can turn lazy loading or proxy creation on and off as needed.
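    Putting the two settings together, here is a minimal sketch of a WCF service operation (the MyEntities context name and the service shape are placeholders, not from the original post):

        public List<Person> GetPeopleWithAddresses()
        {
            using (var context = new MyEntities()) // placeholder ObjectContext name
            {
                // turn off lazy loading and proxy creation so serialization
                // only walks what was explicitly loaded
                context.ContextOptions.LazyLoadingEnabled = false;
                context.ContextOptions.ProxyCreationEnabled = false;

                // eager-load the Addresses we actually want on the wire
                return context.People.Include("Addresses").ToList();
            }
        }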

    Read the article

  • OData to the rescue. Exposing the eventlog as a data feed

    - by cibrax
    On one of the projects I worked on, we used the Microsoft Enterprise Library Exception Handling Application Block integration with WCF for logging all the technical issues on the services/backend in the Windows Event Log. This application block worked like a charm; all the errors were correctly logged in the Event Log without even needing to modify the service code. However, we also needed to provide a quick way to expose all those events to the different system users so they could access them remotely. In just a couple of minutes I came up with a simple solution based on ADO.NET Data Services. ADO.NET Data Services is very powerful in this sense: you only need to provide an IQueryable implementation, and that's all; you get a RESTful service with rich query support for free. In this sample, I used LINQ to Objects to get the latest entries from the Event Log, and I also filter the entries by the source used by the Application Block to avoid loading unnecessary entries into memory.

        public class LogDataSource
        {
            string source;

            public LogDataSource(string source)
            {
                this.source = source;
            }

            public LogDataSource()
            {
            }

            public IQueryable<LogEntry> LogEntries
            {
                get { return GetEntries().AsQueryable().OrderBy(e => e.TimeGenerated); }
            }

            private IEnumerable<LogEntry> GetEntries()
            {
                var applicationLog = System.Diagnostics.EventLog.GetEventLogs()
                    .Where(e => e.Log == "Application")
                    .FirstOrDefault();

                var entries = new List<LogEntry>();

                if (applicationLog != null)
                {
                    foreach (EventLogEntry entry in applicationLog.Entries)
                    {
                        if (source == null || entry.Source.Equals(source, StringComparison.InvariantCultureIgnoreCase))
                        {
                            entries.Add(new LogEntry
                            {
                                Category = entry.Category,
                                EventID = entry.InstanceId,
                                Message = entry.Message,
                                TimeGenerated = entry.TimeGenerated,
                                Source = entry.Source,
                            });
                        }
                    }
                }

                return entries.OrderByDescending(e => e.TimeGenerated)
                              .Take(200);
            }
        }

    LogEntry is a class I created for this service to expose an Event Log entry.
        [EntityPropertyMappingAttribute("Source",
            SyndicationItemProperty.Title,
            SyndicationTextContentKind.Plaintext, true)]
        [EntityPropertyMapping("Message",
            SyndicationItemProperty.Summary,
            SyndicationTextContentKind.Plaintext, true)]
        [EntityPropertyMapping("TimeGenerated",
            SyndicationItemProperty.Updated,
            SyndicationTextContentKind.Plaintext, true)]
        [DataServiceKey("EventID")]
        public class LogEntry
        {
            public long EventID { get; set; }
            public string Category { get; set; }
            public string Message { get; set; }
            public DateTime TimeGenerated { get; set; }
            public string Source { get; set; }
        }

    As you can see, I used the new "friendly feeds" feature to map several properties in the entries to standard ATOM elements. The DataServiceKey attribute is only necessary because I am using the Reflection provider (the exposed IQueryable implementation is just LINQ to Objects) rather than the default Entity Framework provider. The data service implementation is also quite simple; just a couple of lines are needed to expose the data source created previously:

        public class LogDataService : DataService<LogDataSource>
        {
            public static void InitializeService(IDataServiceConfiguration config)
            {
                config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
            }

            protected override LogDataSource CreateDataSource()
            {
                string source = ConfigurationManager.AppSettings["EventLogSource"];

                if (source == null)
                {
                    throw new ApplicationException("The EventLogSource appsetting is missing in the configuration file");
                }

                return new LogDataSource(source);
            }
        }

    With this implementation in place, the final users not only get a feed with all the latest errors in the event log, but also support for performing queries against that data. This is one of the great things about ADO.NET Data Services.
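    As a quick usage note (the service address is a placeholder), the resulting LogEntries set can then be queried straight from the URL with the standard OData query options, for example:

        http://myserver/LogDataService.svc/LogEntries?$filter=Source eq 'MyApp'&$top=10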

    Read the article

  • How to host WCF service and TCP server on same socket?

    - by Ole Jak
    Today I use ServiceHost for self-hosting WCF services. I want to host, next to my WCF services, my own TCP program for direct socket operations (like listening to some sort of broadcast TCP stream). I need control over the namespaces so that my clients are able to send TCP streams directly into my service using nice URLs like example.com:port/myserver/stream?id=1 or example.com:port/myserver/stream?id=anything, and so that I am not bothered by the one-client-per-socket idea. I really want to keep my WCF services on the same port as my own server, so as to be able to call www.example.com:port/myWCF/stream?id=222... Can anybody please help me with this? I am using just WCF now, and I do not enjoy how it works; that is one of many reasons why I want to start migrating to plain TCP =) I cannot use the net.tcp binding or any sort of other cool WS-* binding (today I use the simplest one so that clients like Flash, AJAX, etc. can connect to me with ease). I need a fast and easy-to-implement connection protocol, like the one I created for use with sockets, for real-time transfer of large amounts of data. So... any ideas? Please, I need help.

    Read the article

  • iPad SQLite Push and Pull Data from an external MS SQL Server DB

    - by MattyD
    This carries on from my previous post (http://stackoverflow.com/questions/4182664/ipad-app-pull-and-push-relational-data). My plan is that when the iPad application starts, I will pull data (config data, i.e. Departments, Types, etc.: relational data that is used across the system) from a web-hosted MS SQL Server DB via a web service and populate it into a SQLite DB on the iPad. Then, when I load a listing, I will pull the data over the line again via a web service and populate it into the SQLite DB on the iPad (and then just run select commands to populate the listing). My questions are: 1. What is the most efficient way to transfer data across the line via the web? Everyone seems to do it a different way. My idea is to have a web service for each type of data pull (e.g. RetrieveContactListing) that queries the DB and then converts that data into "something" to send across the line. My question really is: what is the "something" it should be converted into? 2. Everyone talks about OData services. Are they suited to applications where complex reads and writes are needed? I've created a simple iPhone app before that talked to a SQL Server DB (I just sent my own structured XML across the line), but with this app the data calls are going to be a lot larger, so efficiency is key.
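    For what it's worth, a hedged illustration (field names invented) of the kind of compact payload that "something" often ends up being in practice, i.e. JSON:

        { "departments": [ { "id": 1, "name": "Sales" },
                           { "id": 2, "name": "Support" } ],
          "types":       [ { "id": 7, "name": "Internal" } ] }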

    Read the article

  • Automating an SSRS 2008 R2 report snapshot and running the report with the most recent data

    - by Mr Shoubs
    I would like to automate a report snapshot, but there is only an option to take a snapshot in the Report History tab. All the resources I've found suggest I need to go to processing options and select "Render this report from a snapshot". But I don't want to do that: when I go to a report, I want to get the most recent data. However, daily at midnight I'd like to take a snapshot and store it in the history, in case I want to compare the reports as of midnight for the last few weeks. Or am I doing this wrong, and do I have to create a subscription instead? Note: this is for an auditing database and has way too much data in it to query a range of more than one day (reports are restricted as such; one day has over a million rows on its own).

    Read the article

  • pure-ftpd debian, can't get www-data user working

    - by lynks
    I'm trying to add FTP access to the Apache web files. In the past I have done this with an ftpuser-and-group arrangement; this time I would like to make it possible to log in directly as www-data (the default Apache user on Debian) to make things a bit cleaner. I have checked and re-checked all the common issues: MinUID is set to 1 (www-data has UID 33); www-data has its shell set to /bin/bash in /etc/passwd; PAMAuthentication is off; UnixAuthentication is on. I have restarted pure-ftpd using /etc/init.d/pure-ftpd restart. The resulting pure-ftpd invocation is:

        /usr/sbin/pure-ftpd -l unix -A -Y 1 -u 1 -E -O clf:/var/log/pure-ftpd/transfer.log -8 UTF-8 -B

    My syslog contains:

        Oct 7 19:46:40 Debian-60-squeeze-64 pure-ftpd: ([email protected]) [WARNING] Can't login as [www-data]: account disabled

    And my FTP client is giving me:

        530 Sorry, but I can't trust you

    Am I missing something obvious?

    Read the article

  • Chef: nested data bag data to template file returns "can't convert String into Integer"

    - by Dalho Park
    I'm creating a simple test recipe with a template and a data bag. What I'm trying to do is create a config file from a data bag that has simple nested information, but I receive the error "can't convert String into Integer". Here are my files. 1) recipes/default.rb:

        data1 = data_bag_item('mytest', 'qa')['test']
        data2 = data_bag_item('mytest', 'qa')

        template "/opt/env/test.cfg" do
          source "test.erb"
          action :create_if_missing
          mode 0664
          owner "root"
          group "root"
          variables({
            :pepe1 => data1['part.name'],
            :pepe2 => data2['transport.tcp.ip2']
          })
        end

    2) my data bag named "mytest":

        $ knife data bag show mytest qa
        id:   qa
        test:
          part.name:          L12
          transport.tcp.ip:   111.111.111.111
          transport.tcp.port: 9199
          transport.tcp.ip2:  222.222.222.222

    3) template file test.erb:

        part.name=<%= @pepe1 %>
        transport.tcp.binding=<%= @pepe2 %>

    The error is returned when I run chef-client on my server:

        [2013-06-24T19:50:38+00:00] DEBUG: filtered backtrace of compile error:
          /var/chef/cache/cookbooks/config_test/recipes/default.rb:19:in `[]',
          /var/chef/cache/cookbooks/config_test/recipes/default.rb:19:in `block in from_file',
          /var/chef/cache/cookbooks/config_test/recipes/default.rb:12:in `from_file'
        [2013-06-24T19:50:38+00:00] DEBUG: backtrace entry for compile error: '/var/chef/cache/cookbooks/config_test/recipes/default.rb:19:in `[]''
        [2013-06-24T19:50:38+00:00] DEBUG: Line number of compile error: '19'

        Recipe Compile Error in /var/chef/cache/cookbooks/config_test/recipes/default.rb

        TypeError: can't convert String into Integer

        Cookbook Trace:
          /var/chef/cache/cookbooks/config_test/recipes/default.rb:19:in `[]'
          /var/chef/cache/cookbooks/config_test/recipes/default.rb:19:in `block in from_file'
          /var/chef/cache/cookbooks/config_test/recipes/default.rb:12:in `from_file'

        Relevant File Content:
        /var/chef/cache/cookbooks/config_test/recipes/default.rb:
          12: template "/opt/env/test.cfg" do
          13:   source "test.erb"
          14:   action :create_if_missing
          15:   mode 0664
          16:   owner "root"
          17:   group "root"
          18:   variables({
          19:     :pepe1 => data1['part.name'],
          20:     :pepe2 => data2['transport.tcp.ip2']
          21:   })
          22: end

    I tried many things, and if I comment out ":pepe1 => data1['part.name']," then :pepe2 => data2['transport.tcp.ip2'] works fine. Only the nested value 'part.name' cannot be assigned to @pepe1. Does anyone know why I receive the errors? Thanks,

    Read the article

  • Data recovery on a corrupted 3TB disk

    - by Mark K Cowan
    Short version: I probably need software to run a deep-scan recovery (ideally on Linux) to find files on an NTFS filesystem. The file data is intact, but the references are no longer present, analogous to recovering data from a "quick-formatted" partition. Hopefully there is a smarter way available than a deep scan, one which would recover filenames and possibly paths. Long version: I have a 3TB disk containing a load of backups. Windows 7 SP1 refused to detect the disk when plugged in directly via SATA, so I put it on a USB/SATA adaptor, which seemed to work at first; the adaptor probably does not support disks over 2.2TB, though. Windows first asked me if I wanted to "format" the disk, then later showed me most of the contents, but some folders were inaccessible. I stupidly decided to run CHKDSK on my backup disk, which made the folders accessible but also left them empty. I then connected the disk via SATA to my main PC (Arch Linux) and tried testdisk, ntfsundelete, and ntfsfix --no-action (to look for diagnostically relevant faults; the disk was "OK" though), to no avail, as the file references in the tables had presumably been zeroed out by CHKDSK rather than removed with a typical journaled deletion. If it is useful at all, the majority of the files that I want to recover are JPEG, Photoshop PSD, and MPEG-3/MPEG-4/AVI/MKV files. If worst comes to worst, I'll just design my own sector scanner and use some simple heuristic-driven analysis to recover raw binary blocks of data from the disk that appear to match the structures of the above file types. I am unfamiliar with the exact workings of NTFS, but I used to be proficient at recovering FAT32 systems with just a hex editor, so I can provide any useful diagnostic information if you let me know how to find it! My priorities, in ascending order of importance for choosing the accepted answer: restores directory structure; recovers many filenames in addition to the file data; is free / very cheap; runs on Linux; recovers a majority of file data. The last point is the most important, but the more of the higher points you match, the more rep you'll probably get :)

    Read the article

  • Windows service runs file locally but not on server

    - by Ben
    I created a simple Windows service in .NET which runs a file. When I run the service locally, I see the file running in Task Manager just fine. However, when I run the service on the server, it won't run the file. I've checked the path to the file, which is fine; I also checked the permissions on the folder and file, and they are fine as well. Also, no exceptions are happening. Below is the code used to launch the process which runs the file. I posted this first on Stack Overflow, and some people were thinking this is a config issue, so I moved it here. Any ideas?

        try
        {
            // TODO: Add code here to start your service.
            eventLog1.WriteEntry("VirtualCameraService started");

            // Create an instance of the Process class responsible for starting the new process.
            System.Diagnostics.Process process1 = new System.Diagnostics.Process();

            // Set the directory where the file resides
            process1.StartInfo.WorkingDirectory = "C:\\VirtualCameraServiceSetup\\";

            // Set the file name of the file to be opened
            process1.StartInfo.FileName = "VirtualCameraServiceProject.avc";

            // Start the process
            process1.Start();
        }
        catch (Exception ex)
        {
            eventLog1.WriteEntry("VirtualCameraService exception - " + ex.InnerException);
        }

    Read the article

  • Any "Magic Tricks" For Getting Data Back After Windows 7 Install

    - by user163757
    My old man installed Windows 7 without making a proper backup, and now realizes he left behind some important data. He did a true "clean install", so there is no Windows.old folder in the root directory. However, I believe the format performed on the hard drive was only a quick format, so I am hoping there is some chance of data recovery. I took his hard drive out and have spent the majority of the weekend researching data recovery options. I paid $70 for the GetDataBack software but have had little success with it: I can see all of the files I want to restore, but they appear corrupt when I try to open them. With all that being said, does anyone know of a viable way to recover some of this data, or is it a lost cause altogether?

    Read the article
