Search Results

Search found 1934 results on 78 pages for 'ssis catalog'.

Page 30 of 78

  • Top Questions and Answers for Plugging into Oracle Database as a Service

    - by David Swanger
    Yesterday we hosted an online forum that shared a comprehensive path to help your organization design, deploy, and deliver a Database as a Service cloud. If you missed the online forum, you can watch it on demand by registering here. We received numerous questions. Below are highlights of the most informative:

    Q: DBaaS requires a lengthy and careful design effort. What are the minimum requirements for setting up a scaled-down environment and testing it out?
    A: You should have an OEM 12c environment for DBaaS administration and then a target database deployment platform that has the key characteristics of what your production environment will look like. This could be a single server, or it could be a small pool of hosts if your production DBaaS will be larger and you want to test a more robust, real-world configuration with Zones and Pools or DR capabilities, for example.

    Q: How does this benefit companies having their own data center?
    A: It allows companies to transform their internal IT to a service delivery model for the database. The benefits to the company are significant cost savings, improved business agility and reduced risk. The benefit to the (internal) consumers of the services is much faster provisioning and response to changes in business requirements.

    Q: From a deployment perspective, is DBaaS solely the DBA's job?
    A: The best deployment model enables the DBA (or end user) to control the entire process. All resources required to deploy the service are pre-provisioned, and there are no external dependencies (on network, storage, or sysadmin teams). The service is created either via a self-service portal or by the DBA.

    Q: The purpose of self service seems to be that the end user does not rely on the DBA. I just need to give him a template; he decides how much AMM he needs. Why should I set it up one by one? That doesn't seem to be the purpose of self service.
    A: Most customers we have worked with define a standardized service catalog, with a few (2 to 5) different classes of service. For each of these classes, there is a pre-defined deployment template, and the user has the ability to select from some pre-defined service sizes. The administrator only has to create this catalog once. Each user then simply selects from the options offered in the catalog.

    Q: Looking at the DBaaS service definition, it seems to be no different from a service definition provided by a well-run DBA team. Why do you attribute it to DBaaS?
    A: There are a couple of perspectives. First, some organizations might already be operating with a high level of standardization and a higher level of maturity from an ITIL or Service Management perspective. Their journey to DBaaS could be shorter and their Service Definition will evolve less, but they still might need to add capabilities such as Self Service and Metering/Chargeback. Other organizations are still operating in highly siloed environments with little automation, and their formal Service Definition (if they have one) will be a lot less mature today. Therefore their future-state DBaaS will look a lot different from their current state, as will their Service Definition.

    Q: How does Database as a Service impact or help with "Click to Compute" or deploying a database in cloud infrastructure?
    A: DBaaS enables Click to Compute. Oracle DBaaS can be implemented using three architecture models: Oracle Multitenant 12c, native consolidation using Oracle Database, and consolidation using virtualization in infrastructure cloud. As the Deploy session showed, you get higher consolidation density and efficiency using Multitenant, and higher isolation using infrastructure cloud. Depending upon your business needs, DBaaS can be implemented using any of these models.

    Q: How exactly is DBaaS different from a traditional database? Do storage, OS and database all come together to 'transparently' provide service to applications? Will there be cross-database access by applications/users?
    A: Some key differences are: 1) The services run on a shared platform. 2) The services can be rapidly provisioned (under 15 minutes). 3) The services are dynamic and can be relocated, grown, and shrunk as needed to meet business needs, rapidly and without disruption. 4) The user is able to provision the services directly from a standardized service catalog.

    Q: With 24x7x365 databases it's difficult to find off-peak hours to do basic admin tasks such as gathering stats, running backups and batch jobs. How does pluggable database handle this, and the different needs and patching downtime of the application databases it might be serving?
    A: You can gather stats in Oracle Multitenant the same way you had been in regular databases. Regarding patching/upgrading, Oracle Multitenant makes patch/upgrade very efficient in that you can pre-provision a new version/patched multitenant database in a different ORACLE_HOME and then unplug a PDB from its CDB and plug it into the newer/patched CDB in seconds.

    Thanks for all the great questions! If you'd like to learn more and missed the online forum, you can watch it on demand here.

    Read the article

  • Creating packages in code - Package Configurations

    Continuing my theme of building various types of packages in code, this example shows how to build a package with package configurations. Incidentally it shows you how to add a variable, and a connection too. It covers the five most common configurations: Configuration File, Indirect Configuration File, SQL Server, Indirect SQL Server, and Environment Variable. For a general overview try the SQL Server Books Online Package Configurations topic. The sample uses a simple helper function, ApplyConfig, to create or update a configuration, although in the example we will only ever create. The most useful knowledge is the configuration string (Configuration.ConfigurationString) that you need to set:

    Configuration File - The full path and file name of an XML configuration file. The file can contain one or more configurations and includes the target path and new value to set.

    Indirect Configuration File - An environment variable, the value of which contains the full path and file name of an XML configuration file as per the Configuration File type described above.

    SQL Server - A three-part configuration string, with each part being quote-delimited and separated by a semi-colon. The first part is the connection manager name; the connection tells you which server and database to look in for the configuration table. The second part is the name of the configuration table. The table is of a standard format; use the Package Configuration Wizard to help create an example, or see the sample script files below. The table contains one or more rows or configuration items, each with a target path and new value. The third and final part is the optional filter name. A configuration table can contain multiple configurations, and the filter is a literal value that can be used to group items together and act as a filter clause when configurations are being read. If you do not need a filter, just leave the value empty.

    Indirect SQL Server - An environment variable, the value of which is the three-part configuration string as per the SQL Server type described above.

    Environment Variable - An environment variable, the value of which is the value to set in the package. This is slightly different to the other examples, as the configuration definition in the package also includes the target information. In our ApplyConfig function this is the only example that actually supplies a target value for the Configuration.PackagePath property. The path is an XPath-style path for the target property, \Package.Variables[User::Variable].Properties[Value], the equivalent of which can be seen in the screenshot below, with the object being our variable called Variable, and the property to set being the Value property of that variable object.

    The configurations as seen when opening the generated package in BIDS:

    The sample code creates the package, adds a variable and connection manager, enables configurations, and then adds our example configurations. The package is then saved to disk, useful for checking the package and testing, before finally executing, just to prove it is valid. There are some external resources used here, namely some environment variables and a table; see below for more details.
    namespace Konesans.Dts.Samples
    {
        using System;
        using Microsoft.SqlServer.Dts.Runtime;

        public class PackageConfigurations
        {
            public void CreatePackage()
            {
                // Create a new package
                Package package = new Package();
                package.Name = "ConfigurationSample";

                // Add a variable, the target for our configurations
                package.Variables.Add("Variable", false, "User", 0);

                // Add a connection, for SQL configurations
                // Add the SQL OLE-DB connection
                ConnectionManager connectionManagerOleDb = package.Connections.Add("OLEDB");
                connectionManagerOleDb.Name = "SQLConnection";
                connectionManagerOleDb.ConnectionString = "Provider=SQLOLEDB.1;Data Source=(local);Initial Catalog=master;Integrated Security=SSPI;";

                // Add our example configurations, first must enable package setting
                package.EnableConfigurations = true;

                // Direct configuration file, see sample file
                this.ApplyConfig(package, "Configuration File", DTSConfigurationType.ConfigFile, "C:\\Temp\\XmlConfig.dtsConfig", string.Empty);

                // Indirect configuration file, the environment variable XmlConfigFileEnvironmentVariable
                // contains the path to the configuration file, e.g. C:\Temp\XmlConfig.dtsConfig
                this.ApplyConfig(package, "Indirect Configuration File", DTSConfigurationType.IConfigFile, "XmlConfigFileEnvironmentVariable", string.Empty);

                // Direct SQL Server configuration, uses the SQLConnection package connection to read
                // configurations from the [dbo].[SSIS Configurations] table, with a filter of "SampleFilter"
                this.ApplyConfig(package, "SQL Server", DTSConfigurationType.SqlServer, "\"SQLConnection\";\"[dbo].[SSIS Configurations]\";\"SampleFilter\";", string.Empty);

                // Indirect SQL Server configuration, the environment variable "SQLServerEnvironmentVariable"
                // contains the configuration string e.g. "SQLConnection";"[dbo].[SSIS Configurations]";"SampleFilter";
                this.ApplyConfig(package, "Indirect SQL Server", DTSConfigurationType.ISqlServer, "SQLServerEnvironmentVariable", string.Empty);

                // Direct environment variable, the value of the EnvironmentVariable environment variable is
                // applied to the target property, the value of the "User::Variable" package variable
                this.ApplyConfig(package, "EnvironmentVariable", DTSConfigurationType.EnvVariable, "EnvironmentVariable", "\\Package.Variables[User::Variable].Properties[Value]");

    #if DEBUG
                // Save package to disk, DEBUG only
                new Application().SaveToXml(String.Format(@"C:\Temp\{0}.dtsx", package.Name), package, null);
                Console.WriteLine(@"C:\Temp\{0}.dtsx", package.Name);
    #endif

                // Execute package
                package.Execute();

                // Basic check for warnings
                foreach (DtsWarning warning in package.Warnings)
                {
                    Console.WriteLine("WarningCode : {0}", warning.WarningCode);
                    Console.WriteLine(" SubComponent : {0}", warning.SubComponent);
                    Console.WriteLine(" Description : {0}", warning.Description);
                    Console.WriteLine();
                }

                // Basic check for errors
                foreach (DtsError error in package.Errors)
                {
                    Console.WriteLine("ErrorCode : {0}", error.ErrorCode);
                    Console.WriteLine(" SubComponent : {0}", error.SubComponent);
                    Console.WriteLine(" Description : {0}", error.Description);
                    Console.WriteLine();
                }

                package.Dispose();
            }

            /// <summary>
            /// Add or update a package configuration.
            /// </summary>
            /// <param name="package">The package.</param>
            /// <param name="name">The configuration name.</param>
            /// <param name="type">The type of configuration.</param>
            /// <param name="setting">The configuration setting.</param>
            /// <param name="target">The target of the configuration, leave blank if not required.</param>
            internal void ApplyConfig(Package package, string name, DTSConfigurationType type, string setting, string target)
            {
                Configurations configurations = package.Configurations;
                Configuration configuration;

                if (configurations.Contains(name))
                {
                    configuration = configurations[name];
                }
                else
                {
                    configuration = configurations.Add();
                }

                configuration.Name = name;
                configuration.ConfigurationType = type;
                configuration.ConfigurationString = setting;
                configuration.PackagePath = target;
            }
        }
    }

    The following list shows the environment variables required for the full example to work, along with some sample values:

    EnvironmentVariable - 1
    SQLServerEnvironmentVariable - "SQLConnection";"[dbo].[SSIS Configurations]";"SampleFilter";
    XmlConfigFileEnvironmentVariable - C:\Temp\XmlConfig.dtsConfig

    Sample code, package and configuration file: ConfigurationApplication.cs, ConfigurationSample.dtsx, XmlConfig.dtsConfig

    Read the article

  • puppet master --compile logs errors to stdout

    - by danny
    I see a bug about this that was accepted and then closed a year ago: http://projects.puppetlabs.com/issues/3670 but I'm using puppet 2.7.14 and am getting the same issue. I'm trying to use "puppet solo" (i.e. just running puppet apply on each server to be configured) as I only have 2 or 3 servers in this project and adding another server as a puppetmaster would be complete overkill. Unless I'm mistaken, the best way to apply a node manually to a server is to do:

    puppet master --compile=mynode > catalog.json
    puppet apply --catalog catalog.json

    But the puppet master command outputs a couple of warnings and notices to stdout, mixed in with the desired JSON content. And it uses colored output so I can't just pipe it through egrep -v '^warning:'

    EDIT: I guess it's not too big of a deal to use grep - since puppet 2.7 pretty-prints the actual content and the warnings don't ever start with spaces, piping the output through egrep '^( |{|})' works.

    So my questions are basically:

    Is there a better way than this to apply a puppet node without using a puppetmaster? I can't really find any good references online to using puppet without a puppetmaster, even though that seems like a perfectly reasonable thing to do for a small project.

    Is there a setting or flag that I'm missing that will get puppet master to stop being an asshole and send its errors to stderr instead of stdout? Or do I really have to turn off color logging, then grep to exclude warning: and notice: lines?

    Read the article

  • Cannot connect to SQL Server running on an Amazon EC2 machine

    - by njj56
    I am using SQL Server Management Studio 2008 running on an Amazon EC2 machine, and I am unable to connect to the database in my ASP.NET application. The EC2 instance has been set to accept connections over the SQL port. I am also able to remote into the machine as well as view websites hosted on the server. Listed below is the part of the connection string relating to this instance. When the program runs and this connection string is called, it returns TCP error 0 - no return response. It just times out.

    <add name="ProjectServer" connectionString="Data Source=*IP ADDRESS HERE*,1433;Initial Catalog=*Catalog Name*;User ID=IP-0A6ED514\Administrator;"/>

    I removed the IP and the catalog name for the example, but I am sure they are correct. The only thing that I could think may cause an error is the difference in names between the user ID and the server name - the server name is ip-0A6ED514\sharepoint but the user name is ip-0A6ED514\administrator when I log into the SQL Server manager on the EC2 instance. A password is not used. Not sure if I would need to leave in a blank string for the password - also not sure if the difference between server name and user ID matters when logging in. Any help is appreciated. Thank you.

    Update - when this connection string is used without the port, I get TCP provider error 40; when the port is in there, I get error 0.

    Edit - the SQL Server is using Windows Authentication - does this make a difference? Usually I always use SQL Server Authentication.
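
    To illustrate the difference between the two authentication styles (every server, catalog and credential value below is an invented placeholder): a User ID/Password pair in the connection string means SQL Server Authentication, while Integrated Security=SSPI means Windows Authentication using the identity of the calling ASP.NET process, which across machines only works inside a common domain. A Windows account name in the User ID field, with no password, satisfies neither. A minimal sketch:

    using System;
    using System.Data.SqlClient;

    class ConnectionTest
    {
        static void Main()
        {
            // SQL Server authentication: credentials travel inside the
            // connection string, so no Windows/domain trust is needed.
            var sqlAuth = "Data Source=203.0.113.10,1433;Initial Catalog=ProjectDb;" +
                          "User ID=appUser;Password=appPassword;";

            // Windows authentication: the calling process's Windows identity
            // is used, which only works across machines in a shared domain.
            var winAuth = "Data Source=203.0.113.10,1433;Initial Catalog=ProjectDb;" +
                          "Integrated Security=SSPI;";

            using (var conn = new SqlConnection(sqlAuth))
            {
                conn.Open(); // times out like the question describes if unreachable
                Console.WriteLine(conn.ServerVersion);
            }
        }
    }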

    Read the article

  • Active Directory server down, recovering without reinstalling

    - by whatever
    My Windows 2003 server suddenly ceased to function as a DC (this server is the only DC of the domain). All AD-related services are down. The only way I can log in to the AD is physically at the machine. Every time I access an AD-related service (e.g. "AD Users and Computers") I get the below error:

    Naming information cannot be located because: The specified directory service attribute or value does not exist. Contact your system administrator to verify that your domain is properly configured and is currently online.

    I found the below system event, which matches the time when the issue started; this re-occurs every time I reboot the server.

    NTDS General | Global Catalog | Active Directory was unable to establish a connection with the global catalog. Additional Data Error value: 1355 The specified domain either does not exist or could not be contacted. Internal ID: 3200d33

    I started the troubleshooting with DNS. Netdiag throws the below error, although I think this is simply a consequence of not being able to access the Global Catalog.

    The procedure entry point DnsGetPrimaryDomainName_UTF8 could not be located in the dynamic link library DNSAPI.dll.

    Anyway, DNS seems OK because I can ping the DC FQDN from the DC itself. I found the below solution, which is supposed to help by doing some cleanup of the metadata: http://support.microsoft.com/kb/216498 If I follow procedure 1, here is what I get at step 9:

    no current site Domain - DC=<mydomain>,DC=<com> no current server no current naming context

    I can continue the procedure until step 14. I haven't tested step 15 as my understanding is that I will have to reinstall the whole AD again. Is there any way I can recover my AD from there without having to reinstall the whole thing?

    Update: Yes, the server was powered off/on because a reboot would take forever (not because I thought power cycling the unit would fix it any more than a reboot would).

    Read the article

  • Olympics data available for all on Windows Azure SQL Database and Power View

    - by jamiet
    Are you looking around for some decent test data for your BI demos? Well, if so, Microsoft have provided some data about all medals won at the Olympic Games (1900 to 2008) at OlympicsData workbook - Excel, SSIS, Azure sample; it provides analysis over athletes, countries, medal type, sport, discipline and various other dimensions. The data has been provided in an Excel workbook along with instructions on how to load the data into a Windows Azure SQL Database using SQL Server Integration Services (SSIS). Frankly though, the rigmarole of standing up your own Windows Azure SQL Database (OK, SQL Azure database) is both costly (SQL Azure isn't free) and time consuming (the provided instructions aren't exactly an idiot's guide, and getting SSIS to work properly with Excel isn't a barrel of laughs either). To ease the pain for all you BI folks out there that simply want to party on the data, I have loaded it all into the SQL Azure database that I use for hosting AdventureWorks on Azure. You can read more about AdventureWorks on Azure below; however, I'll summarise here by saying it is a SQL Azure database provided for the use of the SQL Server community and which is supported by voluntary donations. To view the data, the credentials you need are:

    Server: mhknbn2kdz.database.windows.net
    Database: AdventureWorks2012
    User: sqlfamily
    Password: sqlf@m1ly

    Type those into SSMS and away you go; the data is provided in four tables, [olympics].[Sport], [olympics].[Discipline], [olympics].[Event] & [olympics].[Medalist].

    I figured this would be a good candidate for a Power View report, so I fired up Excel 2013 and built such a report to slice'n'dice through the data. Here are some screenshots that should give you a flavour of what is available:

    A view of all the available data
    Where do all the gymnastics medals go?
    Which countries do the top ten all-time medal winners come from?

    You get the idea. There is masses of information here, and if you have Excel 2013 handy, Power View provides a quick and easy way of surfing through it. To save you the bother of setting up the Power View report yourself you can have the one that I took these screenshots from; it is available on my SkyDrive at OlympicsAnalysis.xlsx, so just hit the link and download to play to your heart's content. Party on, people!

    As I said above, the data is hosted on a SQL Azure database that I use for hosting "AdventureWorks on Azure", which I first announced in March 2013 at AdventureWorks2012 now available for all on SQL Azure. I'll repeat the pertinent parts of that blog post here:

    I am pleased to announce that as of today ... [AdventureWorks2012] now resides on SQL Azure and is available for anyone, absolutely anyone, to connect to and use for their own means. This database is free for you to use, but SQL Azure is of course not free, so before I give you the credentials please lend me your eyes for a short while longer. AdventureWorks on Azure is being provided for the SQL Server community to use, and so I am hoping that that same community will rally around to support this effort by making a voluntary donation to support the upkeep which, going on current pricing, is going to be $119.88 per year. If you would like to contribute to keep AdventureWorks on Azure up and running for that full year please donate via PayPal to [email protected]. Any amount, no matter how small, will help. If those 50+ people that retweeted me beforehand all contributed $2 then that would just about be enough to keep this up for a year.

    If the community contributes more than we need then there are a number of additional things that could be done: host additional databases (Northwind anyone??); host in more datacentres (this first one is in Western Europe); make a charitable donation. That last one, a charitable donation, is something I would really like to do. The SQL Community have proved before that they can make a significant contribution to charitable organisations through purchasing the SQL Server MVP Deep Dives book, and I harbour hopes that AdventureWorks on Azure can continue in that vein. So please, if you think AdventureWorks on Azure is something that is worth supporting, please make a contribution.

    I'd like to emphasize that last point. If my hosting this Olympics data is useful to you, please support this initiative by donating. Thanks in advance.

    @Jamiet

    Read the article

  • Extracting the contents of connection strings in web.config in a Silverlight Business application

    - by webKite
    I am trying to read the Data Source and Initial Catalog from a connection string in web.config in a Silverlight Business project. Unfortunately, when I used SqlConnectionStringBuilder, I could not read the connection string that has connectionString="metadata=res://*/MainDatabase.Main.csdl|res://*/MainDatabase.Main.ssdl|.......", whereas it works for connectionString="Data Source=My-PC\SQL_2008;Initial Catalog =....". I could get them using Split; however, I don't like that solution. Is there any way to meet my requirements? Thanks
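
    Assuming the parsing can happen on the web (server) side of the project rather than in the Silverlight client (which does not ship System.Data), one hedged sketch that avoids Split is to unwrap the Entity Framework string with EntityConnectionStringBuilder first; the helper name below is invented, and it assumes a reference to System.Data.Entity:

    using System.Data.EntityClient;   // EntityConnectionStringBuilder
    using System.Data.SqlClient;      // SqlConnectionStringBuilder

    public static class ConnectionStringInspector
    {
        public static void GetServerAndDatabase(string efConnectionString,
            out string dataSource, out string initialCatalog)
        {
            // The EF "metadata=res://*/..." string wraps a normal provider
            // connection string; unwrap that inner string first.
            var entityBuilder = new EntityConnectionStringBuilder(efConnectionString);

            // The plain "Data Source=...;Initial Catalog=..." part now parses fine.
            var sqlBuilder = new SqlConnectionStringBuilder(entityBuilder.ProviderConnectionString);

            dataSource = sqlBuilder.DataSource;
            initialCatalog = sqlBuilder.InitialCatalog;
        }
    }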

    Read the article

  • How to redirect a form's get variables with IIS7

    - by Gareth
    Is it possible (and if so, how) to rewrite the URL from a form's GET variables into something more URL-friendly using IIS 7's URL Rewrite module? An example would be /Catalog/Search?Title=Something&Order=Price into /Catalog/Search/Title/Something/Order/Price. Thanks for any suggestions

    Read the article

  • Handling extra newlines in csv files parsed with Python?

    - by rmihalyi
    I have a CSV file that contains extra newlines in some fields, e.g.:

    A, B, C, D, E, F
    123, 456, tree
     , very, bla, indigo

    I tried the following:

    import csv
    catalog = csv.reader(open('test.csv', 'rU'), delimiter=",", dialect=csv.excel_tab)
    for row in catalog:
        print "Length: ", len(row), row

    and the result I got was this:

    Length: 6 ['A', ' B', ' C', ' D', ' E', ' F']
    Length: 3 ['123', ' 456', ' tree']
    Length: 4 [' ', ' very', ' bla', ' indigo']

    Does anyone have any idea how I can quickly remove extraneous newlines? Thanks!

    Read the article

  • Microsoft Indexing Service

    - by Daz
    Hello, I want to utilise the Microsoft Indexing Service, but I don't like the fact that you need to manually index the files and build the catalog. Is there a way to automate this, e.g. call an API from code, run it as a scheduled task, or perhaps use command-line commands that will index a folder and build the catalog? This is for searching a document repository built using ASP.NET. Thanks

    Read the article

  • A dacpac limitation – Deploy dacpac wizard does not understand SqlCmd variables

    - by jamiet
    Since the release of SQL Server 2012 I have become a big fan of using dacpacs for deploying SQL Server databases (for reasons that I will explain some other day) and I chose to use a dacpac to distribute my recently announced utility sp_ssiscatalog (read: Introducing sp_ssiscatalog (v1.0.0.0)). Unfortunately if you read that blog post you may have taken note of the following:

    Ordinarily a dacpac can be deployed to a SQL Server from SSMS using the Deploy Dacpac wizard, however in this case there is a limitation. Due to sp_ssiscatalog referring to objects in the SSIS Catalog (which it has to do of course) the dacpac contains a SqlCmd variable to store the name of the database that underpins the SSIS Catalog; unfortunately the Deploy Dacpac wizard in SSMS has a rather gaping limitation in that it cannot deploy dacpacs containing SqlCmd variables.

    I think it is worth calling out this limitation separately in this blog post because it's a limitation that all dacpac users need to be aware of. If you try and deploy the dacpac containing sp_ssiscatalog using the wizard in SSMS then this is what you will see:

    TITLE: Microsoft SQL Server Management Studio
    ------------------------------
    Could not deploy package. (Microsoft.SqlServer.Dac)
    ------------------------------
    ADDITIONAL INFORMATION:
    Missing values for the following SqlCmd variables: SSISDB. (Microsoft.Data.Tools.Schema.Sql)
    ------------------------------
    BUTTONS:
    OK
    ------------------------------

    The message is quite correct. The SSDT DB project that I used to build this dacpac *does* have a SqlCmd variable in it called SSISDB. Quite simply, the Dac Deployment wizard in SSMS is not capable of deploying such dacpacs. Your only option for deploying such dacpacs is to use the command-line tool sqlpackage.exe. Generally I use sqlpackage.exe anyway (which is why it has taken me months to encounter the aforementioned problem) and have found it preferable to using a GUI-based wizard. Your mileage may vary.

    @Jamiet
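
    For the record, with sqlpackage.exe the variable is supplied on the command line with an argument of the form /v:SSISDB=SSISDB. And if you would rather script the deployment from .NET, something like the following hedged sketch against the DacFx API should do it; the connection string, dacpac path and database name here are placeholders, and it assumes a reference to Microsoft.SqlServer.Dac.dll from the same SQL Server 2012 tooling:

    using Microsoft.SqlServer.Dac;

    class DeployDacpacWithSqlCmdVariable
    {
        static void Main()
        {
            // Placeholder target server; point this at your own instance.
            var services = new DacServices("Data Source=(local);Integrated Security=SSPI;");

            using (DacPackage package = DacPackage.Load(@"C:\Temp\MyUtility.dacpac"))
            {
                var options = new DacDeployOptions();

                // The part the SSMS wizard has no UI for: supplying a value
                // for the SqlCmd variable that names the SSIS Catalog database.
                options.SqlCommandVariableValues["SSISDB"] = "SSISDB";

                services.Deploy(package, "MyUtilityDb", upgradeExisting: true, options: options);
            }
        }
    }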

    Read the article

  • CodePlex Daily Summary for Friday, April 16, 2010

    CodePlex Daily Summary for Friday, April 16, 2010

    New Projects
    [C#] UML Touch: This project is my master thesis project for MSc Software Engineering at Oxford Brookes university. The goal of this project is to develop a UML Edi...
    3D Chart Reports, using ASP.NET 2.0: Project focuses on the visibility of the internal supply chain management process. The objective is materialized using MSChart and hence provides asse...
    Active Directory Health: ActiveDirectoryHealth (ADHealth) Aim: To create a suite of tools to allow an Active Directory administrator monitor and identify problems, particu...
    Art4Desktop: Simple desktop application
    Auto Increment field for any entity in MS CRM: Auto Increment field for any entity in MS CRM will add an attribute in any entity which will be of auto-increment type.
    Convection Game Engine (Basic Edition): A basic 2D game engine written in C# using XNA 3.1
    CSharp Intellisense: VS 2010 IntelliSense plug-in. Provides custom IntelliSense for the CSharp Editor that is capable of filtering out events, properties or methods. ...
    EP: Generic platform for enterprise applications developed on .NET
    File Change Checker: A simple Windows Forms application that checks and copies all modified files given a specified date. It aims to help developers that use Visual Stu...
    Folder: Folder is a puzzle platformer game which provides you a whole new experience. You can fold objects in game play; a wall becomes a road, a high sti...
    Industrial Dashboard: WCF service that allows executing SQL Server stored procedures straight from JavaScript code, enabling sending and receiving structured data withou...
    kafrilion: Open Source Platform Game
    LogikBug.Injection: LogikBug.Injection is an IoC container and is very lightweight. It is very simple to use and yet very robust; there are many options and ways to ...
    MongoMvc: NoSql with MongoDB, NoRM and ASP.NET MVC. For more details check out the blog post http://weblogs.asp.net/shijuvarghese/archive/2010/04/16/nosql-wi...
    MOSSDAL: MOSS Data Access Layer for data from the Sharepoint Lists Service: MOSSDAL is a lightweight framework for working with SharePoint MOSS List data using the List web service. It can be used with Silverlight or regul...
    Should: This is a set of test-framework-agnostic assertion extensions. This project was born because test runners Should be independent of the...
    Software Localization Tool: summary
    TIMETABLEASY: planning manager
    TR9: implementations of T9 technologies from mobile device to personal computer and laptop
    Trp net tools: net tool
    Twep: Twep is a JavaScript lib.
    Using PowerPivot to analyze MS Dynamics NAV: Project shows how to prepare MS Dynamics NAV data for analysis in PowerPivot for Excel. Project includes Data Warehouse demo database, SQL procedure...
    vmware virtual mashines management system for education: System for managing many classes with PCs and VMware virtual machines (bad english, sorry)
    WebSite: Mr. John's projects
    Webtrends DX and DC Services Libraries: This project is set up to provide assemblies to simplify access to the REST-based DX and DC Web Services that Webtrends provides. For access you ne...
    YetAnotherFileRenamer: Searches a directory and/or sub-directories for files matching a certain extension and copies and renames them to the same folder. The intent is a...

    New Releases
    3D Chart Reports, using ASP.NET 2.0: Beta iscm1.0.0.1: Its first release; might be tolerant to bugs.
    A Guide to Parallel Programming: Drop 2 - Guide Chapters 2 and 5, accompanying code: This is Drop 2 with just the Guide Chapters 2 and 5 and the accompanying code samples. This drop requires Visual Studio 2010 Beta 2 or later in ord...
    AnimeJ: AnimeJ 1.1: This new release features reversible animations. It is fully backward compatible, and by simply specifying an additional parameter to Run() you can...
    AutoPoco: AutoPoco 0.4: A load of additions to the convention system. Inheritance precedence added, and a pile of extra data sources.
    CSharp Intellisense: V1.5: CSharpIntellisensePresenter (Free). Created by: Bnaya Eshet (credit to Karl Shifflett). Last Updated: Tuesday, April 13, 2010. Version...
    DevTreks - social budgeting that improves lives and livelihoods: Social Budgeting Web Software, DevTreks alpha 4: Alpha 4 upgrades all story-telling with one very basic, simple, 'story' data pattern, schema, and stylesheet. Basic, simple stories and eBook pa...
    ESPEHA: Espeha 4.3: Deletion of categories and tasks via F8. Saving on Ctrl+S, Ctrl+Shift+S. Navigating to search on Ctrl+F. Hiding on Ctrl+Q, Ctrl+X, Ctrl+H, Q, X, H + Drag ...
    Folder Bookmarks: Folder Bookmarks 1.4.4: This is the latest version of Folder Bookmarks (1.4.4), with general improvements. It has an installer - it will create a directory 'CPascoe' in My...
    HouseFly controls: HouseFly controls alpha 0.9.1: Version 0.9.1 alpha release of HouseFly controls
    Industrial Dashboard: 2.0 Beta: 2.0 Beta includes: IndustrialDashboard Framework; sample widgets: TabularReport, DropDownPicker, DatePicker, ScopePicker; sample html pages
    kdar: KDAR 0.0.20: KDAR - Kernel Debugger Anti Rootkit - signature bases updated - many bugs fixed
    LogikBug.Injection: Initial Release: This project is dependent upon Microsoft.Practices.ServiceLocation.IServiceLocator, which must be referenced when referencing LogikBug.Injection.
    Mesopotamia Experiment: Mesopotamia 1.2.72: New Features - Select organisms to load in sim or in hw mode. Fixes - fixed robotics engine not shutting down properly - selects new robotics port o...
    MobySharp: MobySharp 1.1: Fixed GetComments. Added the new Likes feature.
    MongoMvc: NoSQL with MongoDB, NoRM and ASP.NET MVC: A demo on NoSQL with MongoDB, NoRM and ASP.NET MVC. To run the demo application do the following actions: 1. Create a directory called C:\data\...
    NetMassDownloader: Release 1.6.0.0: Mass Downloader for .NET Framework which allows you to download .NET Framework source code all at once. Offline debugging for VS2005, VS2008, Code...
    Nito.KitchenSink: Version 4: Features added in this release: PDBs are source-indexed to CodePlex, so it should be possible to step through the code (with on-demand downloads) w...
    Nito.KitchenSink: Version 5: The only feature for this release is support of the .NET 4.0 platform. Dependencies: Nito.Linq 0.4 Beta (released 2010-04-15), Rx 1.0.2441.0 (rele...
    Nito.LINQ: Beta (v0.4): Rx version: The "with Rx" versions of Nito.LINQ are built against Rx 1.0.2441.0, released 2010-04-15. Breaking changes: IEnumerable<T>.Min and IEnum...
    N-Twill Twitter Client for VB.NET: NTWILL PROJECT 15-abril-2010: This file is a Zip containing the project as edited up to this point... it has a few odd functions, but I am posting it to have a referen...
    PanBrowser: 1.2.0: Added screensaver support; hit a key to exit; up/down keys cycle through images.
    PersianDateTimePicker, PersianMonthCalendar: PersianDatetimePicker, PersianMonthCalendar: PersianDatetimePicker, PersianMonthCalendar 1.1 releases
    PokeIn Comet Ajax Library: PokeIn Library v03: New features! Client MessageLog Management added. CrossBrowser Script Injector added, to work with non-Ajax-capable browsers and ActiveX-disabled...
    PokeIn Comet Ajax Library: PokeIn Sample VS09: Sample project of PokeIn for VS09. Contains version 0.3 of the library.
    PokeIn Comet Ajax Library: PokeIn Sample VS10: Sample Visual Studio 10 project for PokeIn. Contains the v03 library of PokeIn.
    Silverlight Toolkit: April 2010: Suggestions? Features? Questions? Ask questions in our Silverlight Toolkit forum on Silverlight.net. The forum is the best resource for Silverlig...
    Software Localization Tool: SharpSLT 0.9: This is the first release of SharpSLT. Features - Framework: .NET 3.5. UI language: English. Works standalone and as an External Tool for Visual C#. Ex...
    Star Trooper for XNA 2D Tutorial: Lesson two content: Here is Lesson two original content for the StarTrooper 2D XNA tutorial. The blog tutorial has now started over on http://xna-uk.net/blogs/darkgen...
    StyleCop for ReSharper: StyleCop for ReSharper 5.0.14714.1: StyleCop for ReSharper 5.0.14714.1: StyleCop 4.3.3.0, ReSharper 5.0.1659.36, Visual Studio 2008 / Visual Studio 2010. Installation instructi...
    TaskUnZip for SSIS: TaskUnZip for SSIS 1.1.1.0 (beta 2): Ver. 1.1.1.0 (beta2). Bug: Correct installation bug. Add: Support SQL SERVER 2008 / 2005. Minor corrections. Ver. 1.1.0.0 (Tnx to: Kevin Wendler)...
    TaskUnZip for SSIS: TaskUnZip for SSIS 1.2.0.0: Ver. 1.2.0.0. Add: Support SQL SERVER 2008 / 2005. Add: recursive compress.* Add: filter option for extract and compress file.* Add: Test archiv...
    Test Project (ignore): abcde: aaaaada
    TR9: TR9 Beta: This setup is the first release of the program.
    TR9: TR9 Source Code: TR9 Source Code
    Using PowerPivot to analyze MS Dynamics NAV: GL sql procedures and demo DB: For using the SQL procedures and creating the DW database, please see Documentation.
    VivoSocial: VivoSocial 7.1.1: Version 7.1.1 of VivoSocial has been released. If you experienced any issues with the previous version, please update your modules to the 7.1.1 rel...
    WPF Inspirational Quote Management System: Release 1.1.1: - Changed to only allow one running instance at a time. - Custom icon in system tray popup removed for now as this breaks when text size is set to gr...
    WPF ShaderEffect Generator: WPF ShaderEffect Generator 1.6.1: Just a minor release to fix the problem with resource Uri generation in the C# ShaderEffect files. Changes: The Uri should be correctly generated no...

    Most Popular Projects
    Rawr; WBFS Manager; AJAX Control Toolkit; Microsoft SQL Server Product Samples: Database; Silverlight Toolkit; Windows Presentation Foundation (WPF); ASP.NET; Microsoft SQL Server Community & Samples; PHPExcel; patterns & practices – Enterprise Library

    Most Active Projects
    Rawr; patterns & practices – Enterprise Library; GMap.NET - Great Maps for Windows Forms & Presentation; Industrial Dashboard; Farseer Physics Engine; Ionics Isapi Rewrite Filter; NB_Store - Free DotNetNuke Ecommerce Catalog Module; BlogEngine.NET; jQuery Library for SharePoint Web Services; DotRas

    Read the article

  • Script Task/Component and Template Information

    The Script Task and Script Component are often used by people developing SSIS packages because they are easy to use, and now that SSIS could be perceived to be more developer friendly they are very powerful. That being said, we should not be using them everywhere. There are generally Tasks/Components already provided that will do the job; it may be that we have to rethink the way we want to draw our package. I had cause last week to break out the Script Component in SQL Server 2008 SP1 and found that it was broken. I don't know when it broke, as I do not use them all that often. My error was as below.

    Something must have overwritten this template information. I looked in Event Viewer and tried the things it suggested, but the templates still did not work. Here is how I eventually got them to work for me (your mileage may vary). Open up a Command Prompt window using an administrator-level account and "as an administrator":

    vsta.exe /hostid SSIS_ScriptTask /setup
    vsta.exe /hostid SSIS_ScriptComponent /setup

    This worked for me. Hope it helps.

    Read the article

  • Migrating data from SQL Server 2000 to SQL Server 2005

    - by Muhammad Kashif Nadeem
    I have to migrate existing data from SQL Server 2000 to SQL Server 2005. The schemas of the two databases are different. For example, the Locations table in SS2000 is split into two tables and has different columns. This is a one-time activity; after a successful migration I don't need the old db anymore. What is the best way to transfer data from one SQL Server to another when they have different schemas? I can write stored procedures to fetch data from SQL Server 2000 and insert/update tables in SQL Server 2005. What about SSIS? I don't have any experience with it, and is it better to create an SSIS package when I don't need it again and would need to learn it first? Thanks.
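
    To make the stored-procedure idea concrete, here is the kind of hedged sketch I have in mind (server, table and column names are all invented for illustration): read from the 2000 instance with a SELECT that reshapes the data, then push the rows into the 2005 schema with SqlBulkCopy.

    using System.Data;
    using System.Data.SqlClient;

    class OneTimeMigration
    {
        static void Main()
        {
            using (var source = new SqlConnection("Data Source=OldServer;Initial Catalog=LegacyDb;Integrated Security=SSPI;"))
            using (var target = new SqlConnection("Data Source=NewServer;Initial Catalog=NewDb;Integrated Security=SSPI;"))
            {
                source.Open();
                target.Open();

                // Reshape during the read: rename/split columns to fit the new schema.
                var cmd = new SqlCommand(
                    "SELECT LocationId, LocationName, Address FROM dbo.Locations", source);

                using (IDataReader reader = cmd.ExecuteReader())
                using (var bulk = new SqlBulkCopy(target))
                {
                    bulk.DestinationTableName = "dbo.LocationSites";
                    bulk.ColumnMappings.Add("LocationId", "SiteId");
                    bulk.ColumnMappings.Add("LocationName", "SiteName");
                    bulk.ColumnMappings.Add("Address", "StreetAddress");
                    bulk.WriteToServer(reader); // streams rows without loading them all in memory
                }
            }
        }
    }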

    Read the article

  • Master Data

    - by david.butler(at)oracle.com
    Let's take a deeper look at what we mean when we talk about 'Master' data. In its most general sense, master data is data that exists in more than one operational application. These are the applications that automate business processes. These applications require significant amounts of data to function correctly. This includes data about the objects that are involved in transactions, as well as the transaction data itself. For example, when a customer buys a product, the transaction is managed by a sales application. The objects of the transaction are the Customer and the Product. The transactional data is the time, place, price, discount, payment methods, etc. used at the point of sale. Many thousands of transactional data attributes are needed within the application. These important data elements are local to the applications and have no bearing on other applications. Harmonization and synchronization across applications is not necessary.

    The Customer and Product objects of the transaction also have a large number of attributes. Customer, for example, includes hierarchies, hierarchical and matrixed relationships, contacts, classifications, preferences, accounts, identifiers, profiles, and addresses galore for 'ship to', 'mail to', 'service at', etc. Dozens of attributes exist for individuals, hundreds for organizations, and thousands for products. This data has meaning beyond any particular application. It exists in many applications and drives the vital cross-application enterprise business processes. These are the processes that define and differentiate the organization. At every decision point, information about the objects of the process determines the direction of the process flow. This is the nature of the data that exists in more than one application, and this is why we call it 'master data'. Let me elaborate.

    Parties
    Oracle has developed a party schema to model all participants in your daily business operations. It models people, organizations, groups, customers, contacts, employees, and suppliers. It models their accounts, locations, classifications, and preferences. And most importantly, it models the vast array of hierarchical and matrixed relationships that exist between all the participants in your real-world operations. The model logically separates people and organizations from their relationships and accounts. This separation creates flexibility unmatched in the industry and accounts for the fact that the Oracle schema for Customers, Suppliers, and Accounts is a true superset of the wide variety of commercial and homegrown customer models in existence.

    Sites
    Sites are places where business is conducted. They can be addresses, clusters such as retail malls, locations within a cluster, floors within a building, places where meters are located, rooms on floors, etc. Fully understanding all attributes of a site is key to many business processes. Attributes such as the 'noise abatement policy' at a point of delivery, or the size of an oven in a business kitchen, drive day-to-day activities such as delivery schedules or food promotions. Typically this kind of data is siloed in departments and scattered across applications and spreadsheets. This leads to conflicting information and poor operational efficiencies. Oracle's Global Single Schema can hold all site attributes in one place and enables a single version of authoritative site information across the enterprise.

    Products and Services
    The Oracle Global Single Schema also includes a number of entities that define the products and services a company creates and offers for sale. Key entities include Items organized into Catalogs and Price Lists. The Catalog structures provide the ability to capture different views of a product, such as engineering, manufacturing, and service, which are based on a unified product model. As a result, designers, manufacturing engineers, purchasers and partners can work simultaneously on a common product definition. The Catalog schema allows for unlimited attributes, combines them into meaningful groups, and maps them to catalog categories to track these different types of information. The model also maps an unlimited number of functional structures for each item. For example, multiple Bills of Material (BOMs) can be constructed representing a requirements BOM, a features BOM, and a packaging BOM for an item. The Catalog model also supports hierarchical information about each item and all standard Global Data Synchronization attributes.

    Business Processes Utilizing Linked Data Entities
    Each business entity codified into a centralized master data environment significantly improves the efficiency of the automated business processes that use the consolidated data. When all the key business entities used by an organization's processes are so consolidated, the advantages are multiplied. The primary reason for business process breakdowns (i.e. data errors across application boundaries) is eliminated. All processes are positively impacted and business process automation is itself automated. I like to use the "Call to Resolution" business process as an example to help illustrate this important point. It involves call center applications, service applications, RMA applications, transportation applications, inventory applications, etc. Customer, Site, Product and Supplier master data must all be correct and consistent across these applications. What's more, the data relationships between customer and product, and product and suppliers, must be right. This is the minimum quality needed to ensure the business process flows without error. But that is not the end of the story. Critical master data attributes such as customer loyalty, profitability, credit worthiness, and propensity to buy can optimize the call center point-of-contact component of the process. Critical product information such as alternative parts or equivalent products can optimize the resolution selected by the process. A comprehensive understanding of the 'service at' location can help ensure multiple trips are avoided in the process. Full supplier information on reliability, delivery delays, and potential alternates can prevent supplier exceptions and play a significant role in optimizing the process. In other words, these master data attributes enable the optimization of the "Call to Resolution" enterprise business process. Master data supports and guides business process flows. Thus the phrase 'Master Data' is indeed appropriate.

    MDM is the software that houses, manages, and governs the master data that resides in all applications and controls the enterprise business processes. A complete master data solution takes a data model that holds fully attributed master data entities and their inter-relationships. Oracle has this model. Oracle, with its deep understanding of application data, is the logical choice for managing all your master data within the enterprise, whether or not your organization actually runs any Oracle Applications.

    Read the article

  • DAO/Webservice Consumption in Web Application

    - by Gavin
    I am currently working on converting a "legacy" web-based (ColdFusion) application from a single data source (MSSQL database) to multi-tier OOP. In my current system there is a read/write database with all the usual stuff, and additional "read-only" databases that are exported daily/hourly from an Enterprise Resource Planning (ERP) system by SSIS jobs, with business product/item and manufacturing/SCM planning data. The reason I have the opportunity and need to convert to multi-tier OOP is that a newer, more modern ERP system is being implemented business-wide that will be a complete replacement. This newer ERP system offers several interfaces for third-party applications like mine, from direct SQL access to either a .NET web service or a SOAP-like web service. I have found several suitable frameworks I would be happy to use (ColdSpring, FW/1), but I am not sure what design patterns apply to my data access object/component and how to manage the connection/session tokens. With this background, my question has the following three parts:

    Firstly, I have concerns with moving from the relative safety of an SSIS job, which protects me from downtime and from the speed of the ERP system, to directly connecting with one of the web services, which I note seem significantly slower than I expected (simple/small requests often take up to a whole second). Are there any design patterns I can investigate/use to cache/protect my data tier?

    Secondly, it is my understanding that data access objects (the components that connect directly with the web services and convert their responses into the data types I can then work with in my domain objects) should be singletons (and will act as an Adapter/Facade); am I correct?

    Thirdly, as part of the data access object I have to set up a connection by username/password (I could set up multiple users and/or connect multiple times with this), which responds with a session token that needs to be provided on every subsequent request. Do I do this once and share it across the whole application? Do I set up a new "connection" for every user of my application and keep the token in their session scope (which might quickly hit licensing limits)? Do I set the "connection" up per page request? Or is there a design pattern I am missing that can manage multiple "connections", where a request uses the first free "connection"? It is worth noting that if the ERP system dies I will need to reset/invalidate all the connections and start from scratch, and depending on which web service I use I might need to manually close the "connection/session".
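
    To show the shape of what I mean in the third part, here is a hedged C# sketch (every type and member name below is invented; my app is ColdFusion, but the pattern translates): a single shared gateway that lazily logs in, caches the session token, and re-authenticates once if the ERP invalidates it.

    using System;

    // Hypothetical ERP client; stands in for the generated web-service proxy.
    public interface IErpService
    {
        string Login(string user, string password);  // returns a session token
        string Query(string token, string request);  // any data call
    }

    // One instance for the whole application (register as a singleton in the IoC container).
    public sealed class ErpGateway
    {
        private readonly IErpService service;
        private readonly object sync = new object();
        private string token;

        public ErpGateway(IErpService service) { this.service = service; }

        public string Query(string request)
        {
            try
            {
                return service.Query(GetToken(), request);
            }
            catch (UnauthorizedAccessException)  // stand-in for "session expired"
            {
                InvalidateToken();
                return service.Query(GetToken(), request);  // one retry with a fresh login
            }
        }

        private string GetToken()
        {
            lock (sync)
            {
                // Lazily authenticate; credentials would come from configuration.
                return token ?? (token = service.Login("svc-account", "secret"));
            }
        }

        private void InvalidateToken()
        {
            lock (sync) { token = null; }
        }
    }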

    Read the article

  • IIS needs to be restarted every morning

    - by Kevin
    On one of my application and DB servers, an SSIS package runs at night, and every morning I need to reset IIS to get the application working fast and smoothly again. One day I tried skipping the SSIS package, and the next day I didn't have to do the IIS reset. What could be the problem? Is there an alternative solution to an IIS reset? How can I schedule the reset, and make sure IIS is restarted, through a batch file or script? The application is developed in .NET and the DB is the latest version of SQL Server. The application is hosted on a cloud server. Your prompt reply will be helpful for me.
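
    For the scheduling part, one hedged approach (the task name and start time below are placeholders) is to let the Windows Task Scheduler run the reset itself from an elevated account:

    schtasks /create /tn "MorningIISReset" /tr "iisreset /restart" /sc daily /st 06:00

    That only treats the symptom, though: if skipping the nightly SSIS package removes the need for the reset, the package itself (or whatever memory pressure it may leave behind on the server) is the thing worth investigating.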

    Read the article

  • Problem transferring file to FTP server - 553 Could not create file

    - by blntechie
    I am trying to transfer a file from an SSIS package to an FTP server provided by my client. When I try connecting and transferring using an FTP tool (WinSCP) manually, it works fine and the file is transferred. But when I use the ftp command in a command prompt and try to transfer using the "send" command, I get the error "553 Could not create file". The same happens from the SSIS package. I'm not in control of the FTP server, but if anyone can provide a solution I can pass it to the admin - or is there anything to be done from my end?

    Read the article

  • XSLT adding elements on the same path

    - by Stefan
    Consider the following XML:

    <?xml version="1.0" encoding="ISO-8859-1"?>
    <catalog>
      <cd>
        <title>Empire Burlesque</title>
        <artist>
          <name>Bob</name>
          <surname>Dylan</surname>
        </artist>
        <country>USA</country>
        <company>Columbia</company>
        <price>10.90</price>
        <year>1985</year>
      </cd>
    </catalog>

    I want to add elements to this XML using XSLT, to get the following result:

    <?xml version="1.0" encoding="ISO-8859-1"?>
    <catalog>
      <cd>
        <title>Empire Burlesque</title>
        <artist>
          <name>Bob</name>
          <surname>Dylan</surname>
          <!-- NEW -->
          <middlename>???</middlename>
        </artist>
        <country>USA</country>
        <company>Columbia</company>
        <price>10.90</price>
        <year>1985</year>
        <!-- NEW -->
        <comment>great one</comment>
      </cd>
      <!-- NEW -->
      <cd>
        <title>Hide your heart</title>
        <artist>
          <name>Bonnie</name>
          <surname>Tyler</surname>
        </artist>
        <country>UK</country>
        <company>CBS Records</company>
        <price>9.90</price>
        <year>1988</year>
      </cd>
    </catalog>

    To achieve that, I wrote the following XSLT:

    <?xml version="1.0" encoding="ISO-8859-1"?>
    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template name="injectXml">
        <xsl:param name="whatToInject"/>
        <xsl:copy>
          <xsl:copy-of select="node() | @*"/>
          <xsl:copy-of select="$whatToInject"/>
        </xsl:copy>
      </xsl:template>
      <xsl:template match="//catalog">
        <xsl:call-template name="injectXml">
          <xsl:with-param name="whatToInject">
            <cd>
              <title>Hide your heart</title>
              <artist>
                <name>Bonnie</name>
                <surname>Tyler</surname>
              </artist>
              <country>UK</country>
              <company>CBS Records</company>
              <price>9.90</price>
              <year>1988</year>
            </cd>
          </xsl:with-param>
        </xsl:call-template>
      </xsl:template>
      <xsl:template match="//cd[year=1985]">
        <xsl:call-template name="injectXml">
          <xsl:with-param name="whatToInject">
            <comment>great one</comment>
          </xsl:with-param>
        </xsl:call-template>
      </xsl:template>
      <xsl:template match="//cd[year=1985]/artist">
        <xsl:call-template name="injectXml">
          <xsl:with-param name="whatToInject">
            <middlename>???</middlename>
          </xsl:with-param>
        </xsl:call-template>
      </xsl:template>
    </xsl:stylesheet>

    Why is it not working, and how can I do it?

    Read the article

  • MEF + Plug-In not updating

    - by mybrokengnome
    I asked this on the MEF Codeplex forum already, but I haven't gotten a response yet, so I figured I'd try Stack Overflow. Here's the original post if anyone's interested (this is just a copy of it): MEF Codeplex

    "Let me first say that I'm completely new to MEF (just discovered it today) and am very happy with it so far. However, I've run into a problem that is very frustrating. I'm creating an app that will have a plugin architecture, and the plugins will only be stored in a single DLL file (or coded into the main app). The DLL file needs to be able to be recompiled during run-time, and the app should recognize this and re-load the plugins (I know this is difficult, but it's a requirement). To accomplish this I took the approach covered at http://blog.maartenballiauw.be/category/MEF.aspx (look for WebServerDirectoryCatalog). Basically the idea is to "monitor the plugins folder, copy the new/modified assemblies to the web application's /bin folder and instruct MEF to load its exports from there." This is my code, which is probably not the correct way to do it, but it's what I found in some samples around the net:

    main()...

    string myExecName = Assembly.GetExecutingAssembly().Location;
    string myPath = System.IO.Path.GetDirectoryName(myExecName);

    catalog = new AggregateCatalog();
    pluginCatalog = new MyDirectoryCatalog(myPath + @"/Plugins");
    catalog.Catalogs.Add(pluginCatalog);
    exportContainer = new CompositionContainer(catalog);

    CompositionBatch compBatch = new CompositionBatch();
    compBatch.AddPart(this);
    compBatch.AddPart(catalog);
    exportContainer.Compose(compBatch);

    and

    private FileSystemWatcher fileSystemWatcher;
    public DirectoryCatalog directoryCatalog;
    private string path;
    private string extension;

    public MyDirectoryCatalog(string path)
    {
        Initialize(path, "*.dll", "*.dll");
    }

    private void Initialize(string path, string extension, string modulePattern)
    {
        this.path = path;
        this.extension = extension;

        fileSystemWatcher = new FileSystemWatcher(path, modulePattern);
        fileSystemWatcher.Changed += new FileSystemEventHandler(fileSystemWatcher_Changed);
        fileSystemWatcher.Created += new FileSystemEventHandler(fileSystemWatcher_Created);
        fileSystemWatcher.Deleted += new FileSystemEventHandler(fileSystemWatcher_Deleted);
        fileSystemWatcher.Renamed += new RenamedEventHandler(fileSystemWatcher_Renamed);
        fileSystemWatcher.IncludeSubdirectories = false;
        fileSystemWatcher.EnableRaisingEvents = true;

        Refresh();
    }

    void fileSystemWatcher_Renamed(object sender, RenamedEventArgs e)
    {
        RemoveFromBin(e.OldName);
        Refresh();
    }

    void fileSystemWatcher_Deleted(object sender, FileSystemEventArgs e)
    {
        RemoveFromBin(e.Name);
        Refresh();
    }

    void fileSystemWatcher_Created(object sender, FileSystemEventArgs e)
    {
        Refresh();
    }

    void fileSystemWatcher_Changed(object sender, FileSystemEventArgs e)
    {
        Refresh();
    }

    private void Refresh()
    {
        // Determine /bin path
        string binPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Plugins");
        string newPath = "";

        // Copy files to /bin
        foreach (string file in Directory.GetFiles(path, extension, SearchOption.TopDirectoryOnly))
        {
            try
            {
                DirectoryInfo dInfo = new DirectoryInfo(binPath);
                DirectoryInfo[] dirs = dInfo.GetDirectories();
                int count = dirs.Count() + 1;
                newPath = binPath + "/" + count;
                DirectoryInfo dInfo2 = new DirectoryInfo(newPath);
                if (!dInfo2.Exists)
                    dInfo2.Create();
                File.Copy(file, System.IO.Path.Combine(newPath, System.IO.Path.GetFileName(file)), true);
            }
            catch
            {
                // Not that big a deal... Blog readers will probably kill me for this bit of code :-)
            }
        }

        // Create new directory catalog
        directoryCatalog = new DirectoryCatalog(newPath, extension);
        directoryCatalog.Refresh();
    }

    public override IQueryable<ComposablePartDefinition> Parts
    {
        get { return directoryCatalog.Parts; }
    }

    private void RemoveFromBin(string name)
    {
        string binPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "");
        File.Delete(Path.Combine(binPath, name));
    }

    So all this actually works, and after the end of the code in main my IEnumerable variable is actually filled with all the plugins in the DLL (which, if you follow the code, is located in Plugins/1 so that I can modify the dll in the plugins folder). So now at this point I should be able to re-compile the plugins DLL, drop it into the Plugins folder, have my FileWatcher detect that it's changed and copy it into folder "2", and directoryCatalog should point to the new folder. All this actually works! The problem is, even though it seems like everything is pointed to the right place, my IEnumerable variable is never updated with the new plugins. So close, but yet so far! Any suggestions? I know the downsides of doing it this way - no dll is actually getting unloaded, causing a memory leak - but it's a Windows app that will probably be restarted at least once a day, and the plugins are unlikely to change that often; still, it's a requirement from the client that it does this without re-loading the app. Thanks!

    Thanks for any help you all can provide; it's driving me crazy not being able to figure this out."
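
    For anyone else hitting this symptom: two things in MEF have to cooperate before recomposition reaches an import, and a wrapper catalog like the one above satisfies neither by accident. A hedged sketch of both follows (the plugin interface and class names are invented; MEF's own DirectoryCatalog.Refresh() performs this notification dance for you, which is exactly what a thin Parts-only wrapper loses):

    using System;
    using System.Collections.Generic;
    using System.ComponentModel.Composition;
    using System.ComponentModel.Composition.Hosting;
    using System.ComponentModel.Composition.Primitives;
    using System.Linq;

    // 1) The import must opt in to recomposition, or the container will
    //    never push updated exports into it.
    public interface IMyPlugin { }

    public class PluginHost
    {
        [ImportMany(AllowRecomposition = true)]
        public IEnumerable<IMyPlugin> Plugins { get; set; }
    }

    // 2) The catalog must tell the container its parts changed. A wrapper that
    //    only exposes Parts swallows DirectoryCatalog's notifications, so
    //    implement INotifyComposablePartCatalogChanged and raise Changed when
    //    the inner catalog is swapped.
    public class NotifyingCatalog : ComposablePartCatalog, INotifyComposablePartCatalogChanged
    {
        private DirectoryCatalog inner;

        public event EventHandler<ComposablePartCatalogChangeEventArgs> Changed;
        public event EventHandler<ComposablePartCatalogChangeEventArgs> Changing;

        public NotifyingCatalog(string path)
        {
            inner = new DirectoryCatalog(path);
        }

        public override IQueryable<ComposablePartDefinition> Parts
        {
            get { return inner.Parts; }
        }

        // Call this from the FileSystemWatcher handler after copying the new dll.
        public void Swap(string newPath)
        {
            DirectoryCatalog old = inner;
            inner = new DirectoryCatalog(newPath);

            EventHandler<ComposablePartCatalogChangeEventArgs> handler = Changed;
            if (handler != null)
            {
                handler(this, new ComposablePartCatalogChangeEventArgs(
                    inner.Parts.ToArray(), old.Parts.ToArray(), null));
            }
        }
    }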

    Read the article

  • What are best practices for collecting, maintaining and ensuring accuracy of a huge data set?

    - by Kyle West
    I am posing this question looking for practical advice on how to design a system. Sites like Amazon.com and Pandora have and maintain huge data sets to run their core business. For example, Amazon (and every other major e-commerce site) has millions of products for sale, images of those products, pricing, specifications, and so on. Ignoring the data coming in from 3rd-party sellers and the user-generated content, all that "stuff" had to come from somewhere and is maintained by someone. It's also incredibly detailed and accurate. How? How do they do it? Is there just an army of data-entry clerks, or have they devised systems to handle the grunt work? My company is in a similar situation. We maintain a huge (tens of millions of records) catalog of automotive parts and the cars they fit. We've been at it for a while now and have come up with a number of programs and processes to keep our catalog growing and accurate; however, it seems like to grow the catalog to x items we need to grow the team to y. I need to figure out some ways to increase the efficiency of the data team, and hopefully I can learn from the work of others. Any suggestions are appreciated; even better would be links to content I could spend some serious time reading. THANKS! Kyle

    Read the article
