Search Results

Search found 881 results on 36 pages for 'audit trail'.


  • Defining a reliable Silverlight 4 architecture

    - by doteneter
     Hello everybody, it's my first question on SO. I know there have been many topics on Silverlight and architecture, but I didn't find answers that satisfy me. I'm an ASP.NET MVC developer used to working on architectures built with best practices (loose coupling with DI, etc.). Now I'm facing a new Silverlight 4 project and would like to be sure I'm making the best choices, as I'm not experienced. The main features required of the application are as follows: use the existing SQL Server database, but with the possibility of moving to the cloud; use EF4 for data access with SQL Server; extensibility, i.e. adding new modules without changing the main host; loose coupling. I looked at various webcasts (Taulty, etc.) and blogs about Silverlight and came up with the following architecture: EF4 for data access (as specified in the requirements); WCF RIA Services for the mid-tier, controlling access to data for queries and enabling end-to-end support for data validation, authentication and roles; MEF for enabling modules; Unity 2.0 for DI. The problem is that I don't know how to define a reliable architecture where all these elements play well together. Should I use a framework like Prism or Caliburn instead? For now I'm not sure what scenarios they support. What are the best usages for Unity in Silverlight? I used IoC in ASP.NET MVC for loose coupling and other things like interception for audit logging, but it seems that Unity for Silverlight doesn't support interception. I would like to use it to enable loose coupling and to make a later move to the cloud possible. Thanks in advance for your help.
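     As a starting point for the Unity question above, here is a minimal, hedged sketch of a composition root in a Silverlight 4 client. The interface and class names are hypothetical; the registration API shown is Unity 2.0's, which is available in its Silverlight build (interception, as noted in the question, is not).

         using Microsoft.Practices.Unity;

         // Hypothetical abstraction that keeps modules decoupled from the
         // EF4/WCF RIA Services-backed implementation.
         public interface IArticleRepository { }
         public class RiaArticleRepository : IArticleRepository { }

         public static class Bootstrapper
         {
             public static IUnityContainer Configure()
             {
                 var container = new UnityContainer();

                 // Swapping in a cloud-backed repository later only changes
                 // this mapping, not the modules that consume it.
                 container.RegisterType<IArticleRepository, RiaArticleRepository>(
                     new ContainerControlledLifetimeManager());

                 return container;
             }
         }

     Usage would then be container.Resolve<IArticleRepository>() at the host or module boundary.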

    Read the article

  • Using the Salesforce PHP API to generate a User Profile Report

    - by Phill Pafford
     Hi all, I'm looking to do a security audit of all user permissions. I think I can use the Salesforce PHP Toolkit 11 API to generate the report, but I'm new to Salesforce and a little confused about where to start. In Salesforce Setup, under Administration Setup -> Manage Users -> Profiles, if you click on each profile name you can see the permission set and the actions the user is allowed to perform. I want a way to generate an Excel report for all users with all the permissions for each user. Example:

         User Name | Can view Case | Can edit case | Can delete case | etc...
         phill     | yes           | no            | no              | x...

     and so on. I see that in Salesforce I can run a high-level report on the Profile, but I need to drill down for each user. Has anyone ever done this type of reporting before? Any help on this would be great. Thanks in advance, --Phill

    Read the article

  • SQL Query slow in .NET application but instantaneous in SQL Server Management Studio

    - by user203882
     Here is the SQL:

         SELECT tal.TrustAccountValue
         FROM TrustAccountLog AS tal
         INNER JOIN TrustAccount ta ON ta.TrustAccountID = tal.TrustAccountID
         INNER JOIN Users usr ON usr.UserID = ta.UserID
         WHERE usr.UserID = 70402
           AND ta.TrustAccountID = 117249
           AND tal.TrustAccountLogID =
           (
             SELECT MAX(tal.TrustAccountLogID)
             FROM TrustAccountLog AS tal
             INNER JOIN TrustAccount ta ON ta.TrustAccountID = tal.TrustAccountID
             INNER JOIN Users usr ON usr.UserID = ta.UserID
             WHERE usr.UserID = 70402
               AND ta.TrustAccountID = 117249
               AND tal.TrustAccountLogDate < '3/1/2010 12:00:00 AM'
           )

     Basically there is a Users table, a TrustAccount table and a TrustAccountLog table. Users contains users and their details; a User can have multiple TrustAccounts; TrustAccountLog contains an audit of all TrustAccount "movements", and a TrustAccount is associated with multiple TrustAccountLog entries. Now this query executes in milliseconds inside SQL Server Management Studio, but for some strange reason it takes forever in my C# app and sometimes even times out (120 s). Here is the code in a nutshell. It gets called multiple times in a loop and the statement gets prepared.

         cmd.CommandTimeout = Configuration.DBTimeout;
         cmd.CommandText = "SELECT tal.TrustAccountValue FROM TrustAccountLog AS tal INNER JOIN TrustAccount ta ON ta.TrustAccountID = tal.TrustAccountID INNER JOIN Users usr ON usr.UserID = ta.UserID WHERE usr.UserID = @UserID1 AND ta.TrustAccountID = @TrustAccountID1 AND tal.TrustAccountLogID = (SELECT MAX(tal.TrustAccountLogID) FROM TrustAccountLog AS tal INNER JOIN TrustAccount ta ON ta.TrustAccountID = tal.TrustAccountID INNER JOIN Users usr ON usr.UserID = ta.UserID WHERE usr.UserID = @UserID2 AND ta.TrustAccountID = @TrustAccountID2 AND tal.TrustAccountLogDate < @TrustAccountLogDate2)";
         cmd.Parameters.Add("@TrustAccountID1", SqlDbType.Int).Value = trustAccountId;
         cmd.Parameters.Add("@UserID1", SqlDbType.Int).Value = userId;
         cmd.Parameters.Add("@TrustAccountID2", SqlDbType.Int).Value = trustAccountId;
         cmd.Parameters.Add("@UserID2", SqlDbType.Int).Value = userId;
         cmd.Parameters.Add("@TrustAccountLogDate2", SqlDbType.DateTime).Value = TrustAccountLogDate;

         // And then...
         reader = cmd.ExecuteReader();
         if (reader.Read())
         {
             double value = (double)reader.GetValue(0);
             if (System.Double.IsNaN(value))
                 return 0;
             else
                 return value;
         }
         else
         {
             return 0;
         }
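     A hedged aside on this classic symptom: the explanation most often cited for "fast in SSMS, slow from ADO.NET" is that SSMS runs with SET ARITHABORT ON while ADO.NET connections default to OFF, so the two sessions get separate cached plans, and the application's plan may have been compiled for unrepresentative parameter values (parameter sniffing). A quick diagnostic, assuming connection is the same open SqlConnection used by cmd above:

         // Diagnostic only: match SSMS's session setting before running the
         // slow query. If it suddenly runs fast, the problem is a stale or
         // parameter-sniffed plan rather than the query text itself.
         using (SqlCommand setOption = connection.CreateCommand())
         {
             setOption.CommandText = "SET ARITHABORT ON;";
             setOption.ExecuteNonQuery();
         }
         // ...then call cmd.ExecuteReader() as before.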

    Read the article

  • What are some strategies for maintaining a common database schema with a team of developers and no DBA?

    - by Mahmoud Abdelkader
     I'm curious about how others have approached the problem of maintaining and synchronizing database changes across many (10+) developers without a DBA. What I mean, basically, is: if someone wants to make a change to the database, what are some strategies for doing that? (E.g., I've created a 'Car' model and now I want to apply the appropriate DDL to the database.) We're primarily a Python shop and our ORM is SQLAlchemy. Previously, we wrote our models in such a way that the schema was created from the ORM, but we recently ditched this because: we couldn't track changes using the ORM; the state of the ORM wasn't in sync with the database (e.g. lots of differences, primarily related to indexes and unique constraints); and there was no way to audit database changes unless a developer documented the change via email to the team. Our solution to this problem was basically to have a "gatekeeper" individual who checks every change into the database and applies all accepted database changes to an accepted_db_changes.sql file; developers who need to make any database changes put their requests into a proposed_db_changes.sql file. We check this file in, and, when it's updated, we all apply the change to our personal databases on our development machines. We don't create indexes or constraints on the models; they are applied explicitly on the database. I would like to know what strategies others use to maintain database schemas, and whether ours seems reasonable. Thanks!

    Read the article

  • C# communication between threads

    - by GT
     Hi, I am using .NET 3.5 and am trying to wrap my head around a problem (not being a supreme threading expert, bear with me). I have a Windows service with a very intensive process that is always running. I have put this process onto a separate thread so that the main thread of my service can handle operational tasks - service audit cycles, handling configuration changes, and so on. I'm starting the thread via the typical ThreadStart to a method which kicks the process off - call it the worker thread. On this worker thread I am sending data to another server. As is expected, the server reboots every now and again, the connection is lost, and I need to re-establish it (I am notified of the lost connection via an event). From there I do my reconnect logic and I am back up and running; however, what I quickly noticed was that I was creating this worker thread over and over again each time (not what I want). Now I could kill the worker thread when I lose the connection and start a new one, but this seems like a waste of resources. What I really want to do is marshal the call (i.e., my thread start method) back to the thread that is still in memory, although not doing anything. Please post any examples or docs you have that would be of use. Thanks.
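     One pattern that matches this ask, sketched under the assumption of .NET 3.5 (so no TPL or BlockingCollection): keep a single long-lived worker thread alive and feed it work items through a queue, so the reconnect handler posts work back to the existing thread instead of spawning a new one. All names here are illustrative, not from the question.

         using System;
         using System.Collections.Generic;
         using System.Threading;

         // A single persistent worker thread that sleeps until work arrives.
         class WorkerLoop
         {
             private readonly Queue<Action> _work = new Queue<Action>();
             private readonly AutoResetEvent _signal = new AutoResetEvent(false);
             private volatile bool _running = true;

             public void Start()
             {
                 new Thread(Run) { IsBackground = true }.Start();
             }

             // Called from the connection-lost event handler (or anywhere else).
             public void Post(Action item)
             {
                 lock (_work) { _work.Enqueue(item); }
                 _signal.Set();
             }

             public void Stop()
             {
                 _running = false;
                 _signal.Set();
             }

             private void Run()
             {
                 while (_running)
                 {
                     _signal.WaitOne();
                     while (true)
                     {
                         Action item;
                         lock (_work)
                         {
                             if (_work.Count == 0) break;
                             item = _work.Dequeue();
                         }
                         item(); // e.g. ReconnectAndResume()
                     }
                 }
             }
         }

     The service would call Start() once; the connection-lost handler then calls Post(ReconnectAndResume), and the thread itself is never recreated.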

    Read the article

  • How to make XML in C#?

    - by kacalapy
     I have not worked with XML in a while. Can someone post the syntax needed to build and save an XML node structure resembling the tree created by a recursive function? Basically, I have a recursive function that saves data found on a page (URL) and then follows each URL found on that page recursively and does the same to it. To audit this I want output as an XML file on disk, so I can see how it is doing its recursion and parsing. The code I have is below; please add the XML calls needed to create a structure like the one I show below (please include the full namespaces so I can see where the objects I need live in .NET). page1.htm has 2 links on it:

         <a href=page1_1.htm>
         <a href=page1_2.htm>

     page1_1.htm has 2 links on it:

         <a href=page1_1_a.htm> (this then will have some links also)
         <a href=page1_1_b.htm> (this then will have no more links on it - dead end)

     The XML should do something like this:

         <node url="page1.htm">
           <node url="page1_1_a.htm">
             <node url="xxx.htm"/>
             <node url="yyy.htm"/>
           </node>
           <node url="page1_1_b.htm"/>
         </node>
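     A minimal sketch of the requested construction using System.Xml.XmlDocument (full namespace: System.Xml). The GetLinks helper is hypothetical and stands in for the asker's existing page-parsing code, which isn't shown:

         using System;
         using System.Xml;

         class CrawlAudit
         {
             static void Main()
             {
                 XmlDocument doc = new XmlDocument();
                 XmlElement root = doc.CreateElement("node");
                 root.SetAttribute("url", "page1.htm");
                 doc.AppendChild(root);

                 CrawlPage("page1.htm", doc, root);
                 doc.Save("crawl-audit.xml"); // write the audit tree to disk
             }

             static void CrawlPage(string url, XmlDocument doc, XmlElement parent)
             {
                 foreach (string link in GetLinks(url)) // your existing link parser
                 {
                     XmlElement child = doc.CreateElement("node");
                     child.SetAttribute("url", link);
                     parent.AppendChild(child);
                     CrawlPage(link, doc, child); // recurse, mirroring the crawl
                 }
             }

             static string[] GetLinks(string url)
             {
                 return new string[0]; // placeholder: return URLs found on the page
             }
         }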

    Read the article

  • SQL Server: Output an XML field as tabular data using a stored procedure

    - by Pawan
     I am using a table with an XML data field to store the audit trails of all other tables in the database, which means the same XML field holds documents in several different formats. For example, my table has two records with XML data like this:

     1st record:

         <client>
           <name>xyz</name>
           <ssn>432-54-4231</ssn>
         </client>

     2nd record:

         <emp>
           <name>abc</name>
           <sal>5000</sal>
         </emp>

     These are just two sample formats and two records; the table actually has many more XML formats in the same field and many records in each format. My problem is that upon query I need these XML formats converted into tabular result sets. What are my options? Querying this table and generating reports from it will be a regular task. I want to create a stored procedure to which I can pass whether I need to query "<emp>" or "<client>", and which should then return tabular data.
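     Server-side, the usual approach is the XML data type's nodes()/value() methods inside that stored procedure. As a swapped-in illustration of the same flattening done client-side instead, here is a hedged C# sketch; the table name (AuditTrail) and column name (AuditData) are assumed, not from the question:

         using System;
         using System.Data;
         using System.Data.SqlClient;
         using System.Xml.Linq;

         class XmlToTabular
         {
             // Reads the XML audit column and flattens <emp> documents into
             // a DataTable with one row per record; other formats are skipped.
             static DataTable LoadEmpRows(string connectionString)
             {
                 var table = new DataTable("emp");
                 table.Columns.Add("name", typeof(string));
                 table.Columns.Add("sal", typeof(decimal));

                 using (var conn = new SqlConnection(connectionString))
                 using (var cmd = new SqlCommand("SELECT AuditData FROM AuditTrail", conn))
                 {
                     conn.Open();
                     using (var reader = cmd.ExecuteReader())
                     {
                         while (reader.Read())
                         {
                             var doc = XElement.Parse(reader.GetSqlXml(0).Value);
                             if (doc.Name.LocalName != "emp") continue;

                             table.Rows.Add(
                                 (string)doc.Element("name"),
                                 (decimal)doc.Element("sal"));
                         }
                     }
                 }
                 return table;
             }
         }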

    Read the article

  • Executing sequential stored procedures; works in query analyzer, doesn't in my .NET application

    - by evanmortland
     Hello, I have an audit record table that I am writing to. I am connecting to MyDb, which has a stored procedure called 'CreateAudit'; it is a pass-through to another database on the same machine, MyOtherDB, which has a stored procedure also called 'CreateAudit'. In other words, in MyDb I have CreateAudit, which simply does EXEC MyOtherDB.dbo.CreateAudit. I call the MyDb CreateAudit stored procedure from my application, using SubSonic as the DAL. The first time I call it, I call it with the following (pseudocode):

         Result = CreateAudit(recordId, "Opened")

     One line after that, I call:

         Result2 = CreateAudit(recordId, "Closed")

     The second stored procedure call is supposed to mark the record that was created by CreateAudit(recordId, "Opened") with a status of closed. It works great if I run them independently of one another, but when they run in sequence in the application, the record is not marked as "Closed". When I run SQL Profiler I see that both queries ran, and if I copy the queries out and run them from Query Analyzer the record gets marked as closed 100% of the time! When I run it from the application, about once every 20 times or so the record is successfully marked closed; the other 19 times nothing happens, but I do not get an error! Is it possible for the .NET app to skip over the output from the first stored procedure and start executing the second stored procedure before the record in the first is created? When I add a "WAITFOR DELAY '00:00:00:003'" to the top of my stored procedure, the record is also closed 100% of the time. My head is spinning; any ideas why this is happening? Thanks for any responses, very interested in hearing how this can happen.
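     For what it's worth, one way to rule out connection-pool interleaving from the application side is to run both audit calls on a single connection inside one transaction. This sketch uses plain ADO.NET rather than SubSonic (whose generated API isn't shown in the question), and the procedure parameter names are assumed:

         using System.Data;
         using System.Data.SqlClient;

         // Both CreateAudit calls share one connection and one transaction,
         // so the second cannot run ahead of the first's uncommitted work.
         static void WriteAuditPair(string connectionString, int recordId)
         {
             using (var conn = new SqlConnection(connectionString))
             {
                 conn.Open();
                 using (SqlTransaction tx = conn.BeginTransaction())
                 {
                     foreach (string status in new[] { "Opened", "Closed" })
                     {
                         using (var cmd = new SqlCommand("CreateAudit", conn, tx))
                         {
                             cmd.CommandType = CommandType.StoredProcedure;
                             cmd.Parameters.AddWithValue("@RecordId", recordId); // assumed name
                             cmd.Parameters.AddWithValue("@Status", status);     // assumed name
                             cmd.ExecuteNonQuery();
                         }
                     }
                     tx.Commit();
                 }
             }
         }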

    Read the article

  • Auditing in Entity Framework

    - by Gabriel Susai
     After going through Entity Framework I have a couple of questions on implementing auditing in it. I want to store each column value that is created or updated to a different audit table. Right now I am calling SaveChanges(false) to save the records in the DB (the changes in the context are still not reset). Then I get the added/modified records and loop through GetObjectStateEntries. But I don't know how to get the values of the columns whose values are filled by a stored proc, i.e. createdate, modifieddate, etc. Below is the sample code I am working with.

         // Get the changed entries (i.e., records)
         IEnumerable<ObjectStateEntry> changes =
             context.ObjectStateManager.GetObjectStateEntries(EntityState.Modified);

         // Iterate each ObjectStateEntry (for each record in the modified collection)
         foreach (ObjectStateEntry entry in changes)
         {
             // Iterate the columns in each record and get their old and new values
             foreach (var columnName in entry.GetModifiedProperties())
             {
                 string oldValue = entry.OriginalValues[columnName].ToString();
                 string newValue = entry.CurrentValues[columnName].ToString();
                 // Do some auditing by sending entity name, column name, old value, new value
             }
         }

         changes = context.ObjectStateManager.GetObjectStateEntries(EntityState.Added);
         foreach (ObjectStateEntry entry in changes)
         {
             if (entry.IsRelationship) continue;
             var columnNames = (from p in entry.EntitySet.ElementType.Members
                                select p.Name).ToList();
             foreach (var columnName in columnNames)
             {
                 string newValue = entry.CurrentValues[columnName].ToString();
                 // Do some auditing by sending entity name, column name, value
             }
         }
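     On the stored-proc-generated values specifically, a hedged sketch: in EF4, ObjectContext.Refresh with RefreshMode.StoreWins re-reads the row after the save, pulling back whatever the database actually wrote (createdate, modifieddate, etc.) so it can be audited. The helper below assumes an attached entity; the wiring is illustrative, not from the question:

         using System.Data.Objects;

         static void AuditStoreGeneratedValues(ObjectContext context, object entity)
         {
             // Overwrite in-memory values with what the stored procedure
             // actually persisted (createdate, modifieddate, ...).
             context.Refresh(RefreshMode.StoreWins, entity);

             var entry = context.ObjectStateManager.GetObjectStateEntry(entity);
             foreach (var field in entry.CurrentValues.DataRecordInfo.FieldMetadata)
             {
                 string columnName = field.FieldType.Name;
                 object value = entry.CurrentValues[columnName];
                 // Do some auditing by sending entity name, column name, value
             }
         }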

    Read the article

  • Global hotkey capture in VB.net

    - by ggonsalv
     I want my app, which is minimized, to capture data selected in another app's window when a hotkey is pressed. My app definitely doesn't have focus. Additionally, when the hotkey is pressed I want to present a fading pop-up (Outlook style), so my app never gets focus. At a minimum I want to capture the window name, process ID and the selected data. The app which has focus is not my application. I know one option is to sniff the clipboard, but are there any other solutions? This is to audit the rate of data entry into another system over which I have no control - a mainframe emulation client program (Attachmate). The plan is: complete data entry in Application X; select a certain section of the screen in App X which is proof of data entry (transaction ID); press the magic hotkey, which then 'sends' the selection to my app. From System.Environment or System.Threading I can find the Windows logon; similarly I can also capture the time. All the data will be logged to SQL. Once complete, show an Outlook-style pop-up saying the data entry has been logged. Any thoughts?
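     The standard ingredient for this is the Win32 RegisterHotKey API, which fires even when another process (the Attachmate emulator here) has focus. Below is a hedged sketch in C#, matching the rest of this page; the same user32 calls apply from VB.NET. The key combination and handler body are placeholders:

         using System;
         using System.Runtime.InteropServices;
         using System.Windows.Forms;

         public class HotkeyForm : Form
         {
             [DllImport("user32.dll")]
             static extern bool RegisterHotKey(IntPtr hWnd, int id, uint fsModifiers, uint vk);

             [DllImport("user32.dll")]
             static extern bool UnregisterHotKey(IntPtr hWnd, int id);

             const int WM_HOTKEY = 0x0312;
             const uint MOD_CONTROL = 0x2, MOD_SHIFT = 0x4;
             const int HOTKEY_ID = 1;

             public HotkeyForm()
             {
                 // System-wide: fires even when another application has focus.
                 // Ctrl+Shift+A is an arbitrary example combination.
                 RegisterHotKey(Handle, HOTKEY_ID, MOD_CONTROL | MOD_SHIFT, (uint)Keys.A);
             }

             protected override void WndProc(ref Message m)
             {
                 if (m.Msg == WM_HOTKEY && (int)m.WParam == HOTKEY_ID)
                 {
                     // Read the clipboard (or other capture mechanism), log to
                     // SQL, and show the fading pop-up without stealing focus.
                 }
                 base.WndProc(ref m);
             }

             protected override void OnHandleDestroyed(EventArgs e)
             {
                 UnregisterHotKey(Handle, HOTKEY_ID);
                 base.OnHandleDestroyed(e);
             }
         }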

    Read the article

  • Convert Google Analytics cookies to Local/Session Storage

    - by David Murdoch
     Google Analytics sets 4 cookies that will be sent with all requests to that domain (and its subdomains). From what I can tell no server actually uses them directly; they're only sent with __utm.gif as a query param. Now, obviously Google Analytics reads, writes and acts on their values, and they will need to be available to the GA tracking script. So, what I am wondering is if it is possible to:

     1. rewrite the __utm* cookies to local storage after ga.js has written them,
     2. delete them after ga.js has run,
     3. rewrite the cookies FROM local storage back to cookie form right before ga.js reads them,
     4. start over.

     Or, monkey-patch ga.js to use local storage before it begins the cookie read/write part. Obviously, if we are going so far out of the way to remove the __utm* cookies, we'll want to also use the async variant of Analytics. I'm guessing the down vote was because I didn't ask a question. DOH! My questions are: Can it be done as described above? If so, why hasn't it been done? I have a default HTML/CSS/JS boilerplate template that passes YSlow, PageSpeed, and Chrome's Audit with near-perfect scores. I'm really looking for a way to squeeze those remaining cookie bytes from Google Analytics in browsers that support local storage.

    Read the article

  • Auditing front-end performance on a web application

    - by user1018494
     I am currently trying to performance-tune the UI of a company web application. The application is only ever going to be accessed by staff, so the speed of the connection between the server and client will always be considerably better than if it were on the internet. I have been using performance auditing tools such as YSlow and Google Chrome's profiling tool to try to highlight areas worth targeting for investigation. However, these tools are written with the internet in mind. For example, the current suggestions from a Google Chrome audit of the application are as follows:

     Network Utilization
     - Combine external CSS (red warning)
     - Combine external JavaScript (red warning)
     - Enable gzip compression (red warning)
     - Leverage browser caching (red warning)
     - Leverage proxy caching (amber warning)
     - Minimise cookie size (amber warning)
     - Parallelize downloads across hostnames (amber warning)
     - Serve static content from a cookieless domain (amber warning)

     Web Page Performance
     - Remove unused CSS rules (amber warning)
     - Use normal CSS property names instead of vendor-prefixed ones (amber warning)

     Are any of these bits of advice totally redundant given the connection speed and usage pattern? The users will be using the application frequently throughout the day, so it doesn't matter if the initial hit is large (when they first visit the page and build their cache) so long as a minimal amount of work is done on future page views. For example, is it worth the effort of combining all of our CSS and JavaScript files? It may speed up the initial page view, but how much of a difference will it really make on subsequent page views throughout the working day? I've tried searching for this, but all I keep coming up with is standard internet-facing performance advice. Any advice on what to focus my performance-tweaking efforts on in this scenario, or other auditing tool recommendations, would be much appreciated.

    Read the article

  • Async task ASP.NET: HttpContext.Current.Items is empty - how do I handle this?

    - by GuruC
     We are running a very large web application in ASP.NET MVC on .NET 4.0. Recently we had an audit done, and the performance team says that there were a lot of null reference exceptions. So I started investigating from the dumps and the event viewer. My understanding is as follows: we are using async tasks in our controllers, and we rely on the HttpContext.Current.Items hashtable to store a lot of application-level values.

         Task<Articles>.Factory.StartNew(() =>
         {
             System.Web.HttpContext.Current = ControllerContext.HttpContext.ApplicationInstance.Context;
             var service = new ArticlesService(page);
             return service.GetArticles();
         }).ContinueWith(t => SetResult(t, "articles"));

     So we are copying the context object onto the new thread that is spawned from the task factory, and context.Items is used again in that thread wherever necessary. For example:

         public class SomeClass
         {
             internal static int StreamID
             {
                 get
                 {
                     if (HttpContext.Current != null)
                     {
                         return (int)HttpContext.Current.Items["StreamID"];
                     }
                     else
                     {
                         return DEFAULT_STREAM_ID;
                     }
                 }
             }
         }

     This runs fine as long as the number of parallel requests is moderate. My questions are as follows: 1. When the load is higher and there are too many parallel requests, I notice that HttpContext.Current.Items is empty. I am not able to figure out a reason for this, and it causes all the null reference exceptions. 2. How do we make sure it is not null? Is there any workaround? NOTE: I read through StackOverflow, and people have questions like "HttpContext.Current is null" - but in my case it is not null, it's empty. I was also reading an article where the author says that sometimes the request object is terminated, which can cause problems since Dispose has already been called on its objects. I am making a copy of the context object - it's just a shallow copy, not a deep copy.
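     One commonly suggested mitigation, sketched with assumptions (the GetArticles overload taking a parameter is invented here for illustration): capture everything the task needs into locals while the request is still alive, so the task body never touches HttpContext.Current from a thread-pool thread:

         // Capture request-scoped values up front, on the request thread.
         int streamId = (int)System.Web.HttpContext.Current.Items["StreamID"];

         Task<Articles>.Factory.StartNew(() =>
         {
             // The task depends only on captured locals, not on a request
             // context that may already have been released.
             var service = new ArticlesService(page);
             return service.GetArticles(streamId); // hypothetical overload
         }).ContinueWith(t => SetResult(t, "articles"));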

    Read the article

  • Can a second stored procedure doing the same thing finish before the first one?

    - by evanmortland
     Hello, I have an audit record table that I am writing to. I am connecting to MyDb, which has a stored procedure called 'CreateAudit'; it is a pass-through to a stored procedure of the same name in another database on the same machine. I call the CreateAudit stored procedure from my application, using SubSonic as the DAL. The first time I call it, I call it with the following (pseudocode):

         Result = CreateAudit(recordId, "Opened")

     Right after that, I call:

         Result2 = CreateAudit(recordId, "Closed")

     The second stored procedure call is supposed to mark the record created by CreateAudit(recordId, "Opened") with a status of closed. It works great if I run them independently of one another, but when they run in sequence in the application, the record is not marked as "Closed". When I run SQL Profiler I see that both queries ran, and if I copy the queries out and run them from Query Analyzer the record gets marked as closed 100% of the time! When I run it from the application, about once every 20 times or so the record is successfully marked closed; the other 19 times nothing happens, but I do not get an error! Is it possible for the .NET app to skip over the output from the first stored procedure and start executing the second stored procedure before the record in the first is created? When I add a "WAITFOR DELAY '00:00:00:003'" to the top of my stored procedure, the record is also closed 100% of the time. My head is spinning; any ideas why this is happening? Thanks for any responses, very interested in hearing how this can happen.

    Read the article

  • CreationName for SSIS 2008 and adding components programmatically

    If you are building SSIS 2008 packages programmatically and adding data flow components, you will probably need to know the creation name of the component to add. I can never find a handy reference when I need one, hence this rather mundane post. See also CreationName for SSIS 2005. We start with a very simple snippet for adding a component:

        // Add the Data Flow Task
        package.Executables.Add("STOCK:PipelineTask");

        // Get the task host wrapper, and the Data Flow task
        TaskHost taskHost = package.Executables[0] as TaskHost;
        MainPipe dataFlowTask = (MainPipe)taskHost.InnerObject;

        // Add OLE-DB source component - ** This is where we need the creation name **
        IDTSComponentMetaData100 componentSource = dataFlowTask.ComponentMetaDataCollection.New();
        componentSource.Name = "OLEDBSource";
        componentSource.ComponentClassID = "DTSAdapter.OLEDBSource.2";

    So as you can see, the creation name for an OLE-DB Source is DTSAdapter.OLEDBSource.2.

    CreationName Reference

    ADO NET Destination: Microsoft.SqlServer.Dts.Pipeline.ADONETDestination, Microsoft.SqlServer.ADONETDest, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91
    ADO NET Source: Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter, Microsoft.SqlServer.ADONETSrc, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91
    Aggregate: DTSTransform.Aggregate.2
    Audit: DTSTransform.Lineage.2
    Cache Transform: DTSTransform.Cache.1
    Character Map: DTSTransform.CharacterMap.2
    Checksum: Konesans.Dts.Pipeline.ChecksumTransform.ChecksumTransform, Konesans.Dts.Pipeline.ChecksumTransform, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b2ab4a111192992b
    Conditional Split: DTSTransform.ConditionalSplit.2
    Copy Column: DTSTransform.CopyMap.2
    Data Conversion: DTSTransform.DataConvert.2
    Data Mining Model Training: MSMDPP.PXPipelineProcessDM.2
    Data Mining Query: MSMDPP.PXPipelineDMQuery.2
    DataReader Destination: Microsoft.SqlServer.Dts.Pipeline.DataReaderDestinationAdapter, Microsoft.SqlServer.DataReaderDest, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91
    Derived Column: DTSTransform.DerivedColumn.2
    Dimension Processing: MSMDPP.PXPipelineProcessDimension.2
    Excel Destination: DTSAdapter.ExcelDestination.2
    Excel Source: DTSAdapter.ExcelSource.2
    Export Column: TxFileExtractor.Extractor.2
    Flat File Destination: DTSAdapter.FlatFileDestination.2
    Flat File Source: DTSAdapter.FlatFileSource.2
    Fuzzy Grouping: DTSTransform.GroupDups.2
    Fuzzy Lookup: DTSTransform.BestMatch.2
    Import Column: TxFileInserter.Inserter.2
    Lookup: DTSTransform.Lookup.2
    Merge: DTSTransform.Merge.2
    Merge Join: DTSTransform.MergeJoin.2
    Multicast: DTSTransform.Multicast.2
    OLE DB Command: DTSTransform.OLEDBCommand.2
    OLE DB Destination: DTSAdapter.OLEDBDestination.2
    OLE DB Source: DTSAdapter.OLEDBSource.2
    Partition Processing: MSMDPP.PXPipelineProcessPartition.2
    Percentage Sampling: DTSTransform.PctSampling.2
    Performance Counters Source: DataCollectorTransform.TxPerfCounters.1
    Pivot: DTSTransform.Pivot.2
    Raw File Destination: DTSAdapter.RawDestination.2
    Raw File Source: DTSAdapter.RawSource.2
    Recordset Destination: DTSAdapter.RecordsetDestination.2
    RegexClean: Konesans.Dts.Pipeline.RegexClean.RegexClean, Konesans.Dts.Pipeline.RegexClean, Version=2.0.0.0, Culture=neutral, PublicKeyToken=d1abe77e8a21353e
    Row Count: DTSTransform.RowCount.2
    Row Count Plus: Konesans.Dts.Pipeline.RowCountPlusTransform.RowCountPlusTransform, Konesans.Dts.Pipeline.RowCountPlusTransform, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b2ab4a111192992b
    Row Number: Konesans.Dts.Pipeline.RowNumberTransform.RowNumberTransform, Konesans.Dts.Pipeline.RowNumberTransform, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b2ab4a111192992b
    Row Sampling: DTSTransform.RowSampling.2
    Script Component: Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost, Microsoft.SqlServer.TxScript, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91
    Slowly Changing Dimension: DTSTransform.SCD.2
    Sort: DTSTransform.Sort.2
    SQL Server Compact Destination: Microsoft.SqlServer.Dts.Pipeline.SqlCEDestinationAdapter, Microsoft.SqlServer.SqlCEDest, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91
    SQL Server Destination: DTSAdapter.SQLServerDestination.2
    Term Extraction: DTSTransform.TermExtraction.2
    Term Lookup: DTSTransform.TermLookup.2
    Trash Destination: Konesans.Dts.Pipeline.TrashDestination.Trash, Konesans.Dts.Pipeline.TrashDestination, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b8351fe7752642cc
    TxTopQueries: DataCollectorTransform.TxTopQueries.1
    Union All: DTSTransform.UnionAll.2
    Unpivot: DTSTransform.UnPivot.2
    XML Source: Microsoft.SqlServer.Dts.Pipeline.XmlSourceAdapter, Microsoft.SqlServer.XmlSrc, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91

    Here is a simple console program that can be used to enumerate the pipeline components installed on your machine, and dump out a list like the one above. You will need to add a reference to the Microsoft.SqlServer.ManagedDTS assembly.

        using System;
        using System.Diagnostics;
        using Microsoft.SqlServer.Dts.Runtime;

        public class Program
        {
            static void Main(string[] args)
            {
                Application application = new Application();
                PipelineComponentInfos componentInfos = application.PipelineComponentInfos;
                foreach (PipelineComponentInfo componentInfo in componentInfos)
                {
                    Debug.WriteLine(componentInfo.Name + "\t" + componentInfo.CreationName);
                }
                Console.Read();
            }
        }

    Read the article

  • Java Spotlight Episode 108: Patrick Curran and Heather VanCura on JCP.Next @jcp_org

    - by Roger Brinkley
     Interview with Patrick Curran and Heather VanCura on JCP.Next. Right-click or Control-click to download this MP3 file. You can also subscribe to the Java Spotlight Podcast Feed to get the latest podcast automatically. If you use iTunes you can open iTunes and subscribe with this link: Java Spotlight Podcast in iTunes.

     Show Notes

     News
     - Welcome to the newly merged JCP EC!
     - The November/December issue of Java Magazine is now out
     - Red Hat announces intent to contribute to OpenJFX
     - New OpenJDK JEPs: JEP 168 (Network Discovery of Manageable Java Processes) and JEP 169 (Value Objects)
     - Java EE 7 Survey
     - Latest Java EE 7 Status
     - GlassFish 4.0 Embedded (via @agoncal)

     Events
     - Nov 13-17, Devoxx, Antwerp, Belgium
     - Nov 20, JCP Public Meeting (see details below)
     - Nov 20-22, DOAG 2012, Nuremberg, Germany
     - Dec 3-5, jDays, Göteborg, Sweden
     - Dec 4-6, JavaOne Latin America, Sao Paulo, Brazil
     - Dec 14-15, IndicThreads, Pune, India

     Feature Interview

     Patrick Curran is Chair of the Java Community Process organization. In this role he oversees the activities of the JCP's Program Management Office including evolving the process and the organization, managing its membership, guiding specification leads and experts through the process, chairing Executive Committee meetings, and managing the JCP.org web site. Patrick has worked in the software industry for more than 25 years, and at Sun and then Oracle for 20 years. He has a long-standing record in conformance testing, and before joining the JCP he led the Java Conformance Engineering team in Sun's Client Software Group. He was also chair of Sun's Conformance Council, which was responsible for defining Sun's policies and strategies around Java conformance and compatibility. Patrick has participated actively in several consortia and communities including the W3C (as a member of the Quality Assurance Working Group and co-chair of the Quality Assurance Interest Group) and OASIS (as co-chair of the Test Assertions Guidelines Technical Committee). Patrick's blog is here.

     Heather VanCura manages the JCP Program Office and is responsible for the day-to-day nurturing, support, and leadership of the community. She oversees the JCP.org web site, JSR management and posting, community building, events, marketing, communications, and growth of the membership through new members and renewals. Heather has a front-row seat for studying trends within the community and recommending changes. Several changes to the program in recent years have included enabling broader participation, and increased transparency and agility in JSR development. When Heather joined the PMO staff in a community-building marketing manager role for the JCP program, she was responsible for establishing the JCP brand logo programs, the JCP.org site, and engaging the community in online surveys and usability studies. She also developed marketing reward programs, campaigns, sponsorships, and events for the JCP program, including the community gathering at the annual JavaOne Conference. Before arriving at the JCP community in 2000, Heather worked with various technology companies. Heather enjoys speaking at conferences, such as Devoxx, Java Zone, and the JavaOne Conferences. She maintains the JCP Blog, Twitter feed (@jcp_org) and Facebook page. Heather resides in the San Francisco Bay Area, California, USA.

     JCP Executive Committee Public Meeting Details
     - Date & Time: Tuesday November 20, 2012, 3:00 - 4:00 pm PST
     - Location: teleconference. Dial-in +1 (866) 682-4770, conference code 627-9803, security code 52732 ("JCPEC" on your phone handset). For global access numbers see http://www.intercall.com/oracle/access_numbers.htm, or +1 (408) 774-4073.
     - WebEx: browse for the meeting from https://jcp.webex.com. No registration required (enter your name and email address); password: JCPEC.
     - Agenda: JSR 355 (the EC merge) implementation report; JSR 358 (JCP.next.3) status report; 2.8 status update and community audit program; discussion/Q&A.
     - Note: the call will be recorded and the recording published on jcp.org, so those who are unable to join in real time will still be able to participate.
     See also the September 2012 EC meeting PMO report with JCP 2.8 statistics, and the JSR 358 project page.

     What’s Cool
     - Sweden: Hot Java in the Winter
     - GE Energy using invokedynamic for embedded development

    Read the article

  • What’s Your Tax Strategy? Automate the Tax Transfer Pricing Process!

    - by tobyehatch
    Does your business operate in multiple countries? Well, whether you like it or not, many local and international tax authorities inspect your tax strategy.  Legal, effective tax planning is perceived as a “moral” issue. CEOs are being asked to testify on their process of tax transfer pricing between multinational legal entities.  Marc Seewald, Senior Director of Product Management for EPM Applications specializing in all tax subjects and Product Manager for Oracle Hyperion Tax Provisioning, and Bart Stoehr, Senior Director of Product Strategy for Oracle Hyperion Profitability and Cost Management joined me for a discussion/podcast on this interesting subject.  So what exactly is “tax transfer pricing”? Marc defined it this way. “Tax transfer pricing is a profit allocation methodology required to be used by multinational corporations. Specifically, the ultimate goal of the transfer pricing is to ensure that the global multinational pays their fair share of income tax in each of their local markets. Specifically, it prevents companies from unfairly moving profit from ‘high tax’ countries to ‘low tax’ countries.” According to Marc, in today’s global economy, profitability can be significantly impacted by goods and services exchanged between the related divisions within a single multinational company.  To ensure that these cost allocations are done fairly, there are rules that govern the process. These rules ensure that intercompany allocations fairly represent the actual nature of the businesses activity- as if two divisions were unrelated - and provide a clear audit trail of how the costs have been allocated to prove that allocations fall within reasonable ranges.  What are the repercussions of improper tax transfer pricing? How important is it? Tax transfer pricing allocations can materially impact the amount of overall corporate income taxes paid by a company worldwide, in some cases by hundreds of millions of dollars!  Since so much tax revenue is at stake, revenue agencies like the IRS, and international regulatory bodies like the Organization for Economic Cooperation and Development (OECD) are pushing to reform and clarify reporting for tax transfer pricing. Most recently the OECD announced an “Action Plan for Base Erosion and Profit Shifting”. As Marc explained, the times are changing and companies need to be responsive to this issue. “It feels like every other week there is another company being accused of avoiding taxes,” said Marc. Most recently, Caterpillar was accused of avoiding billions of dollars in taxes. In the last couple of years, Apple, GE, Ikea, and Starbucks, have all been accused of tax avoidance. It’s imperative that companies like these have a clear and auditable tax transfer process that enables them to justify tax transfer pricing allocations and avoid steep penalties and bad publicity. Transparency and efficiency are what is needed when it comes to the tax transfer pricing process. Bart explained that tax transfer pricing is driving a deeper inspection of profit recognition specifically focused on the tax element of profit.  However, allocations needed to support tax profitability are nearly identical in process to allocations taking place in other parts of the finance organization. For example, the methods and processes necessary to arrive at tax profitability by legal entity are no different than those used to arrive at fully loaded profitability for a product line. 
In fact, there is a great opportunity for alignment across these two different functions. So it seems that tax transfer pricing should be reflected in profitability in general. Bart agreed and told us more about some of the critical sub-processes of an overall tax transfer pricing process within the Oracle solution for tax transfer pricing. “First, there is a ton of data preparation, enrichment and pre-allocation data analysis that is managed in the Oracle Hyperion solution. This serves as the ‘data staging’ for the next, critical sub-processes. From here, we leverage the Oracle EPM platform’s ability to re-use dimensions, legal entity driver data and financial data with Oracle Hyperion Profitability and Cost Management (HPCM). Within HPCM, we manage the driver data, define the legal entity to legal entity allocation rules (like cost plus), and have the option to test out multiple, simultaneous tax transfer pricing what-if scenarios. Once processed, a tax expert can evaluate the effectiveness of any one scenario result versus another via a variance analysis configured with HPCM’s pre-packaged reporting capability known as Oracle Hyperion SmartView for Office.” Further, Bart explained that the ability to visibly demonstrate how a cost or revenue has been allocated is really helpful and auditable. “HPCM’s Traceability Maps are that visual representation of all allocation flows that have been executed, and are the tax transfer analyst’s best friend in maintaining clear documentation for tax transfer pricing audits. Simply click and drill as you inspect the chain of allocation definitions and results. Once final, the post-allocated tax data can be compared to the GL to create invoices and journal entries for posting to your GL system of choice. Of course, there is a framework for overall governance of the journal entries, allocation percentages, and reporting to include necessary approvals.” Lastly, Marc explained that the key value in using the Oracle Hyperion solution for tax transfer pricing is that it keeps everything in alignment in one single place. Specifically, Oracle Hyperion effectively becomes the single book of record for the GAAP, management, and tax sets of books. There are many benefits to having one source of the truth. These include EFFICIENCY, CONTROLS and TRANSPARENCY. So, what's your tax strategy? Why not automate the tax transfer pricing process! To listen to the entire podcast, click here. To learn more about Oracle Hyperion Profitability and Cost Management (HPCM), click here.

    Read the article

  • Converting Openfire IM datetime values in SQL Server to / from VARCHAR(15) and DATETIME data types

    - by Brian Biales
     A client is using Openfire IM for their users, and would like some custom queries to audit user conversations (which are stored by Openfire in tables in the SQL Server database). Because Openfire supports multiple database servers and multiple platforms, the designers chose to store all date/time stamps in the database as 15-character strings, which get converted to Java Date objects in their code (Openfire is written in Java). I did some digging around, and, so I don't forget and in case someone else will find this useful, I will put the simple algorithms here for converting back and forth between SQL DATETIME and the Java string representation. The Java string representation is the number of milliseconds since 1/1/1970. SQL Server's DATETIME is actually represented as a float, the value being the number of days since 1/1/1900, the portion after the decimal point representing the hours/minutes/seconds/milliseconds as a fractional part of a day. Try this and you will see it is true:

         SELECT CAST(0 AS DATETIME)

     It returns the date 1/1/1900. The difference in days between SQL Server's zero date of 1/1/1900 and the Java representation's zero date of 1/1/1970 is found easily using the following SQL:

         SELECT DATEDIFF(D, '1900-01-01', '1970-01-01')

     which returns 25567: there are 25567 days between these dates. So to convert from the Java string to SQL Server's DATETIME, we need to convert the number of milliseconds to a floating-point number of days since 1/1/1970, then add 25567 to change this to the number of days since 1/1/1900. To convert to days, you divide by 1000 ms/s, then by 60 seconds/minute, then by 60 minutes/hour, then by 24 hours/day - or simply divide by 1000*60*60*24, i.e. 86400000. So, to summarize, we need to cast the string as a float, divide by 86400000 milliseconds/day, add 25567 days, and cast the resulting value to a DATETIME. Here is an example:

         DECLARE @tmp as VARCHAR(15)
         SET @tmp = '1268231722123'
         SELECT @tmp as JavaTime,
                CAST((CAST(@tmp AS FLOAT) / 86400000) + 25567 AS DATETIME) as SQLTime

     Converting from SQL DATETIME back to the Java time format is not quite as simple, I found, because floats of that size do not convert nicely to strings; they end up in scientific notation using the CONVERT or CAST function. But I found a couple of ways around that problem. You can convert a date to the number of seconds since 1/1/1970 very easily using the DATEDIFF function, as this value fits in an int. If you don't need to worry about the milliseconds, simply cast this integer as a string and concatenate '000' at the end, essentially multiplying the number by 1000 and making it milliseconds since 1/1/1970. If, however, you do care about the milliseconds, you will need to use DATEPART to get the milliseconds part of the date, cast that integer to a string, pad zeros on the left to make sure it is three digits, and concatenate those three digits to the number-of-seconds string above. And finally, I discovered that by casting to DECIMAL(15,0) and then to VARCHAR(15), I avoid the scientific notation issue. So here are all my examples; pick the one you like best.

     First, here is the simple approach if you don't care about the milliseconds:

         DECLARE @tmp as VARCHAR(15)
         DECLARE @dt as DATETIME
         SET @dt = '2010-03-10 14:35:22.123'
         SET @tmp = CAST(DATEDIFF(s, '1970-01-01 00:00:00', @dt) AS VARCHAR(15)) + '000'
         SELECT @tmp as JavaTime, @dt as SQLTime

     If you want to keep the milliseconds:

         DECLARE @tmp as VARCHAR(15)
         DECLARE @dt as DATETIME
         DECLARE @ms as int
         SET @dt = '2010-03-10 14:35:22.123'
         SET @ms = DATEPART(ms, @dt)
         SET @tmp = CAST(DATEDIFF(s, '1970-01-01 00:00:00', @dt) AS VARCHAR(15))
                  + RIGHT('000' + CAST(@ms AS VARCHAR(3)), 3)
         SELECT @tmp as JavaTime, @dt as SQLTime

     Or, in one fell swoop:

         DECLARE @dt as DATETIME
         SET @dt = '2010-03-10 14:35:22.123'
         SELECT @dt as SQLTime,
                CAST(DATEDIFF(s, '1970-01-01 00:00:00', @dt) AS VARCHAR(15))
                + RIGHT('000' + CAST(DATEPART(ms, @dt) AS VARCHAR(3)), 3) as JavaTime

     And finally, a way to simply reverse the math used when converting from a Java date to a SQL date. Note the parentheses - watch out for operator precedence; you want to subtract, then multiply:

         DECLARE @dt as DATETIME
         SET @dt = '2010-03-10 14:35:22.123'
         SELECT @dt as SQLTime,
                CAST(CAST((CAST(@dt as FLOAT) - 25567.0) * 86400000.0 as DECIMAL(15,0)) as VARCHAR(15)) as JavaTime

     Interestingly, I found that converting to SQL DATETIME can lose some accuracy: when I converted the time above to Java time and then converted that back to DATETIME, the number of milliseconds was 120, not 123. As I am not interested in the milliseconds, this is OK for me, but you may want to look into using DATETIME2 in SQL Server 2008 for more accuracy.
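     For completeness, the same arithmetic is easy to express in application code. A small hedged C# sketch of both directions follows (the Openfire column itself is a 15-character millisecond string, as described above; Java's epoch is 1970-01-01 UTC):

         using System;

         static class OpenfireTime
         {
             static readonly DateTime Epoch =
                 new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);

             // '1268231722123' -> 2010-03-10 14:35:22.123 UTC
             public static DateTime ToDateTime(string javaMillis)
             {
                 return Epoch.AddMilliseconds(double.Parse(javaMillis));
             }

             // Reverse direction: DateTime (UTC) -> millisecond string.
             public static string ToJavaString(DateTime utc)
             {
                 return ((long)(utc - Epoch).TotalMilliseconds).ToString();
             }
         }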

    Read the article

  • Off The Beaten Path—Three Things Growing Midsize Companies are Thankful For

    - by Christine Randle
    By: Jim Lein, Senior Director, Oracle Accelerate Last Sunday I went on a walkabout.  That’s when I just step out the door of my Colorado home and hike through the mountains for hours with no predetermined destination. I favor “social trails”, the unmapped routes pioneered by both animal and human explorers.  These tracks  are usually more challenging than established, marked routes and you can’t be 100% sure of where you’re going to end up. But I’ve found the rewards to be much greater. For awhile, I pondered on how—depending upon your perspective—the current economic situation worldwide could be viewed as either a classic “the glass is half empty” or a “the glass is half full” scenario. Midsize companies buy Oracle to grow and so I’m continually amazed and fascinated by the success stories our customers relate to me.  Oracle’s successful midsize companies are growing via innovation, agility, and opportunity. For them, the glass isn’t half full—it’s overflowing. Growing Midsize Companies are Thankful for: Innovation The sun angling through the pine trees reminded me of a conversation with a European customer a year ago May.  You might not recognize the name but, chances are, your local evening weather report relies on this company’s weather observation, monitoring and measurement products.  For decades, the company was recognized in its industry for product innovation, but its recent rapid growth comes from tailoring end to end product and service solutions based on the needs of distinctly different customer groups across industrial, public sector, and defense sectors.  Hours after that phone call I was walking my dog in a local park and came upon a small white plastic box sprouting short antennas and dangling by a nylon cord from a tree branch.  I cut it down. The name of that customer’s company was stamped on the housing. “It’s a radiosonde from a high altitude weather balloon,” he told me the next day. “Keep it as a souvenir.”  It sits on my fireplace mantle and elicits many questions from guests. Growing Midsize Companies are Thankful for: Agility In July, I had another interesting discussion with the CFO of an Asia-Pacific company which owns and operates a large portfolio of leisure assets. They are best known for their epic outdoor theme parks. However, their primary growth today is coming from a chain of indoor amusement centers in the USA where billiards, bowling, and laser tag take the place of roller coasters, kiddy rides, and wave pools. With mountains and rivers right out my front door, I’m not much for theme parks, but I’ll take a spirited game of laser tag any day.  This company has grown dramatically since first implementing Oracle ERP more than a decade ago. Their profitable expansion into a completely foreign market is derived from the ability to replicate proven and efficient best business practices across diverse operating environments.  They recently went live on Oracle’s Fusion HCM and Taleo. Their CFO explained to me how, with thousands of employees in three countries, Fusion HCM and Taleo would enable them to remain incredibly agile by acting on trends linking individual employee performance to their management, establishing and maintaining those best practices. Growing Midsize Companies are Thankful for: Opportunity I have three GPS apps on my iPhone. I use them mainly to keep track of my stats—distance, time, and vertical gain. 
However, every once in awhile I need to find the most efficient route back home before dark from my current location (notice I didn’t use the word “lost”). In August I listened in on an interview with the CFO of another European company that designs and delivers telematics solutions—the integrated use of telecommunications and informatics—for managing the mobile workforce. These solutions enable customers to achieve evolutionary step-changes in their performance and service delivery. Forgive the overused metaphor, but this is route optimization on steroids.  The company’s executive team saw an opportunity in this emerging market and went “all in”. Consequently, they are being rewarded with tremendous growth results and market domination by providing the ability for their clients to collect and analyze performance information related to fuel consumption, service workforce safety, and asset productivity. This Thanksgiving, I’m thankful for health, family, friends, and a career with an innovative company that helps companies leverage top tier software to drive and manage growth. And I’m thankful to have learned the lesson that good things happen when you get off the beaten path—both when hiking and when forging new routes through a complex world economy. Halfway through my walkabout on Sunday, after scrambling up a long stretch of scree-covered hill, I crested a ridge with an obstructed view of 14,265 ft Mt Evans just a few miles to the west.  There, nowhere near a house or a trail, someone had placed a wooden lounge chair. Its wood was worn and faded but it was sturdy. I had lunch and a cold drink in my pack. Opportunity knocked and I seized it. Happy Thanksgiving.  

    Read the article

  • Weekend reading: Microsoft/Oracle and SkyDrive based code-editor

    - by jamiet
     A couple of news items caught my eye this weekend that I think are worthy of comment. Microsoft/Oracle partnership to be announced tomorrow (24/06/2013) According to many news sites Microsoft and Oracle are about to announce a partnership (Oracle set for major Microsoft, Salesforce, Netsuite partnerships) and they all seem to be assuming that it will be something to do with “the cloud”. I wouldn’t disagree with that assessment; Microsoft are heavily pushing Azure and Oracle seem (to me anyway) to be rather lagging behind in the cloud game. More specifically, folks seem to be assuming that Oracle’s forthcoming 12c database release will be offered on Azure. I did a bit of reading about Oracle 12c and one of its key pillars appears to be that it supports multi-tenant topologies, and multi-tenancy is a common usage scenario for databases in the cloud. I’m left wondering, then: if Microsoft are willing to push a rival’s multi-tenant solution, what is happening to its own cloud-based multi-tenant offering – SQL Azure Federations? We haven’t heard anything about federations for what now seems to be a long time, and moreover the main Program Manager behind the technology, Cihan Biyikoglu, recently left Microsoft to join Twitter. Furthermore, a Principal Architect for SQL Server, Conor Cunningham, recently presented the opening keynote at SQLBits 11 where he talked about multi-tenant solutions on SQL Azure, and not once did he mention federations. All in all I don’t have a warm fuzzy feeling about the future of SQL Azure Federations, so I hope that that question gets asked at some point following the Microsoft/Oracle announcement. Text Editor on SkyDrive with coding-specific features Liveside.net got a bit of a scoop this weekend with the news (Exclusive: SkyDrive.com to get web-based text file editing features) that Microsoft’s consumer-facing file storage service is going to get a new feature – a web-based code editor. [Liveside’s screenshot not reproduced here.] I’ve long had a passing interest in online code editors; indeed, back in December 2009 I wondered out loud on this blog site: "I started to wonder when the development tools that we use would also become cloud-based. After all, if we’re using cloud-based services does it not make sense to have cloud-based tools that work with them? I think it does." (That post concerned Project Houston.) Since then the world has moved on. Cloud 9 IDE (https://c9.io/) have blazed a trail in the fledgling world of online code editors, and I have been wondering when Microsoft were going to start playing catch-up. I had no doubt that an online code editor was in Microsoft’s future; it's an obvious future direction. Why would I want to have to download and install a bloated text editor (which, arguably, is exactly what Visual Studio amounts to) and have to continually update it, when I can simply open a web browser and have ready access to all of my code from wherever I am? There are signs that Microsoft is already making moves in this direction; after all, the URL for their new offering Team Foundation Service doesn’t mention TFS at all – my own personalised URL for Team Foundation Service is http://jamiet.visualstudio.com – and using “Visual Studio” as the domain name for a service that isn’t strictly speaking part of Visual Studio leads me to think that there’s a much bigger play here, and that one day http://visualstudio.com will house an online code editor. With that in mind I find Liveside’s revelation rather intriguing: why would a code editing tool show up in SkyDrive?
Perhaps SkyDrive is going to get integrated more tightly into TFS, I’m very interested to see where this goes. The larger question playing on my mind though is whether an online code editor from Microsoft will support SQL Server developers. I have opined before (see The SQL developer gap) about the shoddy treatment that SQL Server developers have to experience from Microsoft and I haven’t seen any change in Microsoft’s attitude in the three and a half years since I wrote that post. I’m constantly bewildered by the lack of investment in SQL Server developer productivity compared to the riches that are lavished upon our appdev brethren. When you consider that SQL Server is Microsoft’s third biggest revenue stream it is, frankly, rather insulting. SSDT was a step in the right direction but the hushed noises I hear coming out of Microsoft of late in regard to SSDT don’t bode fantastically well for its future. So, will an online code editor from Microsoft support T-SQL development? I have to assume not given the paucity of investment on us lowly SQL Server developers over the last few years, but I live in hope! Your thoughts in the comments section please. I would be very interested in reading them. @Jamiet

    Read the article

  • Release Management as Orchestra

    - by ericajanine
     I read an excellent, concise article (http://www.buildmeister.com/articles/software_release_management_best_practices) on the basics of release management practices. In the article, it states "Release Management is often likened to the conductor of an orchestra, with the individual changes to be implemented the various instruments within it." I played in music ensembles for years, so this is especially close to my heart as an example. I learned most of my discipline from hours and hours of practice at the hand of a very skilled conductor and leader. I also learned that the true magic in symphonic performance happens when everyone involved is focused on one sound, one goal. In turn, that solid focus creates a sound and experience bigger than mechanics alone can accomplish. In symphony, a conductor's true purpose is to make you, a performer, better so the overall sound and end product is better. The big picture (the performance of the composition) is the end-game, and all musicians in the orchestra know without question that their part makes up an important but incomplete piece of that performance. A good conductor works with each section (e.g. group) to ensure their individual pieces are solid. Let's restate: the conductor leads and is responsible for ensuring those pieces are solid. While the performers themselves are doing the work, the conductor is the final authority on whether the pieces are ready or not. If not, the conductor initiates the efforts to get them ready or makes the decision to scrap their parts altogether for the sake of the overall performance. Let it sink in, because it's clear: it is not the performer's call whether they played their part as agreed, it's the conductor's final call to allow it. In comparison, if a software release manager is a conductor, the only way for that manager to be effective is to drive the overarching process and execution of the individual pieces of a software development lifecycle. It does not mean the release manager performs each and every piece; it means the release manager has oversight and influence, because the end-game is a successful software enhancement in a usable environment. It means the release manager, not the developer or development manager, has the final call on whether something goes into a software release. Of course, this is not a process of autocracy or dictation of absolute rule; it's a cooperative effort. But the release manager must have the final authority to decide if something is ready to be added to the bigger piece, the overall symphony of software changes being considered for package and release. It also goes without saying that a release manager, like a conductor, must have full autonomy and isolation from other software groups. A conductor is the one on the podium waving a little stick at each section and cueing them for their parts, not yelling from the back of the room while also playing a tuba and taking direction from the horn section. I have personally seen release managers relegated to being considered little more than coordinators, red-tapers to "satisfy" the demands of an audit group, without being given any of the respect that a release manager deserves from a group willing to employ them fully. In this dysfunctional scenario, development managers, project managers, business users, and other stakeholders have been given nearly full clearance to demand and push their agendas forward, causing a tail-wagging-the-dog scenario where an inherent conflict will ensue.
    However strong the temptation to keep the peace or to overlook a built-in expectation that is wrong, the release manager here must face the manufactured conflict head-on and defuse it as quickly as possible. Then the release manager must clearly make the case for why a change cannot be released without negative impact to all parties involved. If a political agenda alone is driving a software release, there IS no symphony and there is no software lifecycle; it's just out-of-tune noise. More importantly, there is no real conductor. Sometimes just wanting to make a beautiful sound is not enough. If you are a release manager, are you freed up enough to move, to conduct the sections of software creation, so that a solid release performance is possible? If not, it's time to take stock of what your role actually is and decide whether that is what you truly want to achieve in your position. If you are, then you can successfully build your career, and that of the people in your groups, to create truly beautiful software (music) together.

    Read the article

  • Top 10 Linked Blogs of 2010

    - by Bill Graziano
    Each week I send out a SQL Server newsletter and include links to interesting blog posts. I've linked to over 500 blog posts so far in 2010. Late last year I started storing those links in a database so I could do a little reporting (a sketch of that kind of reporting query follows the list). I tend to link to posts related to the OLTP engine, and I try to link to the individual blogger in the group blogs, though that wasn't possible for the SQLCAT and CSS blogs. I also have a real weakness for posts related to PASS. These are the top 10 blogs I linked to during the year, ordered by the number of posts I linked to.

    1. Paul Randal – Paul writes extensively on the internals of the relational engine: lots of great posts around transactions, the transaction log, disaster recovery, corruption, indexes, and DBCC. I also linked to many of his SQL Server myths posts.
    2. Glenn Berry – Glenn writes very interesting posts on how hardware affects SQL Server. I especially like his posts on the various CPU platforms. These aren't necessarily topics I'm searching for, but I really enjoy reading them.
    3. The SQLCAT Team – This Microsoft team focuses on the largest and most interesting SQL Server installations. They regularly publish white papers and best practices.
    4. SQL Server CSS Team – These are the top engineers from the Microsoft Customer Service and Support group, the folks you finally talk to after your case has been escalated about 20 times. They write about the interesting problems they find.
    5. Brent Ozar – The posts I linked to mostly focused on the relational engine: CPU, NUMA, SSD drives, performance monitoring, and so on. But Brent writes about a real variety of topics, including blogging, social networking, speaking, the MCM, SQL Azure, and anything else that seems to strike his fancy. His posts are always well written and thought provoking.
    6. Jeremiah Peschka – A number of Jeremiah's posts weren't about SQL Server; he's very active in the "NoSQL" area, and I linked to a number of those posts. I think it's important for people to know what other technologies are out there.
    7. Brad McGehee – Brad writes about being a DBA, including maintenance plans, DBA checklists, compression, and auditing.
    8. Thomas LaRock – I linked to a variety of posts, from PBM to networking to 24 Hours of PASS to TDE. Just a real variety of topics. Tom always writes with an interesting style, usually mixing in a movie theme and/or bacon.
    9. Aaron Bertrand – Many of my links this year were about Denali features. He also had a great series on bad habits to kick.
    10. Michael J. Swart – This last one surprised me; there are some well-known SQL Server bloggers below Michael on this list. I linked to posts on indexes, hierarchies, transactions, I/O performance, and a variety of other engine-related posts, all interesting and well thought out. Many of his non-SQL posts are also very good; he seems to have an interest in puzzles and other brain teasers. Michael, I won't be surprised again!
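    As a rough illustration, a report like the one above could come from a query along these lines. This is a minimal sketch assuming a hypothetical single-table schema (a NewsletterLink table with BlogName and LinkDate columns); the post does not describe the actual database design.

        -- Hypothetical schema: one row per blog post linked in the newsletter.
        -- Counts links per blog for 2010 and returns the ten most-linked blogs.
        SELECT TOP 10
            BlogName,
            COUNT(*) AS PostsLinked
        FROM NewsletterLink
        WHERE LinkDate >= '2010-01-01'
          AND LinkDate <  '2011-01-01'   -- only links sent during 2010
        GROUP BY BlogName
        ORDER BY PostsLinked DESC;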

    Read the article

  • WebCenter Customer Spotlight: Texas Industries, Inc.

    - by me
    Author: Peter Reiser - Social Business Evangelist, Oracle WebCenter

    Solution Summary
    Texas Industries, Inc. (TXI) is a leading supplier of cement, aggregate, and consumer product building materials for residential, commercial, and public works projects. TXI is based in Dallas and employs around 2,000 people. The company faced decentralized, manual processes for entering 180,000 vendor invoices annually; invoice entry was a time- and resource-intensive process that entailed significant personnel requirements. TXI implemented a centralized solution built on Oracle WebCenter Imaging, a smart routing solution that enables users to capture invoices electronically with Oracle WebCenter Capture and uses Oracle WebCenter Forms Recognition to send the invoices through to Oracle Financials for approvals and processing. TXI significantly lowered the resources needed for payables processing, increased productivity by 80%, and reduced invoice processing cycle times by 84%, from an average of 20 to 30 days down to just 3 to 5 days.

    Company Overview
    Texas Industries, Inc. (TXI) is a leading supplier of cement, aggregate, and consumer product building materials for residential, commercial, and public works projects. With operating subsidiaries in six states, TXI is the largest producer of cement in Texas and a major producer in California. TXI is a major supplier of stone, sand, gravel, and expanded shale and clay products, and one of the largest producers of bagged cement and concrete products in the Southwest.

    Business Challenges
    TXI's decentralized, manual processes for entering 180,000 vendor invoices annually made invoice entry a time- and resource-intensive process with significant personnel requirements. Their business objectives were to:
    - Increase the efficiency of core business processes, such as invoice processing, to support the organization's desire to maintain its role as the Southwest's leader in delivering high-quality, low-cost products to the construction industry
    - Meet the audit and regulatory requirements for achieving Sarbanes-Oxley (SOX) compliance
    - Streamline entry of 180,000 invoices annually to accelerate processing, reduce errors, cut invoice storage and routing costs, and increase visibility into payables liabilities

    Solution Deployed
    TXI replaced a resource-intensive, paper-based, decentralized process for invoice entry with a centralized solution built on Oracle WebCenter Imaging 11g. Working with Oracle partner Keste LLC, TXI developed a smart routing solution that enables users to capture invoices electronically with Oracle WebCenter Capture and then uses Oracle WebCenter Forms Recognition and the Oracle WebCenter Imaging workflow to send the invoices through to Oracle Financials for approvals and processing.
    Business Results
    - Significantly lowered the resources needed for payables processing through centralization and improved efficiency
    - Enabled the company to process invoices faster and pay bills earlier, allowing it to take advantage of additional vendor discounts
    - On track to increase productivity by 80% and reduce invoice processing cycle times by 84%, from an average of 20 to 30 days down to just 3 to 5 days
    - Achieved a 25% reduction in paper invoice storage costs now that invoices are captured digitally, and enabled a 50% reduction in shipping costs, as the company no longer has to send paper invoices between headquarters and production facilities for approvals

    "Entering and manually processing more than 180,000 vendor invoices annually was time and labor intensive. With Oracle Imaging and Process Management, we have automated and centralized invoice entry and processing at our corporate office, improving productivity by 80% and reducing invoice processing cycle times by 84%—a very important efficiency gain." Terry Marshall, Vice President of Information Services, Texas Industries, Inc.

    Additional Information
    - TXI Customer Snapshot
    - Oracle WebCenter Content
    - Oracle WebCenter Capture
    - Oracle WebCenter Forms Recognition

    Read the article

  • Certify October Updates

    - by Sadia2
    We have added some release and platform certifications to MOS Certify.

    Applications: Oracle Demantra 12.2.2

    Collaboration Technologies: Oracle On Track Communication 1.0.0.0.0

    Database: Oracle Database 11.2.0.4.0, Oracle Database Client 11.2.0.4.0, 11.2.0.3.0, Oracle Clusterware 12.1.0.1.0, 11.2.0.4.0, Oracle Real Application Clusters 12.1.0.1.0, 11.2.0.4.0, Oracle TimesTen In-Memory Database 11.2.2.5.0, Oracle Audit Vault and Database Firewall 12.1.1.0.0, Oracle Database Client 10.2.0.5, Oracle Secure Enterprise Search 11.2.2.2.0

    E-Business Suite: Oracle E-Business Suite 12.2.2, 12.1.3, 12.1.2, 12.1.1, 12.0.4, 11.5.10.2, 11.5.9.2

    Edge Applications: Oracle Transportation Management 6.3.2

    Enterprise Manager: Enterprise Manager Base Platform – OMS 12.1.0.3.0

    FSGBU Insurance Group: Oracle Health Insurance Back Office 10.13.2.0.0

    Fusion Middleware: Oracle Application Development Framework 11.1.1.6.0, Oracle Business Intelligence Enterprise Edition 11.1.1.7.0, Oracle BI Answers 11.1.1.7.0, Oracle BI Composer 11.1.1.7.0, Oracle BI Presentation Services 11.1.1.7.0, Oracle BI Delivers 11.1.1.7.0, Oracle BI Interactive Dashboards 11.1.1.7.0, Oracle BI Scorecard and Strategy Management 11.1.1.7.0, Oracle BI Catalog Manager 11.1.1.7.0, Oracle BI Search 11.1.1.7.0, Oracle BIP Enterprise 11.1.1.7.0, Oracle BIP Scheduler 11.1.1.7.0, Oracle Real-Time Decision Center 11.1.1.7.0, Oracle Segmentation Server 11.1.1.7.0, Oracle JRE 1.7.0_45, 1.7.0_40, 1.7.0_25, 1.7.0_21, 1.7.0_17, 1.7.0_15, 1.7.0_13, 1.7.0_11, 1.7.0_10, 1.6.0_65, 1.6.0_26, Oracle JDK 1.7.0_45, 1.7.0_25, 1.7.0_17, 1.7.0_15, 1.7.0_13, 1.7.0_11, 1.6.0_65, 1.6.0_41, 1.6.0_26, Oracle Discoverer 11.1.1.7.0, 11.1.1.6.0, Discoverer Administrator 11.1.1.7.0, 11.1.1.6.0, Discoverer Desktop 11.1.1.7.0, 11.1.1.6.0, Oracle GoldenGate 12.1.2.0.0, Oracle GoldenGate Director 12.1.2.0.0, Java 1.7.0_10, Oracle Fusion Middleware 12.1.2.0.0, Oracle Data Integrator Agent 12.1.2.0.0, Oracle Data Integrator Studio 12.1.2.0.0, Oracle Data Integrator Console 12.1.2.0.0

    JD Edwards EnterpriseOne: JD Edwards EnterpriseOne Enterprise Server 9.1.3.0, JD Edwards EnterpriseOne One View Reporting 9.1.3.0, JD Edwards EnterpriseOne Mobile Applications 9.0.2.0, 9.0.0.0, 9.1.2.0, JD Edwards EnterpriseOne for iPad 1.0.0.0

    Linux & Server Virtualization (x86): Oracle VM Server for x86 3.2.6.0.0, 3.2.4.0.0, 3.2.3.0.0, 3.2.2.0.0, 3.2.1.0.0

    MySQL: MySQL Database Server 5.6, 5.5, MySQL Cluster 7.3, 7.2, 7.1

    Oracle Fusion Applications: Oracle Fusion Applications 11.1.7.0.0, 11.1.6.0.0, 11.1.5.0.0, 11.1.4.0.0

    PeopleSoft: PeopleSoft PeopleTools 8.53, 8.52, 8.51, 8.5

    Primavera GBU: Primavera Project Portfolio Mgmt 6.2.1, Primavera P6 Enterprise Project Portfolio Management 8.3.0.0.0

    Siebel Enterprise: Siebel Application Server 8.2.2.4.0, 8.2.2.3.0, 8.2.2.2.0, 8.1.1.11.0, 8.1.1.10.0, 8.1.1.9.0, Siebel Database Server 8.2.2.4.0, 8.1.1.11.0, Siebel Web Server Extension 8.1.1.10.0

    Read the article

< Previous Page | 26 27 28 29 30 31 32 33 34 35 36  | Next Page >