Search Results

Search found 19649 results on 786 pages for 'visual studio integration'.

Page 147/786

  • Dynamic Unpivot : SSIS Nugget

    - by jamiet
    A question on the SSIS forum earlier today asked: "I need to dynamically unpivot some set of columns in my source file. Every month there is one new column and its set of values. I want to unpivot it without editing my SSIS package that is deployed."

    Let's be clear about what we mean by Unpivot. It is a normalisation technique that basically converts columns into rows. By way of example, it converts something like this:

        AccountCode  Jan     Feb     Mar
        AC1          100.00  150.00  125.00
        AC2          45.00   75.50   90.00

    into something like this:

        AccountCode  Month  Amount
        AC1          Jan    100.00
        AC1          Feb    150.00
        AC1          Mar    125.00
        AC2          Jan    45.00
        AC2          Feb    75.50
        AC2          Mar    90.00

    The Unpivot transformation in SSIS is perfectly capable of carrying out the operation defined in this example; however, the problem outlined in the aforementioned forum thread was a little different. I interpreted it to mean that the number of columns could change, and in that scenario the Unpivot transformation (and indeed the SSIS dataflow in general) is rendered useless because it expects that the number of columns will not change from what is specified at design time.

    There is a workaround however. Assuming all of the columns that CAN exist appear at the end of each row, we can (1) import all of those columns from the file as just a single column, (2) use a script component to loop over all the values in that "column" and (3) output each one as a row of its own. Let's go over that in a bit more detail.

    I have prepared a data file containing some data that we want to unpivot: some customers and their mythical shopping lists (it has column names in the first row). We use a Flat File Connection Manager to specify the format of our data file to SSIS, and a Flat File Source Adapter to put it into the dataflow (no need for a screenshot of that one - it's very basic). Notice that the values that we want to unpivot all exist in a single column called [Groceries].

    Now onto the script component where the real work goes on, although the code is pretty simple; a sketch of the sort of code involved follows below. Running the package with some data viewers attached shows that we have successfully pulled out all of the values into a row of their own, thus accomplishing the dynamic unpivot that the forum poster was after.

    If you want to run the demo for yourself then I have uploaded the demo package and source file to my SkyDrive: http://cid-550f681dad532637.skydrive.live.com/self.aspx/Public/BlogShare/20100529/Dynamic%20Unpivot.zip. Simply extract the two files into a folder, make sure the Connection Manager is pointing to the file, and execute!

    Hope this is useful.

    @Jamiet
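
    PS: for those who just want the gist without downloading the package, here is a minimal sketch of what the script component might look like. The column names Customer and Groceries, the comma delimiter, and the Output0 column names are assumptions based on this demo rather than the exact code in the uploaded package; adjust them to suit your own file layout.

        // Script component (transformation) with an asynchronous output.
        // Assumes an input column called Groceries holding the delimited values,
        // an input column called Customer, and an output with Customer and Item
        // columns - all illustrative names, not prescriptive ones.
        public override void Input0_ProcessInputRow(Input0Buffer Row)
        {
            if (Row.Groceries_IsNull)
                return;

            // Split the single catch-all column into its individual values...
            string[] values = Row.Groceries.Split(',');

            // ...and emit one output row per value - the "dynamic unpivot".
            foreach (string value in values)
            {
                Output0Buffer.AddRow();
                Output0Buffer.Customer = Row.Customer;
                Output0Buffer.Item = value.Trim();
            }
        }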

    Read the article

  • ATG Live Webcast Feb. 24th: Using the EBS 12 SOA Adapter

    - by Bill Sawyer
    Our next ATG Live Webcast is now open for registration. The event is titled "E-Business Suite R12.x SOA Using the E-Business Suite Adapter".

    This live one-hour webcast will offer a review of the Service Oriented Architecture (SOA) capabilities within E-Business Suite R12, focusing on the E-Business Suite Adapter. While primarily aimed at integrators and developers, understanding SOA capabilities is important for all E-Business Suite technologists and superusers.

    ATG Live Webcast Logistics

    The one-hour event will be webcast live, with dial-in access for Q&A with the Applications Technology Group (ATG) Development experts presenting the event. The basic information for the event is as follows:

        Event: E-Business Suite R12.x SOA Using the E-Business Suite Adapter
        Date: Thursday, February 24, 2011
        Time: 8:00 AM - 9:00 AM Pacific Standard Time
        Presenters: Neeraj Chauhan, Product Manager, ATG Development

    NOTE: When you register for the event, the confirmation will show the event starting at 7:30 AM Pacific Standard Time. This is to allow you time to connect to the conference call and web conference. The presentation will start at 8:00 AM Pacific Standard Time.

    Read the article

  • SSIS - XML Source Script

    - by simonsabin
    The XML Source in SSIS is great if you have a 1:1 mapping between entity and table. You can do more complex mappings but it becomes very messy and won't perform. So what other options do you have?

    The challenge with XML processing is to avoid needing a huge amount of memory. I remember using early versions of BizTalk, which loaded the whole document into memory to map from one document type to another. This was fine for small documents but was an absolute killer for large documents. You therefore need a streaming approach. For flexibility, however, you want to be able to generate your rows easily, and if you've ever used the XmlReader you will know it's ugly code to write.

    That brings me on to LINQ. There is an implementation of LINQ over XML which is really nice: you can write nice LINQ queries instead of the XmlReader stuff. The downside is that by default LINQ to XML requires a whole XML document to work with. No streaming. Your code would look like this - we create an XDocument and then enumerate over a set of anonymous types we generate from our LINQ statement:

        XDocument x = XDocument.Load("C:\\TEMP\\CustomerOrders-Attribute.xml");

        foreach (var xdata in (from customer in x.Elements("OrderInterface").Elements("Customer")
                               from order in customer.Elements("Orders").Elements("Order")
                               select new { Account = customer.Attribute("AccountNumber").Value
                                          , OrderDate = order.Attribute("OrderDate").Value }
                               ))
        {
            Output0Buffer.AddRow();
            Output0Buffer.AccountNumber = xdata.Account;
            Output0Buffer.OrderDate = Convert.ToDateTime(xdata.OrderDate);
        }

    As I said, the downside to this is that you are loading the whole document into memory. I did some googling and came across some helpful videos from a nice UK DPE, Mike Taulty: http://www.microsoft.com/uk/msdn/screencasts/screencast/289/LINQ-to-XML-Streaming-In-Large-Documents.aspx. These show how you can combine LINQ and the XmlReader to get a semi-streaming approach. I took what he did and implemented it in SSIS.

    What I found odd was that when I ran it I got different numbers between the streamed and non-streamed versions. The cause was a little bug in Mike's code that causes the pointer in the XmlReader to progress past the start of the next element, and thus skip some elements. The corrected, streaming version looks like this:

        foreach (var xdata in (from customer in StreamReader("C:\\TEMP\\CustomerOrders-Attribute.xml", "Customer")
                               from order in customer.Elements("Orders").Elements("Order")
                               select new { Account = customer.Attribute("AccountNumber").Value
                                          , OrderDate = order.Attribute("OrderDate").Value }
                               ))
        {
            Output0Buffer.AddRow();
            Output0Buffer.AccountNumber = xdata.Account;
            Output0Buffer.OrderDate = Convert.ToDateTime(xdata.OrderDate);
        }

    These look very similar, and they are; the key difference is the method we are calling, StreamReader. This method is what gives us streaming: it returns an enumerable list of elements, and because of the way that LINQ works this results in the data being streamed in.

        static IEnumerable<XElement> StreamReader(String filename, string elementName)
        {
            using (XmlReader xr = XmlReader.Create(filename))
            {
                xr.MoveToContent();
                while (xr.Read()) // reads the first element
                {
                    // keep reading elements of the requested name from the current position
                    while (xr.NodeType == XmlNodeType.Element && xr.Name == elementName)
                    {
                        XElement node = (XElement)XElement.ReadFrom(xr);
                        yield return node;
                    }
                }
                xr.Close();
            }
        }

    This code is specifically designed to return a list of the elements with a specific name. The first Read reads the root element, and then the inner while loop checks to see if the current element is the type we want. If not, we do the xr.Read() again until we find the element type we want. We then use the neat XElement.ReadFrom function to read an element and all its sub-elements into an XElement; this is what is returned and can be consumed by the LINQ statement. Essentially, once one element has been read we need to check whether we are still on the same element type and name (the inner loop). This was Mike's mistake: if we called .Read again we would advance the XmlReader beyond the start of the next element and so the ReadFrom method wouldn't work.

    So with the code above you can use whatever LINQ statement you like to flatten your XML into the rowsets you want. You could even have multiple outputs and generate your own surrogate keys.

    Read the article

  • FileNameColumnName property, Flat File Source Adapter : SSIS Nugget

    - by jamiet
    I saw a question on MSDN's SSIS forum the other day that went something like this: "I'm loading data into a table from a flat file but I want to be able to store the name of that file as well. Is there a way of doing that?"

    I don't want to come across as disrespecting those who took the time to reply, but there were a few answers along the lines of "loop over the files using a For Each, store the file name in a variable, yadda yadda yadda" when in fact there is a much, much simpler way of accomplishing this; it just happens to be a little hidden away, as I shall now explain!

    The Flat File Source Adapter has a property called FileNameColumnName which, for some reason, isn't exposed through the Flat File Source editor; it is, however, exposed via the Advanced Properties. In my package I have set FileNameColumnName="Filename" (it doesn't matter what name you use - any non-zero-length string will work). What this will do is create a new column in our dataflow called "Filename" that contains, unsurprisingly, the name of the file from which the row was sourced. All very simple.

    This is particularly useful if you are extracting data from multiple files using the MultiFlatFile Connection Manager, as it allows you to differentiate between data from each of the files. A small sketch of how that new column might be consumed downstream is shown below.

    So there you have it, the FileNameColumnName property; a little known secret of SSIS. I hope it proves to be useful to someone out there.

    @Jamiet
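
    PS: as an aside, here is a minimal sketch of a downstream script component that consumes the new column. It assumes the column is called "Filename", that it holds the full path of the source file, and that it has been marked as ReadWrite in the script component - all assumptions based on my setup above rather than anything you have to do.

        // In the script component's ScriptMain. Trims the path supplied by the
        // Flat File Source down to just the file name, in place (the "Filename"
        // column name is illustrative).
        public override void Input0_ProcessInputRow(Input0Buffer Row)
        {
            if (!Row.Filename_IsNull)
            {
                Row.Filename = System.IO.Path.GetFileName(Row.Filename);
            }
        }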

    Read the article

  • Enforce SSIS naming conventions using BI-xPress

    - by jamiet
    A long, long, long time ago (in 2006 in fact) I published a blog post entitled Suggested Best Practises and naming conventions, in which I suggested a bunch of acronyms that folks could use to prefix object names in their SSIS packages, thus allowing easier identification of those objects in log records (see that post for a sample of the suggested prefixes).

    If you have adopted these naming conventions (and I am led to believe that a bunch of people have) then you might like to know that you can now check for adherence to them using a tool called BI-xPress from Pragmatic Works. BI-xPress includes a feature called the Best Practices Analyzer that scans your packages and assesses them according to rules that you specify. In addition, Pragmatic Works have made available a collection of rules that check for the naming conventions I specified in 2006. You can download this collection, however I recommend you first read the accompanying article that demonstrates the capabilities of the Best Practices Analyzer.

    Pretty cool stuff.

    @Jamiet

    Read the article

  • Applied Security for Oracle Business Intelligence Podcast

    - by Tim Dexter
    Listen to BI Security Meister Bryan Wise talk about his recent book, Applied Oracle Security, and learn about cutting-edge techniques for Oracle Business Intelligence from a leading security expert.

    http://www.oracle.com/podcasts/author-podcasts.html - Applied Security for Oracle Business Intelligence

    Well worth the listen, and of course the book is available at all discerning bookstores!

    Read the article

  • Web Site Performance and Assembly Versioning – Part 3 Versioning Combined Files Using Mercurial

    - by capgpilk
    This is the third post in a series:

        1. Minification and Concatenation of JavaScript and CSS Files
        2. Versioning Combined Files Using Subversion
        3. Versioning Combined Files Using Mercurial - this post

    I have worked on a project recently where there was a need to version the system (library dll, css and javascript files) by date and Mercurial revision number. This was in the format:

        0.12.524.407
        {major}.{year}.{month}{date}.{mercurial revision}

    Each time there is an internal build using the CI server, it labels the files using this format. When it comes time to do a major release, it becomes v1.{year}.{month}{date}.{mercurial revision}, with each public release having a major version increment. Also, as a requirement, each assembly has to have a new GUID on each build.

    So, as in previous posts, we need to edit the csproj file and add a couple of default targets:

        <?xml version="1.0" encoding="utf-8"?>
        <Project ToolsVersion="4.0" DefaultTargets="Hg-Revision;AssemblyInfo;Build"
                 xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
          <PropertyGroup>

    Just before the closing </Project> tag we add our two targets. The first is to get the Mercurial revision number; we first need to import the tasks for MSBuild, which can be downloaded from http://msbuildhg.codeplex.com/

        <Import Project="..\Tools\MSBuild.Mercurial\MSBuild.Mercurial.Tasks" />

        <Target Name="Hg-Revision">
          <HgVersion LocalPath="$(MSBuildProjectDirectory)" Timeout="5000"
                     LibraryLocation="C:\TortoiseHg\">
            <Output TaskParameter="Revision" PropertyName="Revision" />
          </HgVersion>
          <Message Text="Last revision from HG: $(Revision)" />
        </Target>

    with the main Mercurial files being located at C:\TortoiseHg.

    To get a valid GUID we need to escape from the csproj markup and call some C# code, which we put in a property group for later reference:

        <PropertyGroup>
          <GuidGenFunction>
            <![CDATA[
              public static string ScriptMain() {
                  return System.Guid.NewGuid().ToString().ToUpper();
              }
            ]]>
          </GuidGenFunction>
        </PropertyGroup>

    Now we add in our target for generating the GUID:

        <Target Name="AssemblyInfo">
          <Script Language="C#" Code="$(GuidGenFunction)">
            <Output TaskParameter="ReturnValue" PropertyName="NewGuid" />
          </Script>
          <Time Format="yy">
            <Output TaskParameter="FormattedTime" PropertyName="year" />
          </Time>
          <Time Format="Mdd">
            <Output TaskParameter="FormattedTime" PropertyName="daymonth" />
          </Time>
          <AssemblyInfo CodeLanguage="CS" OutputFile="Properties\AssemblyInfo.cs"
                        AssemblyTitle="name" AssemblyDescription="description"
                        AssemblyCompany="none" AssemblyProduct="product"
                        AssemblyCopyright="Copyright ©"
                        ComVisible="false" CLSCompliant="true" Guid="$(NewGuid)"
                        AssemblyVersion="$(Major).$(year).$(daymonth).$(Revision)"
                        AssemblyFileVersion="$(Major).$(year).$(daymonth).$(Revision)" />
        </Target>

    This will give us an AssemblyInfo.cs file like this just prior to calling the Build task:

        using System;
        using System.Reflection;
        using System.Runtime.CompilerServices;
        using System.Runtime.InteropServices;

        [assembly: AssemblyTitle("name")]
        [assembly: AssemblyDescription("description")]
        [assembly: AssemblyCompany("none")]
        [assembly: AssemblyProduct("product")]
        [assembly: AssemblyCopyright("Copyright ©")]
        [assembly: ComVisible(false)]
        [assembly: CLSCompliant(true)]
        [assembly: Guid("9C2C130E-40EF-4A20-B7AC-A23BA4B5F2B7")]
        [assembly: AssemblyVersion("0.12.524.407")]
        [assembly: AssemblyFileVersion("0.12.524.407")]

    This gives us the correct version for the assembly, which can be referenced within your project, whether web or Windows based, like this:

        public static string AppVersion()
        {
            return Assembly.GetExecutingAssembly().GetName().Version.ToString();
        }

    As mentioned in previous posts in this series, you can label css and javascript files using this version number and the GetAssemblyIdentity task from the main MSBuild task library built into the .NET Framework:

        <GetAssemblyIdentity AssemblyFiles="bin\TheAssemblyFile.dll">
          <Output TaskParameter="Assemblies" ItemName="MyAssemblyIdentities" />
        </GetAssemblyIdentity>

    Then use this to write out the files:

        <WriteLinesToFile
            File="Client\site-style-%(MyAssemblyIdentities.Version).combined.min.css"
            Lines="@(CSSLinesSite)" Overwrite="true" />

    Read the article

  • List of Microsoft training kits (2012)

    - by DigiMortal
    Some years ago I published a list of Microsoft training kits for developers. Now it's time to make a little refresh and list the training kits currently available. Feel free to refer to additional training kits in the comments.

        - SharePoint 2010 Developer Training Kit
        - SharePoint 2010 and Windows Phone 7 Training Kit
        - SharePoint and Windows Azure Development Kit
        - Visual Studio 2010 and .NET Framework 4 Training Kit (December 2011)
        - SQL Server 2012 Developer Training Kit
        - SQL Server 2008 R2 Update for Developers Training Kit (May 2011 Update)
        - SharePoint and Silverlight Training Kit
        - Windows Phone 7 Training Kit for Developers - RTM Refresh
        - Windows Phone 7.5 Training Kit
        - Silverlight 4 Training
        - Web Camps Training Kit
        - Identity Developer Training Kit
        - Internet Explorer 10 Training Kit
        - Visual Studio LightSwitch Training Kit
        - Office 2010 Developer Training Kit - June 2011
        - Office 365 Developer Training Kit - June 2011 Update
        - Dynamics CRM 2011 Developer Training Kit
        - PHP on Windows and SQL Server Training Kit (March 2011 Update)
        - Windows Server AppFabric Training Kit
        - Windows Server 2008 R2 Developer Training Kit - July 2009

    Read the article

  • New MoReq standard for records management under development - contribution phase commencing shortly

    - by shahid.rashid
    The DLM Forum is creating a new MoReq specification, MoReq2010, and Oracle will be contributing to this. We also highly encourage those of you in compliance, records management, and archiving (particularly those based outside the US) to participate in the development and review of the standard - the time commitment can be as little or as much as you please. The contribution phase is to commence this month with review planned in August. The official announcement from the DLM Forum and details on how to participate are located here.

    Read the article

  • SSIS Expression Tester Tool

    - by Davide Mauri
    Thanks to my friend Doug's blog I've found a very nice tool made by fellow MVP Darren Green which really helps to make SSIS developers' lives easier: http://expressioneditor.codeplex.com/Wikipage?ProjectName=expressioneditor

    In brief, the tool allows the testing of SSIS expressions so that you can evaluate and test them before using them in SSIS packages. Cool and useful!

    Read the article

  • How do I set up pairing email addresses?

    - by James A. Rosen
    Our team uses the Ruby gem hitch to manage pairing. You set it up with a group email address (e.g. [email protected]) and then tell it who is pairing:

        $ hitch james tiffany

    Hitch then sets your Git author configuration so that our commits look like:

        commit 629dbd4739eaa91a720dd432c7a8e6e1a511cb2d
        Author: James and Tiffany <[email protected]>
        Date:   Thu Oct 31 13:59:05 2013 -0700

    Unfortunately, we've only been able to come up with two options:

        1. [email protected] doesn't exist. The downside is that if Travis CI tries to notify us that we broke the build, we don't see it.
        2. [email protected] does exist and forwards to all the developers. Now the downside is that everyone gets spammed with every broken build by every pair.

    We have too many possible pairs to do any of the following:

        - set up actual [email protected] email addresses or groups (n^2 email addresses)
        - set up forwarding rules for [email protected] (n^2 forwarding rules)
        - set up forwarding rules for [email protected] (n forwarding rules for each of n developers)

    Does anyone have a system that works for them?

    Read the article

  • Microsoft Team Foundation Server 2010 Service Pack 1

    - by javarg
    Last week Microsoft released the first Service Pack for Team Foundation Server 2010. Several issues have been fixed and included in this patch; check out the list of fixes here. Cool stuff has shipped with this new release, such as the expected Project Server integration.

    PS: note that these annoying bugs have been fixed:

        - Team Explorer: When you use a Visual Studio 2005 or a Visual Studio 2008 client, you encounter a red "X" on the reporting node of Team Explorer.
        - Source Control: You receive the error "System.IO.IOException: Unable to read data from the transport connection: The connection was closed." when you try to download a source

    Read the article

  • VS 2010 SP1 BETA – App.config XML Transformation Fix

    - by João Angelo
    The current version of the App.config XML transformations, as described in a previous post, does not support the SP1 BETA version of Visual Studio. I did some quick tests and decided to provide a different version with a compatibility fix for those already experimenting with the beta release of the service pack. This is a quick workaround to the build errors I found when using the transformations in SP1 beta, and it is pretty much untested since I'll wait for the final release of the service pack to take a closer look at any possible problems. But for now, those who have already installed SP1 beta can use the following transformations:

        VS 2010 SP1 BETA – App.config XML Transformation

    And those with the RTM release of Visual Studio can continue to use the original version of the transformations, available from:

        VS 2010 RTM – App.config XML Transformation

    Read the article

  • Who Are the BI Users in Your Neighborhood?

    - by Brian Dayton
    Forrester's Boris Evelson recently wrote a blog titled "Who are the BI Personas?" that I enjoyed for a number of reasons. It's a quick read, easy to grasp and (refreshingly) focuses on the users of technology versus the technology itself. As Evelson admits, he meant to keep the reference chart at a high level because there are too many different permutations and additional sub-categories to make such a chart useful.

    For me, I wouldn't head into the technical permutations but more the contextual use of BI and the issues that users experience. My thoughts brought up more questions than answers, such as:

    Context:

        - HOW: With the exception of the "Power User" persona - likely some sort of business or operations analyst?
        - WHEN: Are they using the information to make real-time decisions on the front lines (a customer service manager or shipping/logistics VP) or are they using this information for cumulative analysis and business planning? Or both?
        - WHERE: What areas of the business are more or less likely to rely on BI across an organization? Human Resources, Operations, Facilities, Finance - and why are some more prone to use data-driven analysis than others?

    Issues:

        - DELAYS & DRAG ON IT: One of the persona characteristics Evelson calls out is a reliance on IT. Every persona except for the "Power User" has a heavy reliance on IT for support. What business issues or delays does that cause for users? What is the drag on IT resources who could potentially be creating instead of reporting?
        - HOW MANY CLICKS: If BI is being used within the context of a transaction (a sales manager looking for upsell opportunities, for example) is that person getting the information within the context of that action or transaction? Or are they minimizing screens, logging into another application or reporting tool, running queries, etc.?

    Who are the BI users in your neighborhood or line of business? Do Evelson's personas resonate, and do the tools that he calls out (he refers to it as "BI Style") resonate with what your personas have or need? Finally, I'm very interested in whether BI use is viewed as a bolt-on... or an integrated part of your daily enterprise processes?

    Read the article

  • BIP BIServer Query Debug

    - by Tim Dexter
    With some help from Bryan, I have uncovered a way of being able to debug, or at least log, what BIServer is doing when BIP sends it a query request. This is not for those of you querying the database directly, but for those using the BIServer and its data model to fetch data for a BIP report. If you have written or used the query builder against BIServer, and when you run the report it chokes with a cryptic message that you have no clue about, read on.

    When BIP runs a piece of BIServer logical SQL to fetch data, it does not appear to validate it, it just passes it through. So what is BIServer doing on its end? As you may know, you are not writing regular physical SQL, it's actually logical SQL, e.g.

        select Jobs."Job Title" as "Job Title",
               Employees."Last Name" as "Last Name",
               Employees.Salary as Salary,
               Locations."Department Name" as "Department Name",
               Locations."Country Name" as "Country Name",
               Locations."Region Name" as "Region Name"
        from HR.Locations Locations, HR.Employees Employees, HR.Jobs Jobs

    The tables might not even be physical tables; we don't care, that's what the BIServer and its model are for. You have put all the effort into building the model, just go get me the data from wherever it might be. The BIServer takes the logical SQL and uses its vast brain to work out what the physical SQL is, executes it and passes the result back to BIP:

        select distinct T32556.JOB_TITLE as c1,
               T32543.LAST_NAME as c2,
               T32543.SALARY as c3,
               T32537.DEPARTMENT_NAME as c4,
               T32532.COUNTRY_NAME as c5,
               T32577.REGION_NAME as c6
        from JOBS T32556, REGIONS T32577, COUNTRIES T32532,
             LOCATIONS T32569, DEPARTMENTS T32537, EMPLOYEES T32543
        where ( T32532.COUNTRY_ID = T32569.COUNTRY_ID
            and T32532.REGION_ID = T32577.REGION_ID
            and T32537.DEPARTMENT_ID = T32543.DEPARTMENT_ID
            and T32537.LOCATION_ID = T32569.LOCATION_ID
            and T32543.JOB_ID = T32556.JOB_ID )

    Not a very tough example, I know, but you get the idea. How do I know what the BIServer is up to? How can I find out what the issue might be if BIServer chokes on my query? There are a couple of steps:

        1. In the Administrator tool you need to set the logging level for the Administrator user to something greater than the default '0'. '7' is going to give you the max. Just remember to take it back down after you have finished the debug. I needed to bounce my BIServer service.
        2. Now here's the secret sauce. Prefix the following to your BIP query, setting the log level to the one you have in the admin tool:

               set variable LOGLEVEL = 7;

    Now run your BIP report. With the prefix in place, BIServer will write to the NQQuery.log file. This is located in the ./OracleBI/server/Log directory. In there you are going to find the complete process the BIServer has gone through to try and get the data back for you.

    A quick note: if the BIServer can, it's going to hit that great BIEE cache to get your data and you may not see the full log. If this is the case, get into the Administration page (via the browser login) and clear out your BIP report cursor, then re-run.

    This will hopefully help out if you are trying to debug that annoying BIP report that will not run or is getting some strange data. Don't forget to turn that logging level back down once you are done; this will avoid the DBA screaming at you for sucking up all the disk space on the system.

    Read the article

  • Ardour won't rewind when jack time master

    - by Edward
    Using Ubuntu Studio 12.04, Ardour will not rewind when it is set as the JACK time master. I've read that this could be due to a jack/ardour version conflict, but I am not sure what the correct combo should be. The same thing happens with "ardour 2.8.14 (built from revision 13065)" and "ardour 2.8.12 (built from revision 10144)"; the latter is the default installation with Ubuntu Studio 12.04 LTS.

    Linux "/proc/version" reports:

        Linux version 3.2.0-23-lowlatency-pae (buildd@vernadsky) (gcc version 4.6.3 (Ubuntu/Linaro 4.6.3-1ubuntu4) ) #31-Ubuntu SMP PREEMPT Wed Apr 11 04:07:36 UTC 2012

    and "jackd --version" reports:

        jackdmp 1.9.8
        Copyright 2001-2005 Paul Davis and others.
        Copyright 2004-2011 Grame.
        jackdmp comes with ABSOLUTELY NO WARRANTY
        This is free software, and you are welcome to redistribute it
        under certain conditions; see the file COPYING for details
        jackdmp version 1.9.8 tmpdir /dev/shm protocol 8

    Thanks for any help.

    Read the article

  • Planet feed aggregator for django

    - by marcog
    We are looking for a way to integrate a feed aggregator (planet) into a Django site. Ideally, the planet should integrate as part of a page of the site as a whole, rather than being a standalone page like all the other planets I've seen. We could use an iframe, but then the style won't match. The best approach might be something that just returns a raw list of the last N feed items, which we then insert into a template. Does anyone have any suggestions of how we can achieve this?

    Read the article

  • Have SSIS' differing type systems ever caused you problems?

    - by jamiet
    One thing that has always infuriated me about SSIS is the fact that every package has three different type systems; to give you an idea of what I am talking about consider the following:

        - The SSIS dataflow's type system is made up of types called DT_* (e.g. DT_STR, DT_I4)
        - The SSIS variable type system is based on .NET datatypes (e.g. String, Int32)
        - The types available for the Execute SQL Task's parameters are based on something else - I don't know exactly what (e.g. VARCHAR, LONG)

    Speaking euphemistically... this is not an optimum situation (were I not speaking euphemistically I would be a lot ruder) and hence I have submitted a suggestion to Connect at [SSIS] Consolidate three type systems into one, requesting that it be remedied. This accompanying blog post is not, however, a request for votes (though that would be nice); the reason is actually subtler than that. Let me explain.

    I have been submitting bugs and suggestions pertaining to SSIS for years and have, so far, submitted over 200 Connect items. If that experience has taught me anything it is this: Connect items are not generally actioned because they are considered "nice to have". No, SSIS Connect items get actioned because they cause customers grief, and if I am perfectly honest I must admit that, other than being a bit gnarly, SSIS' three-type-system architecture has never knowingly caused me any significant problems.

    The reason for this blog post is to ask if any reader out there has ever encountered problems on account of SSIS' three type systems, or have you, like me, never found them to be a problem? Errors or performance degradation caused by implicit type conversions would, I believe, present a strong case for getting this situation remedied in a future version of SSIS, so if you HAVE encountered such problems I would encourage you to leave a comment on the Connect submission accordingly. Let me know in the comments too - I would be interested to hear others' opinions on this.

    @Jamiet
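
    PS: to make "three type systems" a little more concrete, here is a small illustrative sketch (not from any real package) of a dataflow script component where the first two type systems meet. The column names, their DT_* types and the Int32 RunningTotal read/write variable are all assumptions made up for the example.

        // Fragment of a ScriptMain in a dataflow script component.
        // AccountCode is assumed to be DT_STR/DT_WSTR, Amount DT_NUMERIC,
        // and RunningTotal an Int32 read/write package variable.
        private int runningTotal;

        public override void Input0_ProcessInputRow(Input0Buffer Row)
        {
            // DT_STR / DT_WSTR columns surface inside the script as System.String
            string accountCode = Row.AccountCode;

            // DT_NUMERIC surfaces as System.Decimal; pushing it towards the
            // Int32 package variable means an explicit (and potentially lossy) cast
            if (accountCode != null)
            {
                runningTotal += (int)Row.Amount;
            }
        }

        public override void PostExecute()
        {
            // Package variables (the second, .NET-based type system) that are
            // marked read/write can only be assigned in PostExecute
            Variables.RunningTotal = runningTotal;
            base.PostExecute();
        }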

    Read the article

  • iOS: Using Jenkins for nightly internal builds (TestFlight), plus frequent client builds

    - by Amy Worrall
    I'm an iOS dev, working for a small agency. I'm currently on a few smaller projects where I'm the only developer. We recently acquired a Jenkins server, but each project is left to fend for itself as to how to use it. I'd like to use it for making and distributing builds. My ideal would be:

        - Every commit is built as a single IPA that is placed in an HTTP-accessible location. (It only needs to keep the latest one, otherwise the disk would fill up - some of our apps are 500MB or more.)
        - Once a day it makes a build, signs it with our internal provisioning profile, tacks a build number onto the end of the version number, and sends it to our internal TestFlight team.
        - When manually prompted, it builds the latest commit, gives it a manually specified version number, signs it with the client's provisioning profile and sends it to TestFlight.

    I'm pretty new to Jenkins. The developer who set up the server is running it on one of our projects, so I know it has the right stuff installed to do Xcode builds. I believe he's only using it to run unit tests though, not to do any of the code signing, IPA creation or TestFlight stuff. So my questions:

        1. I've listed three distinct kinds of build. How does Jenkins cope with that? I see there's a "build triggers" section in the config for a Jenkins project, but it doesn't seem to mention different types of build. Should I just set up multiple Jenkins projects, called "App X (continuous)", "App X (nightly)" and "App X (client)"?
        2. How do I specify the provisioning profile through Jenkins? If there isn't a way, I guess I could make different configurations in the Xcode project…
        3. Has anyone else used Jenkins to actually do the release (i.e. build and push to TestFlight) of beta builds of their apps? Is it a good idea? Or should I continue just doing it manually?

    Read the article

  • DTLoggedExec 1.0 Stable Released!

    - by Davide Mauri
    After several years of development I've finally released the first non-beta version of DTLoggedExec! I'm now very confident that the product is stable and solid and has all the features that are important to have (at least for me).

    DTLoggedExec 1.0
    http://dtloggedexec.codeplex.com/releases/view/44689

    Here are the release notes:

        - FIRST NON-BETA RELEASE! :)
        - Code cleaned up
        - Added SetPackageInfo method to ILogProvider interface to make future improvements easier
        - Deprecated the arguments 'ProfileDataFlow', 'ProfilePath', 'ProfileFileName'
        - Added the new argument 'ProfileDataFlowFileName' that replaces the old 'ProfileDataFlow', 'ProfilePath', 'ProfileFileName' arguments
        - Updated database scripts to support new reports
        - Split releases into three different packages for easier maintenance and updates: DTLoggedExec Executable, Samples & Reports
        - Fixed Issue #25738 (http://dtloggedexec.codeplex.com/WorkItem/View.aspx?WorkItemId=25738)
        - Fixed Issue #26479 (http://dtloggedexec.codeplex.com/WorkItem/View.aspx?WorkItemId=26479)

    To make things easier to maintain I've divided the original package into three different releases. One is the DTLoggedExec executable; samples and reports are now available in separate packages so that I can update them more frequently without having to touch the engine. Source code of everything is available through source code control: http://dtloggedexec.codeplex.com/SourceControl/list/changesets

    As usual, comments and feedback are more than welcome! (Just use CodePlex, please, so it will be easier for me to keep track of requests and issues.)

    Read the article

  • Embedding BIP just got easier!

    - by Tim Dexter
    First, to make up for yesterday's documentation faux pas, when I referenced Kan's blog as a source for the debug feature over Leslie's fine official documentation that had come out some time before Kan's blog: apologies again Leslie!

    I noticed another new feature that I was unaware of in the New Features guide for 10.1.3.4.1: the ability to remove the BI Publisher header from the user interface. Useful? If you want to host BIP inside another application, such as a portal or custom app, you can make the BIP UI look much more integrated. By default you still get our 'blue' look and feel, but I have documented elsewhere how you can change that. For instance, you can now have BIP hosted inside a BIEE instance and provide access to all the reports a user might wish to run with very little effort, rather than picking and choosing what to bubble up to the dashboard.

    How do you do it? Get on over to the New Features guide and find out; there are some other goodies there too. Note to self, RTFM!

    Read the article

  • Entity Framework 4.0: Optimal and horrible SQL

    - by DigiMortal
    Lately I had an Entity Framework 4.0 session where I introduced the new features of Entity Framework. During the session I found out, together with the audience, how Entity Framework 4.0 can generate optimized SQL. After the session I also showed one horrible example of how awful the SQL generated by Entity Framework can be. In this posting I will cover both examples.

    Optimal SQL

    Before going to the code, take a look at the model: there is a class called Event and I will use this class in my query. Here is the LINQ to Entities query that uses a small anonymous type:

        var query = from e in _context.Events
                    select new { Id = e.Id, Title = e.Title };

        Debug.WriteLine(((ObjectQuery)query).ToTraceString());

    Running this code gives us the following SQL:

        SELECT
            [Extent1].[event_id] AS [event_id],
            [Extent1].[title] AS [title]
        FROM [dbo].[events] AS [Extent1]

    This is really small - no additional fields in the SELECT clause. Nice, isn't it?

    Horrible SQL

    Ayende Rahien's blog shows us the darker side of Entity Framework 4.0 queries. You can find a comparison between NHibernate, LINQ to SQL and LINQ to Entities in the posting "What happens behind the scenes: NHibernate, Linq to SQL, Entity Framework scenario analysis". In this posting I will show you the resulting query and let you think about how much better it could be done. Well, it is not something we want to see running on our servers. I hope the EF team improves the generated SQL to an acceptable level before Visual Studio 2010 is released.

    There is also a moral to this example: you should always check the queries that your O/R mapper generates. Behind the curtains it may silently generate queries that perform badly, and in that case you need to optimize your data querying strategy.

    Conclusion

    Entity Framework 4.0 is a new product with a lot of new features and it is clear that not everything is 100% super in its first release. But it is still a great step forward and I hope that on 12.04.2010 we have a new, promising O/R mapper available to use in our projects. If you want to read more about Entity Framework 4.0 and Visual Studio 2010 then please feel free to follow this link to the list of my Visual Studio 2010 and .NET Framework 4.0 postings.

    Read the article

  • Beat the Post-Holiday Blues with a dose of BIWA

    - by mdonohue
    You know it's coming, so why not plan ahead? Come and join like-minded professionals at BIWA Summit 2013. Early bird registration ends December 14th.

    This event, focused on Business Intelligence, Data Warehousing and Analytics, is hosted by the BIWA SIG of the IOUG on January 9 and 10 at the Hotel Sofitel, near Oracle headquarters in Redwood City, California. Be sure to check out the many featured speakers, including Oracle executives Balaji Yelamanchili, Vaishnavi Sashikanth, and Tom Kyte, and Ari Kaplan, sports analyst, as well as the many other speakers. Hands-on labs will give you the opportunity to try out much of the Oracle software for yourself - be sure to bring a laptop capable of running Windows Remote Desktop. Check out the Schedule page for the list of over 40 sessions on all sorts of BIWA-related topics. See the BIWA Summit 2013 web site for details and be sure to register soon, while early bird rates still apply.

    Klaus and Nikos will be presenting the ever popular Getting the Best Performance from your Business Intelligence Publisher Reports and Implementation, and we will run two sessions of the BI Publisher Hands On Lab for building Reports and Data Models. Hope to see you there.

    Read the article

  • How to communicate within a company what is being Continually Deployed

    - by Francis Spor
    I work for a small development company - 20 people total in the entire company, 3 in actual development - and we've adopted CD for our commits to trunk. It works great from a code management and uptime perspective. However, we're getting flak from our support staff and marketing department that they don't feel they're getting enough lead time on new features, or notification of bug fixes that could change behavior.

    Part of why we love the CD system is that, for us in development, it's fast: we fix the bug, add the quick feature, close the Bugz and move on with our day to the next item. All members of our company are now on HipChat at all times, and when a deployment occurs, a message is sent to a room that all company members are in, letting them know what was just deployed (it just shows the commit messages from tip back to the last recorded deployment). We in development are also attempting to make sure that when we're making a change that modifies the UI or a public-facing behavior, we post a screenshot to the All Company room and explain what the behavior change is, seeking pushback or concerns. Often, the response is silence. Sometimes it's a few minor questions, but nothing that need stop the deployment from happening.

    What I'm wondering is how other users of the CD method deal with notifications of new features and changes for the areas of the company that are not development - and eventually on to customers in the world?

    Thanks,
    Francis

    Read the article
