Search Results

Search found 3086 results on 124 pages for 'biztalk 2009'.


  • Regex for date

    - by Harikrishna
    What should the regex be for matching a date in any of these formats: 26FEB2009, 30 Jul 2009, 27 Mar 2008, 29/05/2008, 27 Aug 2009? Edit: I have a regex that matches 26-Feb-2009 and 26 FEB 2009, but not 26FEB2009, so if anyone knows the fix, please update it:

    (?:^|[^\d\w:])(?'day'\d{1,2})(?:-?st\s+|-?th\s+|-?rd\s+|-?nd\s+|-|\s+)(?'month'Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)[uarychilestmbro]*(?:\s*,?\s*|-)(?:'?(?'year'\d{2})|(?'year'\d{4}))(?=$|[^\d\w])
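
    One possible fix, sketched in C# since the (?'name'...) group syntax above is .NET-specific: make the day/month separator optional. The pattern and test harness below are illustrative assumptions, not a complete validator (the numeric 29/05/2008 form is easier to handle with a second, separate pattern):

        using System;
        using System.Text.RegularExpressions;

        class DateRegexSketch
        {
            static void Main()
            {
                // [-\s]* makes the separator optional, so "26FEB2009" matches
                // alongside "30 Jul 2009" and "27-Mar-2008". [a-z]* swallows the
                // tail of a spelled-out month name ("February", "August", ...).
                string pattern =
                    @"\b(?'day'\d{1,2})[-\s]*" +
                    @"(?'month'Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)[a-z]*" +
                    @"[-\s]*,?\s*(?'year'\d{4}|\d{2})\b";

                foreach (string s in new[] { "26FEB2009", "30 Jul 2009", "27 Mar 2008", "27 Aug 2009" })
                {
                    Match m = Regex.Match(s, pattern, RegexOptions.IgnoreCase);
                    Console.WriteLine("{0} -> day={1} month={2} year={3}", s,
                        m.Groups["day"].Value, m.Groups["month"].Value, m.Groups["year"].Value);
                }
            }
        }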


  • PL/SQL Sum by hour

    - by Steve
    Hi, I have some data with start and stop times that I need to sum by hour, and I am not sure how to code for it. Here is the data I have to work with:

    STARTTIME,STOPTIME,EVENTCAPACITY
    8/12/2009 1:15:00 PM,8/12/2009 1:59:59 PM,100
    8/12/2009 2:00:00 PM,8/12/2009 2:29:59 PM,100
    8/12/2009 2:30:00 PM,8/12/2009 2:59:59 PM,80
    8/12/2009 3:00:00 PM,8/12/2009 3:59:59 PM,85

    In this example I would need the sums for 1pm to 2pm, 2pm to 3pm, and 3pm to 4pm. Any suggestions are appreciated. Steve
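
    One hedged sketch of an approach, assuming Oracle and a table named events holding the rows above: truncate STARTTIME to the hour and sum per bucket. This works for the sample because every row starts and stops within a single hour; rows spanning an hour boundary would have to be split first.

        -- Bucket each row by the hour its interval starts in, then total capacity.
        -- Table and column names are assumptions based on the sample data.
        SELECT TRUNC(starttime, 'HH24') AS hour_start,
               SUM(eventcapacity)       AS total_capacity
        FROM   events
        GROUP  BY TRUNC(starttime, 'HH24')
        ORDER  BY hour_start;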


  • Update WCF service reference

    - by SteveC
    In BizTalk 2006 R2, is there some way to regenerate the reference to a WCF service that was created using the "Add Generated Items / Consume WCF Service" option? I tried just re-running the wizard, but it creates new ODX, BindingInfo, etc. files and breaks the solution, so that's not the way :-( I've searched the web but found no references to updating an existing reference, just plenty on creating one from scratch.


  • How do I group this MySQL query

    - by moustafa
    I want to make a charts system, and I think the output must look like this:

    1 jan 2009 = 10 posts
    2 jan 2009 = 2 posts
    4 jan 2009 = 10 posts
    6 jan 2009 = 60 posts

    I have a posts table with id, user_id, and date columns. How can I select from posts to get results like that?
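
    A hedged sketch of the usual approach, assuming MySQL and that date is a DATETIME column: group by the calendar day and count the rows.

        -- Posts per day; DATE() strips the time part so all posts from one day
        -- fall into the same group. Days with zero posts simply won't appear.
        SELECT DATE(`date`) AS post_day,
               COUNT(*)     AS post_count
        FROM   posts
        GROUP  BY DATE(`date`)
        ORDER  BY post_day;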


  • BtsTask to import policy

    - by Sean
    Hello, I am looking for a way to import a BRE-generated policy with its vocabularies into a BizTalk application from the command line (in order to script it), leveraging the BtsTask command-line tool. I've searched around but couldn't find a firm answer. Thank you.
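
    For what it's worth, BtsTask handles application resources rather than policies; as far as I recall, importing and deploying a policy programmatically goes through the Rules Engine deployment API, which is easy to wrap in a small console tool for scripting. A hedged sketch, with class and method names from the Microsoft.RuleEngine / Microsoft.BizTalk.RuleEngineExtensions assemblies as I remember them (verify before relying on it):

        using Microsoft.RuleEngine;
        using Microsoft.BizTalk.RuleEngineExtensions;

        class ImportPolicy
        {
            // Usage sketch: ImportPolicy <exported-policy.xml> <policyName> <major> <minor>
            static void Main(string[] args)
            {
                var driver = new RuleSetDeploymentDriver();

                // Imports the exported file (rule sets plus their vocabularies)
                // into the rule store and publishes them.
                driver.ImportAndPublishFileRuleStore(args[0]);

                // Deploys one published policy version.
                driver.DeployRuleSet(new RuleSetInfo(args[1],
                    int.Parse(args[2]), int.Parse(args[3])));
            }
        }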


  • How to find the latest row for each group of data

    - by Jason
    Hi All, I have a tricky problem that I'm trying to find the most effective method to solve. Here's a simplified version of my view structure.

    Table: Audits

    AuditID | PublicationID | AuditEndDate | AuditStartDate
    1       | 3             | 13/05/2010   | 01/01/2010
    2       | 1             | 31/12/2009   | 01/10/2009
    3       | 3             | 31/03/2010   | 01/01/2010
    4       | 3             | 31/12/2009   | 01/10/2009
    5       | 2             | 31/03/2010   | 01/01/2010
    6       | 2             | 31/12/2009   | 01/10/2009
    7       | 1             | 30/09/2009   | 01/01/2009

    There are three queries that I need from this: one to get all the data; the next to get only the history data (that is, everything except the latest data item by AuditEndDate); and the last to obtain only the latest data item (by AuditEndDate). There's an added layer of complexity in that I have a date restriction (on a per user/group basis), where certain user groups can only see between certain dates. You'll notice this in the where clauses as AuditEndDate <= blah and AuditStartDate >= blah.

    For each publication, select all the data available:

    select * from Audits
    where AuditEndDate <= '31/03/10' and AuditStartDate >= '06/06/2009';

    For each publication, select all the data but exclude the latest data available (by AuditEndDate):

    select * from Audits
    left join (select AuditID as aid, PublicationID as pid, max(AuditEndDate) as pend
               from Audits
               where AuditEndDate <= '31/03/2009' /* user restrict */
               group by pid) Ax on Ax.pid = Audits.PublicationID
    where pend != Audits.AuditEndDate
      and AuditEndDate <= '31/03/10' and AuditStartDate >= '06/06/2009'; /* user restrict */

    For each publication, select only the latest data available (by AuditEndDate):

    select * from Audits
    left join (select AuditID as aid, PublicationID as pid, max(AuditEndDate) as pend
               from Audits
               where AuditEndDate <= '31/03/2009' /* user restrict */
               group by pid) Ax on Ax.pid = Audits.PublicationID
    where pend = Audits.AuditEndDate
      and AuditEndDate <= '31/03/10' and AuditStartDate >= '06/06/2009'; /* user restrict */

    So at the moment, queries 1 and 3 work fine, but query 2 just returns all the data instead of the restricted set. Can anyone help me? Thanks, Jason
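
    For what it's worth, a hedged sketch of query 2 with the likely culprits removed: the subquery should not select AuditID alongside a per-publication aggregate, and the same user-restriction dates have to be applied inside the subquery so "latest" means the latest visible row (date literals kept in the question's own style):

        -- History rows only: join each audit to its publication's latest visible
        -- AuditEndDate and keep the rows that are NOT that latest one.
        select a.*
        from Audits a
        join (select PublicationID     as pid,
                     max(AuditEndDate) as pend
              from Audits
              where AuditEndDate <= '31/03/10'      /* user restrict */
                and AuditStartDate >= '06/06/2009'  /* user restrict */
              group by PublicationID) ax
          on ax.pid = a.PublicationID
        where a.AuditEndDate <> ax.pend
          and a.AuditEndDate <= '31/03/10'
          and a.AuditStartDate >= '06/06/2009';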


  • MS Excel: Can I link images using a relative path?

    - by Port Islander 2009
    I am working on an MS Excel document that contains a lot of (around 200) images. They are currently saved within the document, so the file becomes huge and working with it gets very slow. Linking the pictures without saving them works very well - I now have the Excel document and a folder "pictures" next to it that contains all my image files. However, when I move the document and the folder to a new location, all my pictures disappear. This seems to be because Excel saves the link information as absolute paths. (Update: Actually, according to this thread, Excel stores the link information as relative paths as well. Now I really don't know why my links break.) Is there a convenient way to save them as relative paths, or to have Excel automatically update the path information? Update: It's important that the images get displayed on the sheet and can be printed. I am working with Microsoft Excel for Mac 2008 and 2011. I really appreciate your help.


  • OutOfMemoryException Processing Large File

    - by Krip
    We are loading a large flat file into BizTalk Server 2006 (original release, not R2) - about 125 MB. We run a map against it and then take each row and make a call out to a stored procedure. We receive an OutOfMemoryException during orchestration processing; the Windows service restarts, again uses the full 2 GB of memory, and crashes again. The server is 32-bit and set to use the /3GB switch. Also, I've separated the flow into 3 hosts - one for receives, one for orchestrations, and a third for sends. Anyone have any suggestions for getting this file to process without error? Thanks, Krip


  • Adding Message Part dynamically in Receive Pipeline

    - by Sean
    Hello, I tried to create a custom pipeline component that takes a message and dynamically attaches an additional part (during the Disassemble stage). I haven't set up a send port, so I can see what BizTalk is trying to process: I can see only the body part; the additional part doesn't show up. This is the code I used:

    var part = pc.GetMessageFactory().CreateMessagePart();
    part.Data = new MemoryStream(new byte[] {1, 2, 3, 4, 5});
    inmsg.AddPart("another_part", part, false);

    Thank you.
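
    For reference, a hedged sketch of the same calls inside a pipeline component's Execute method (the component plumbing - interfaces, attributes, registration - is omitted; whether an extra part survives downstream depends on what the rest of the pipeline and the subscribers do with the message):

        using System.IO;
        using Microsoft.BizTalk.Component.Interop;
        using Microsoft.BizTalk.Message.Interop;

        public class AddPartComponent
        {
            // Invoked by the pipeline runtime for each message.
            public IBaseMessage Execute(IPipelineContext pc, IBaseMessage inmsg)
            {
                IBaseMessagePart part = pc.GetMessageFactory().CreateMessagePart();
                part.Data = new MemoryStream(new byte[] { 1, 2, 3, 4, 5 });

                // "false" means the new part is added without replacing the body part.
                inmsg.AddPart("another_part", part, false);
                return inmsg;
            }
        }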


  • What is the recommended way to split messages in send pipelines?

    - by ToxicAvenger
    I need to split a BizTalk message in the send pipeline. This is easy with disassemblers in receive pipelines, but doesn't work in send pipelines (which makes sense). So what is the recommended way to do it? The only easy way I see is to write the outbound message to a file, reprocess it using a receive pipeline with a disassembler, and then send the generated messages through an outbound pipeline. Honestly, I don't need the additional round trip through the MessageBox, but I don't want to create a custom send adapter either. Any other suggestions? Any easy way to save messages with multiple parts using the out-of-the-box file adapter?
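
    One alternative worth sketching, hedged and from memory of the "calling pipelines inside an orchestration" feature (Microsoft.XLANGs.Pipeline, available since BizTalk 2006 - verify the types before relying on this): run a receive pipeline with its disassembler directly from an orchestration expression shape, which avoids both the file round trip and a custom adapter. The pipeline and message names below are assumptions:

        // XLANG# inside an orchestration (atomic scope); not plain C#.
        // msgBatch is the multi-record inbound message, msgSingle an output message.
        Microsoft.XLANGs.Pipeline.ReceivePipelineOutputMessages outMsgs =
            Microsoft.XLANGs.Pipeline.XLANGPipelineManager.ExecuteReceivePipeline(
                typeof(MyProject.MySplittingReceivePipeline), msgBatch);

        while (outMsgs.MoveNext())
        {
            // Each iteration yields one disassembled message; in the real
            // orchestration this sits inside a Construct shape for msgSingle.
            msgSingle = null;
            outMsgs.GetCurrent(msgSingle);
            // ... send msgSingle ...
        }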


  • How to create conditional If / Else logic in a BizTalk map

    How to create conditional logic in a BizTalk map using out-of-the-box functoids. The example takes in an XML file containing films and their receipts and creates a destination file whose structure is dependent on the incoming data. By BiZTech Know


  • JMX Based Monitoring - Part Four - Business App Server Monitoring

    - by Anthony Shorten
    In the last blog entry I talked about the Oracle Utilities Application Framework V4 feature for monitoring and managing aspects of the Web Application Server using JMX. In this blog entry I am going to discuss a similar new feature that allows JMX to be used for management and monitoring of the Oracle Utilities business application server component. This feature is primarily focussed on performance tracking of the product.

    In the first release of Oracle Utilities Customer Care And Billing (V1.x, that is), we used Oracle Tuxedo as part of the architecture. In Oracle Utilities Application Framework V2.0 and above, we removed Tuxedo from the architecture. One of the features that some customers used within Tuxedo was the performance tracking ability. The idea was that you enabled performance logging on the individual Tuxedo servers and then used a utility named txrpt to produce a performance report. This report would list every service called, the number of times it was called, and the average response time. When I worked as a performance consultant, I used this report to identify badly performing services and also to gauge the overall performance characteristics of a site.

    When Tuxedo was removed from the architecture this information was also lost. While you can get some information from access.log and some MBeans supplied by the Web Application Server, it was not at the same granularity as txrpt, nor as useful. I am happy to say we have not only reintroduced this facility in Oracle Utilities Application Framework, but it is now accessible via JMX, and we have added more detail to the performance tracking. Most of this new design came from working with customers around the world, to make sure we introduced a new feature that not only satisfied their performance tracking needs but also allowed for finer-grained performance analysis.

    As with the Web Application Server, the Business Application Server JMX monitoring is enabled by specifying a JMX port number in the RMI Port number for JMX Business setting and initial credentials in the JMX Enablement System User ID and JMX Enablement System Password configuration options. These options are available using the configureEnv[.sh] -a utility. These credentials are shared across the Web Application Server and Business Application Server for authorization purposes. Once this information is supplied, a number of configuration files are built (by the initialSetup[.sh] utility) to configure the facility:

    · spl.properties - contains the JMX URL, the security configuration and the MBeans that are enabled. For example, on my demonstration machine:

      spl.runtime.management.rmi.port=6750
      spl.runtime.management.connector.url.default=service:jmx:rmi:///jndi/rmi://localhost:6750/oracle/ouaf/ejbAppConnector
      jmx.remote.x.password.file=scripts/ouaf.jmx.password.file
      jmx.remote.x.access.file=scripts/ouaf.jmx.access.file
      ouaf.jmx.com.splwg.ejb.service.management.PerformanceStatistics=enabled

    · ouaf.jmx.* files - contain the userid and password. The default configuration uses the JMX default security configuration. You can use additional security features by altering the spl.properties file manually or using a custom template. For more security options see JMX Security for more details.

    Once it has been configured and the changes reflected in the product using the initialSetup[.sh] utility, the JMX facility can be used. For illustrative purposes I will use jconsole, but any JSR160-compliant browser or client can be used (with the appropriate configuration).
    Once you start jconsole (ensure that splenviron[.sh] is executed prior to execution to set the environment variables; for a remote connection, ensure java is in your path and jconsole.jar is in your classpath), you specify the URL from the spl.runtime.management.connector.url.default entry.

    You are then able to track performance of the product using the PerformanceStatistics MBean. The attributes of the PerformanceStatistics MBean are counts of each object type. This is where this facility differs from txrpt. The information that is collected includes the following:

    · The Service Type is captured so you can filter the results in terms of the type of service. For maintenance-type services you can even see the transaction type (ADD, CHANGE etc.), so you can see the performance of updates against read transactions.
    · The Minimum and Maximum are also collected to give you an idea of the spread of performance.
    · The last call is recorded. The date, time and user of the last call are recorded to give you an idea of the timeliness of the data.
    · The MBean maintains a set of counters per Service Type to give you a summary of the types of transactions being executed. This gives you an overall picture of the types of transactions and volumes at your site.

    There are a number of interesting operations that can also be performed:

    · reset - This resets the statistics back to zero. This is an important operation. For example, txrpt is restricted to collecting statistics per hour, which is OK for most people. But what if you wanted to be more granular? This operation allows you to set the collection period to anything you wish. The statistics collected will represent values since the last restart or last reset.
    · completeExecutionDump - This is the operation that produces a CSV in memory to allow extraction of the data. All the statistics are extracted (see the Server Administration Guide for a full list). This can then be loaded into a database, a tool, or simply into your favourite spreadsheet for analysis. Here is an extract of an execution dump from my demonstration environment to give you an idea of the format:

      ServiceName, ServiceType, MinTime, MaxTime, Avg Time, # of Calls, Latest Time, Latest Date, Latest User
      ...
      CFLZLOUL, EXECUTE_LIST, 15.0, 64.0, 22.2, 10, 16.0, 2009-12-16::11-25-36-932, ASHORTEN
      CILBBLLP, READ, 106.0, 1184.0, 466.3333333333333, 6, 106.0, 2009-12-16::11-39-01-645, BOBAMA
      CILBBLLP, DELETE, 70.0, 146.0, 108.0, 2, 70.0, 2009-12-15::12-53-58-280, BPAYS
      CILBBLLP, ADD, 860.0, 4903.0, 2243.5, 8, 860.0, 2009-12-16::17-54-23-862, LELLISON
      CILBBLLP, CHANGE, 112.0, 3410.0, 815.1666666666666, 12, 112.0, 2009-12-16::11-40-01-103, ASHORTEN
      CILBCBAL, EXECUTE_LIST, 8.0, 84.0, 26.0, 22, 23.0, 2009-12-16::17-54-01-643, LJACKMAN
      InitializeUserInfoService, READ_SYSTEM, 49.0, 962.0, 70.83777777777777, 450, 63.0, 2010-02-25::11-21-21-667, ASHORTEN
      InitializeUserService, READ_SYSTEM, 130.0, 2835.0, 234.85777777777778, 450, 216.0, 2010-02-25::11-21-21-446, ASHORTEN
      MenuLoginService, READ_SYSTEM, 530.0, 1186.0, 703.3333333333334, 9, 530.0, 2009-12-16::16-39-31-172, ASHORTEN
      NavigationOptionDescriptionService, READ_SYSTEM, 2.0, 7.0, 4.0, 8, 2.0, 2009-12-21::09-46-46-892, ASHORTEN
      ...

    There are other operations and attributes available. Refer to the Server Administration Guide provided with your product to understand the full set of operations and attributes. This is one of the many features I am proud we implemented, as it allows flexible monitoring of the performance of the product.
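
    To give an idea of scripted (rather than jconsole) access, here is a hedged sketch of a plain JSR-160 client that connects with the credentials described above and invokes completeExecutionDump. The ObjectName is a hypothetical placeholder - take the real name from jconsole or the Server Administration Guide:

        import java.util.HashMap;
        import java.util.Map;
        import javax.management.MBeanServerConnection;
        import javax.management.ObjectName;
        import javax.management.remote.JMXConnector;
        import javax.management.remote.JMXConnectorFactory;
        import javax.management.remote.JMXServiceURL;

        public class PerfDump {
            public static void main(String[] args) throws Exception {
                // URL from spl.runtime.management.connector.url.default
                JMXServiceURL url = new JMXServiceURL(
                    "service:jmx:rmi:///jndi/rmi://localhost:6750/oracle/ouaf/ejbAppConnector");

                // Credentials as configured by configureEnv[.sh]
                Map<String, Object> env = new HashMap<String, Object>();
                env.put(JMXConnector.CREDENTIALS, new String[] { "JMX_USER", "JMX_PASSWORD" });

                JMXConnector jmxc = JMXConnectorFactory.connect(url, env);
                try {
                    MBeanServerConnection conn = jmxc.getMBeanServerConnection();
                    // Hypothetical ObjectName; the real one is product-specific.
                    ObjectName mbean = new ObjectName(
                        "com.splwg.ejb.service.management:name=PerformanceStatistics");
                    Object csv = conn.invoke(mbean, "completeExecutionDump",
                                             new Object[0], new String[0]);
                    System.out.println(csv);
                } finally {
                    jmxc.close();
                }
            }
        }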


  • BizTalk and SQL: Alternatives to the SQL receive adapter. Using Msmq to receive SQL data

    - by Leonid Ganeline
    If we have to get data from a SQL database, the standard way is to use a receive port with the SQL adapter. The SQL receive adapter is a solicit-response adapter: it periodically polls the SQL database with queries. That's the only way it can work. Sometimes this is undesirable. With the new WCF-SQL adapter we can use a more lightweight approach, but still on the same principle: the WCF-SQL adapter periodically solicits the database with queries to check for new records.

    Imagine the situation where new records can appear at very different rates, some at one-second intervals, others at intervals of several minutes. Our requirement is to process the new records ASAP. That means the polling interval should be near the shortest interval between new records, a one-second interval. As a result, most of the poll queries would return nothing and would load the database without good reason. If the database is working under heavy load, this is very undesirable.

    Do we have other choices? Sure. We can change the polling to "eventing". The good news is that SQL Server can raise an event when new records arrive, using triggers: got a new record - the trigger event is fired. No new records - no trigger events - no excessive load on the database. The bad news is that SQL Server doesn't have intrinsic methods to send the event data outside. For example, we would rather use adapters that listen for data and do not solicit. There are several such adapter-listeners: File, FTP, SOAP, WCF, and MSMQ. But SQL Server doesn't have methods to create and save files, to consume web services, or to create and send messages to a queue, does it? Can we use the File, FTP, MSMQ, and WCF adapters to get data from SQL code? Yes, we can. SQL Server 2005 and 2008 have the ability to run .NET code inside SQL code; see the SQL CLR Integration documentation. Here is how it works for MSMQ, for example:

    · A new record is created, and the trigger is fired.
    · The trigger calls the CLR stored procedure and passes the message parameters to it.
    · The CLR stored procedure creates a message and sends it to the outgoing queue on the SQL Server computer.
    · The MSMQ service transfers the message to the queue on the BizTalk Server computer.
    · The WCF-NetMsmq adapter receives the message from this queue.

    For the File adapter the idea is the same: the CLR stored procedure creates and stores a file with the message, and then the File adapter picks up this file.

    Using the WCF-NetMsmq adapter to get data from SQL

    I am describing the full set of the development and deployment steps for the case of the WCF-NetMsmq adapter.

    Development:
    1. Create the .NET code: a project, class, and method to create and send the message to the MSMQ queue.
    2. Create the SQL code in triggers to call the .NET code.

    Installation and Deployment:
    1. SQL Server:
       a. Register the CLR assembly with the .NET (CLR) code.
       b. Install the MSMQ Services.
    2. BizTalk Server:
       a. Install the MSMQ Services.
       b. Create the MSMQ queue.
       c. Create the WCF-NetMsmq receive port.

    The detailed description is below.

    Code

    .NET code

    using System.Xml;
    using System.Xml.Linq;
    using System.Xml.Serialization;

    // namespace MyCompany.MySolution.MyProject - doesn't work. The assembly name is MyCompany.MySolution.MyProject.
    // I gave up with the compound namespace. It seems the CLR Integration cannot work with it :-( Maybe I'm wrong.
    public class Event
    {
        static public XElement CreateMsg(int par1, int par2, int par3)
        {
            XNamespace ns = "http://schemas.microsoft.com/Sql/2008/05/TypedPolling/my_storedProc";
            XElement xdoc =
                new XElement(ns + "TypedPolling",
                    new XElement(ns + "TypedPollingResultSet0",
                        new XElement(ns + "TypedPollingResultSet0",
                            new XElement(ns + "par1", par1),
                            new XElement(ns + "par2", par2),
                            new XElement(ns + "par3", par3))));
            return xdoc;
        }
    }

    ////////////////////////////////////////////////////////////////////////

    using System.ServiceModel;
    using System.ServiceModel.Channels;
    using System.Transactions;
    using System.Data;
    using System.Data.Sql;
    using System.Data.SqlTypes;

    public class MsmqHelper
    {
        [Microsoft.SqlServer.Server.SqlProcedure]
        // msmqAddress as "net.msmq://localhost/private/myapp.myqueue";
        public static void SendMsg(string msmqAddress, string action, int par1, int par2, int par3)
        {
            using (TransactionScope scope = new TransactionScope(TransactionScopeOption.Suppress))
            {
                NetMsmqBinding binding = new NetMsmqBinding(NetMsmqSecurityMode.None);
                binding.ExactlyOnce = true;
                EndpointAddress address = new EndpointAddress(msmqAddress);

                using (ChannelFactory<IOutputChannel> factory = new ChannelFactory<IOutputChannel>(binding, address))
                {
                    IOutputChannel channel = factory.CreateChannel();
                    try
                    {
                        XElement xe = Event.CreateMsg(par1, par2, par3);
                        XmlReader xr = xe.CreateReader();
                        Message msg = Message.CreateMessage(MessageVersion.Default, action, xr);
                        channel.Send(msg);
                        // SqlContext.Pipe.Send(...); // to test
                    }
                    catch (Exception ex)
                    {
                        ...
                    }
                }
                scope.Complete();
            }
        }
    }

    SQL code in triggers

    -- sp_SendMsg was registered as a name for MsmqHelper.SendMsg()
    EXEC sp_SendMsg 'net.msmq://biztalk_server_name/private/myapp.myqueue', 'Create', @par1, @par2, @par3

    Installation and Deployment

    On the SQL Server

    Registering the CLR assembly

    1. Prerequisites: the .NET 3.5 SP1 Framework. This could be an issue for a production SQL Server!
    2. For more information, please see http://nielsb.wordpress.com/sqlclrwcf/
    3. Copy files:
       >copy "\Windows\Microsoft.net\Framework\v3.0\Windows Communication Foundation\Microsoft.Transactions.Bridge.dll" "\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\Microsoft.Transactions.Bridge.dll"
       If your machine is 64-bit, run two commands:
       >copy "\Windows\Microsoft.net\Framework\v3.0\Windows Communication Foundation\Microsoft.Transactions.Bridge.dll" "\Program Files (x86)\Reference Assemblies\Microsoft\Framework\v3.0\Microsoft.Transactions.Bridge.dll"
       >copy "\Windows\Microsoft.net\Framework64\v3.0\Windows Communication Foundation\Microsoft.Transactions.Bridge.dll" "\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\Microsoft.Transactions.Bridge.dll"
    4. Execute the SQL code to register the .NET assemblies:

       -- For x64 OS:
       CREATE ASSEMBLY SMdiagnostics AUTHORIZATION dbo FROM 'C:\Windows\Microsoft.NET\Framework\v3.0\Windows Communication Foundation\SMdiagnostics.dll' WITH permission_set = unsafe
       CREATE ASSEMBLY [System.Web] AUTHORIZATION dbo FROM 'C:\Windows\Microsoft.NET\Framework64\v2.0.50727\System.Web.dll' WITH permission_set = unsafe
       CREATE ASSEMBLY [System.Messaging] AUTHORIZATION dbo FROM 'C:\Windows\Microsoft.NET\Framework\v2.0.50727\System.Messaging.dll' WITH permission_set = unsafe
       CREATE ASSEMBLY [System.ServiceModel] AUTHORIZATION dbo FROM 'C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\v3.0\System.ServiceModel.dll' WITH permission_set = unsafe
       CREATE ASSEMBLY [System.Xml.Linq] AUTHORIZATION dbo FROM 'C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.5\System.Xml.Linq.dll' WITH permission_set = unsafe

       -- For x32 OS:
       --CREATE ASSEMBLY SMdiagnostics AUTHORIZATION dbo FROM 'C:\Windows\Microsoft.NET\Framework\v3.0\Windows Communication Foundation\SMdiagnostics.dll' WITH permission_set = unsafe
       --CREATE ASSEMBLY [System.Web] AUTHORIZATION dbo FROM 'C:\Windows\Microsoft.NET\Framework\v2.0.50727\System.Web.dll' WITH permission_set = unsafe
       --CREATE ASSEMBLY [System.Messaging] AUTHORIZATION dbo FROM 'C:\Windows\Microsoft.NET\Framework\v2.0.50727\System.Messaging.dll' WITH permission_set = unsafe
       --CREATE ASSEMBLY [System.ServiceModel] AUTHORIZATION dbo FROM 'C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\System.ServiceModel.dll' WITH permission_set = unsafe

    5. Register the assembly with the external stored procedure:

       CREATE ASSEMBLY [HelperClass] AUTHORIZATION dbo FROM '<FilePath>MyCompany.MySolution.MyProject.dll' WITH permission_set = unsafe

       where <FilePath> is the path of the file on this machine!

    6. Create the external stored procedure:

       CREATE PROCEDURE sp_SendMsg (
           @msmqAddress nvarchar(100),
           @Action NVARCHAR(50),
           @par1 int,
           @par2 int,
           @par3 int
       )
       AS EXTERNAL NAME HelperClass.MsmqHelper.SendMsg

    Installing the MSMQ Services

    1. Check whether the MSMQ service is NOT already installed. To check: Start / Administrative Tools / Computer Management; on the left pane open "Services and Applications" and search for "Message Queuing". If you cannot see it, follow the next steps.
    2. Start / Control Panel / Programs and Features.
    3. Click "Turn Windows Features on or off".
    4. Click Features, then click "Add Features".
    5. Scroll down the feature list; open "Message Queuing" / "Message Queuing Services"; and check the "Message Queuing Server" option.
    6. Click Next; click Install; wait for the installation to finish successfully.

    Creating the MSMQ queue

    We don't need to create the queue on the "sender" side.

    On the BizTalk Server

    Installing the MSMQ Services

    The same as for the SQL Server.

    Creating the MSMQ queue

    1. Start / Administrative Tools / Computer Management; on the left pane open "Services and Applications", open "Message Queuing", and open "Private Queues".
    2. Right-click "Private Queues"; choose New; choose "Private Queue".
    3. Type the queue name as 'myapp.myqueue' and check the "Transactional" option.

    Creating the WCF-NetMsmq receive port

    I will not go through this step in all its details; it is straightforward. The URI for this receive location should be 'net.msmq://localhost/private/myapp.myqueue'.

    Notes

    · The biggest problem is usually the "Registering the CLR assembly" step.
      It is hard to predict where the assemblies in the assembly list are located, or which version, x86 or x64, should be used. It is a pity the SQL-to-.NET integration is this rough.

    · In a couple of cases the new WCF-NetMsmq port was not able to work with the queue. Try replacing the WCF-NetMsmq port with a WCF-Custom port with netMsmqBinding; that worked fine for me.

    · To test how messages go through the queue you can turn on the Journal/Enabled option for the queue. I used the QueueExplorer utility to look at the messages in the Journal. Computer Management can also show the messages, but it shows only a small part of the message body and in a weird format. QueueExplorer does a better job; it shows the whole body, and XML messages appear in a nice color format.


  • Replace .sln with MSBuild and wrap contained projects into targets

    - by Filburt
    I'd like to create an MSBuild project that reflects the project dependencies in a solution and wraps the VS projects inside reusable targets. The problem I'd like to solve by doing this is to svn-export, build, and deploy a specific assembly (and its dependencies) in a BizTalk application. My question is: how can I make the targets for svn-exporting, building, and deploying reusable, and also reuse the wrapped projects when they are built for different dependencies? I know it would be simpler to just build the solution and deploy only the assemblies needed, but I'd like to reuse the targets as much as possible.

    The parts

    The project I'd like to deploy:

    <Project DefaultTargets="Deploy" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <PropertyGroup>
        <ExportRoot Condition="'$(Export)'==''">Export</ExportRoot>
      </PropertyGroup>
      <Target Name="Clean_Export">
        <RemoveDir Directories="$(ExportRoot)\My.Project.Dir" />
      </Target>
      <Target Name="Export_MyProject">
        <Exec Command="svn export svn://xxx/trunk/Biztalk2009/MyProject.btproj --force" WorkingDirectory="$(ExportRoot)" />
      </Target>
      <Target Name="Build_MyProject" DependsOnTargets="Export_MyProject">
        <MSBuild Projects="$(ExportRoot)\My.Project.Dir\MyProject.btproj" Targets="Build" Properties="Configuration=Release"></MSBuild>
      </Target>
      <Target Name="Deploy_MyProject" DependsOnTargets="Build_MyProject">
        <Exec Command="BTSTask AddResource -ApplicationName:CORE -Source:MyProject.dll" />
      </Target>
    </Project>

    The projects it depends upon look almost exactly like this (other .btproj and .csproj files).



  • select rows with unidentical column values using R

    - by Bazon
    Hi guys, I need to create a new data frame that excludes dams that appear in both the "dam1" and "dam2" columns on the same fosdate (fostering date). I tried df <- df[df$dam1!=df$dam2,] but it did not work. dam1 and dam2 are factors which are the IDs of mothers. My df:

    fosdate     dam1   dam2
    8/09/2009   2Z523  2Z523
    30/10/2009  1W509  5C080
    30/10/2009  1W509  5C640
    30/10/2009  1W509  1W509
    1/10/2009   1W311  63927

    The new data frame that I need to get is dfnew:

    fosdate     dam1   dam2
    30/10/2009  1W509  5C080
    30/10/2009  1W509  5C640
    1/10/2009   1W311  63927

    Would appreciate any help! Bazon
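
    For what it's worth, a hedged sketch of the usual fix: two factor columns with different level sets can misbehave under != (the comparison yields NAs, and NA rows break the subset), so compare the values as character strings. Column names are taken from the question:

        # Keep rows where the two dam IDs differ, comparing as plain strings;
        # as.character() sidesteps the "level sets of factors are different" error.
        dfnew <- df[as.character(df$dam1) != as.character(df$dam2), ]
        dfnew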


  • Business Case for investing time developing Stubs and BizUnit Tests

    - by charlie.mott
    I was recently in a position where I had to justify why effort should be spent developing stubbed integration tests for BizTalk solutions. These tests are usually developed using the BizUnit framework. I assumed that most seasoned BizTalk developers would consider this best practice. Even though Microsoft suggests the use of BizUnit on MSDN, I've not found a single site listing the justifications for investing time writing stubs and BizUnit tests.

    Stubs

    Stubs should be developed to isolate your development team from external dependencies. This is described by Michael Stephenson here. Failing to do this can result in the following problems:

    · In contract-first scenarios, the external system interface will have been defined, but the interface may not have been set up or even developed yet for the BizTalk developers to work with.
    · By the time you open the target location to see the data BizTalk has sent, it may have been swept away.
    · If you are relying on the UI of the target system to see the data BizTalk has sent, what do you do if it fails to arrive? It may take time for the data to be processed, or it may be scheduled to be processed later.
    · Learning how to use the source/target systems, and investigating where things go wrong in those systems, will slow down the BizTalk development effort.
    · By the time the data is visible in a UI it may have undergone further transformations.
    · In larger development teams working together, do you all use the same source and target instances? How do you know which data was created by whose tests? How do you know which event log error messages are whose? Another developer may have "cleaned up" your data.
    · It is harder to write BizUnit tests that clean up the data/logs after each test run.
    · What if your B2B partners' source or target system cannot support the sort of testing you want to do? They may not even have a development or test instance that you can work with. Their single test instance may be used by the SIT/UAT teams.
    · There may be licensing costs to setting up instances of the external system.

    The stubs I like to use are generic stubs that can accept/return any message type. Usually I need to create one per protocol. They should be driven by BizUnit steps to: validate the data received; and select a response message (or an error response). Once built, they can be re-used for many integration tests and from project to project.

    I'm not saying that developers should never test against a real instance. Every so often, you still need to connect to real developer or test instances of the source and target endpoints/services. The interface developers may ask you to send them some data to see if everything still works. Or you might want some messages sent to BizTalk to get confidence that everything still works beyond BizTalk.

    Tests

    Automated "stubbed integration tests" are usually built using the BizUnit framework. These facilitate testing of the entire integration process from source stub to target stub, ensuring that all of the BizTalk components are configured together correctly to meet all the requirements. More fine-grained unit testing of individual BizTalk components is still encouraged, but BizUnit provides by far the easiest way to test some component types (e.g. orchestrations).

    Using BizUnit with the Behaviour Driven Development approach described by Mike Stephenson delivers the following benefits (source: http://biztalkbddsample.codeplex.com - Video 1):
    · Requirements can be easily defined using Given/When/Then
    · Requirements are close to the code, so they are easier to manage as features and scenarios
    · Requirements are defined in the domain language
    · The feature files can be used as part of the documentation
    · The documentation is accurate to the build of the code and can be published with a release
    · The scenarios are effective at documenting behaviour without being excessive
    · The scenarios are maintained with the code
    · There's an abstraction between the intention and implementation of tests, making them easier to understand
    · The requirements drive the testing

    These same tests can also be used to drive load testing, as described here.

    If you don't do this ...

    If you don't follow the above "stubbed integration tests" approach, the developer will need to manually trigger the tests. This has the following risks:

    · Developers are unlikely to check all the scenarios each time, and all the expected conditions each time.
    · After the developer leaves, these manual test steps may be lost. What test scenarios are there? What test messages did they use for each scenario?
    · There is no mechanism to prove adequate test coverage.

    A test team may attempt to automate integration test scenarios in a test environment by triggering tests from a source system UI. If this is a replacement for BizUnit tests, then it carries the following risks:

    · It moves the tests downstream, so problems will be found later in the process.
    · Testers may not check all the expected conditions within the BizTalk infrastructure, such as event logs, suspended messages, etc.
    · These automated tests may also get in the way of manual tests run on these environments.


  • BI and EPM Landscape

    - by frank.buytendijk
    Most of my blog entries are not about Oracle products, and most of the latest entries are about topics such as IT strategy and enterprise architecture. However, given my background at Gartner, and at Hyperion, I still keep a close eye on what's happening in BI and EPM. One important reason is that I believe there is significant competitive value for organizations getting BI and EPM right. Davenport and Harris wrote a great book called "Competing on Analytics", in which they explain this in a very engaging and convincing way. At Oracle we have defined the concept of "management excellence" that outlines what organizations have to do to keep or create a competitive edge. It's not only in the business processes, but also in the management processes. Recently, Gartner published its 2009 market shares report for BI, Analytics, and Performance Management. Gartner identifies the same three segments that Oracle does: (1) CPM Suites (Oracle refers not to Corporate Performance Management, but Enterprise Performance Management), (2) BI Platform, and (3) Analytic Applications & Performance Management. According to Gartner, Oracle's share is increasing with revenue growing by more than 5%. Oracle currently holds the #2 market share position in the overall BI Software space based on total BI software revenue. Source: Gartner Dataquest Market Share: Business Intelligence, Analytics and Performance Management Software, Worldwide, 2009; Dan Sommer and Bhavish Sood; Apr 2010 Gartner has ranked Oracle as #1 in the CPM Suites worldwide sub-segment based on total BI software revenue, and Oracle is gaining share with revenue growing by more than 6% in 2009. Source: Gartner Dataquest Market Share: Business Intelligence, Analytics and Performance Management Software, Worldwide, 2009; Dan Sommer and Bhavish Sood; Apr 2010 The Analytic Applications & Performance Management subsegment is more fragmented. It has for instance a very large "Other Vendors" category. The largest player traditionally is SAS. Analytic Applications are often meant for very specific analytic needs in very specific industry sectors. According to Gartner, from the large vendors, again Oracle is the one who is gaining the most share - with total BI software revenue growth close to 15% in 2009. Source: Gartner Dataquest Market Share: Business Intelligence, Analytics and Performance Management Software, Worldwide, 2009; Dan Sommer and Bhavish Sood; Apr 2010 I believe this shows Oracle's integration strategy is working. In fact, integration actually is the innovation. BI and EPM have been silo technology platforms and application suites way too long. Management and measuring performance should be very closely linked to strategy execution, which is the domain of other business application areas such as CRM, ERP, and Supply Chain. BI and EPM are not about "making better decisions" anymore, but are part of a tangible action framework. Furthermore, organizations are getting more serious about ecosystem thinking. They do not evaluate single tools anymore for different application areas, but buy into a complete ecosystem of hardware, software and services. The best ecosystem is the one that offers the most options, in environments where the uncertainty is high and investments are hard to reverse. The key to successfully managing such an environment is middleware, and BI and EPM become increasingly middleware intensive. 
In fact, given the horizontal nature of BI and EPM, sitting on top of all business functions and applications, you could call them "upperware". Many are active in the BI and EPM space. Big players can offer a lot, but there are always many areas that are covered by specialty vendors. Oracle openly embraces those technologies within the ecosystem as well. Complete, open and integrated still accurately describes the Oracle product strategy. frank


  • Hibernate exception, what's wrong? [Exception in thread "main" org.hibernate.InvalidMappingException]

    - by user195970
    I use NetBeans 6.7.1 to write a "hello world" with Hibernate, but I get some errors. Please help me, thank you very much.

    My exception:

    init:
    deps-module-jar:
    deps-ear-jar:
    deps-jar:
    Copying 1 file to F:\Documents and Settings\My Dropbox\DropboxNetBeanProjects\loginspring\build\web\WEB-INF\classes
    compile-single:
    run-main:
    Oct 25, 2009 2:44:05 AM org.hibernate.cfg.Environment <clinit>
    INFO: Hibernate 3.2.5
    Oct 25, 2009 2:44:05 AM org.hibernate.cfg.Environment <clinit>
    INFO: hibernate.properties not found
    Oct 25, 2009 2:44:05 AM org.hibernate.cfg.Environment buildBytecodeProvider
    INFO: Bytecode provider name : cglib
    Oct 25, 2009 2:44:05 AM org.hibernate.cfg.Environment <clinit>
    INFO: using JDK 1.4 java.sql.Timestamp handling
    Oct 25, 2009 2:44:05 AM org.hibernate.cfg.Configuration configure
    INFO: configuring from resource: /hibernate.cfg.xml
    Oct 25, 2009 2:44:05 AM org.hibernate.cfg.Configuration getConfigurationInputStream
    INFO: Configuration resource: /hibernate.cfg.xml
    Oct 25, 2009 2:44:06 AM org.hibernate.cfg.Configuration addResource
    INFO: Reading mappings from resource : hibernate/Tbluser.hbm.xml
    Oct 25, 2009 2:44:06 AM org.hibernate.util.XMLHelper$ErrorLogger error
    SEVERE: Error parsing XML: XML InputStream(1) Document is invalid: no grammar found.
    Oct 25, 2009 2:44:06 AM org.hibernate.util.XMLHelper$ErrorLogger error
    SEVERE: Error parsing XML: XML InputStream(1) Document root element "hibernate-mapping", must match DOCTYPE root "null".
    Exception in thread "main" org.hibernate.InvalidMappingException: Could not parse mapping document from resource hibernate/Tbluser.hbm.xml
            at org.hibernate.cfg.Configuration.addResource(Configuration.java:569)
            at org.hibernate.cfg.Configuration.parseMappingElement(Configuration.java:1587)
            at org.hibernate.cfg.Configuration.parseSessionFactory(Configuration.java:1555)
            at org.hibernate.cfg.Configuration.doConfigure(Configuration.java:1534)
            at org.hibernate.cfg.Configuration.doConfigure(Configuration.java:1508)
            at org.hibernate.cfg.Configuration.configure(Configuration.java:1428)
            at org.hibernate.cfg.Configuration.configure(Configuration.java:1414)
            at hibernate.CreateTest.main(CreateTest.java:22)
    Caused by: org.hibernate.InvalidMappingException: Could not parse mapping document from invalid mapping
            at org.hibernate.cfg.Configuration.addInputStream(Configuration.java:502)
            at org.hibernate.cfg.Configuration.addResource(Configuration.java:566)
            ... 7 more
    Caused by: org.xml.sax.SAXParseException: Document is invalid: no grammar found.
            at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(ErrorHandlerWrapper.java:195)
            at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.error(ErrorHandlerWrapper.java:131)
            at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:384)
            at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:318)
            at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.scanStartElement(XMLNSDocumentScannerImpl.java:250)
            at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl$NSContentDriver.scanRootElementHook(XMLNSDocumentScannerImpl.java:626)
            at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:3095)
            at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(XMLDocumentScannerImpl.java:921)
            at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:648)
            at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(XMLNSDocumentScannerImpl.java:140)
            at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:510)
            at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:807)
            at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:737)
            at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:107)
            at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1205)
            at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:522)
            at org.dom4j.io.SAXReader.read(SAXReader.java:465)
            at org.hibernate.cfg.Configuration.addInputStream(Configuration.java:499)
            ... 8 more
    Java Result: 1
    BUILD SUCCESSFUL (total time: 1 second)

    hibernate.cfg.xml:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE hibernate-configuration PUBLIC
        "-//Hibernate/Hibernate Configuration DTD 3.0//EN"
        "http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">
    <hibernate-configuration>
      <session-factory>
        <property name="hibernate.dialect">org.hibernate.dialect.MySQLDialect</property>
        <property name="hibernate.connection.driver_class">com.mysql.jdbc.Driver</property>
        <property name="hibernate.connection.url">jdbc:mysql://localhost:3306/hibernate</property>
        <property name="hibernate.connection.username">root</property>
      </session-factory>
    </hibernate-configuration>

    Tbluser.hbm.xml:

    <?xml version="1.0"?>
    <!DOCTYPE hibernate-mapping PUBLIC
        "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
        "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd">
    <!-- Generated Oct 25, 2009 2:37:30 AM by Hibernate Tools 3.2.1.GA -->
    <hibernate-mapping>
      <class name="hibernate.Tbluser" table="tbluser" catalog="hibernate">
        <id name="userId" type="java.lang.Integer">
          <column name="userID" />
          <generator class="identity" />
        </id>
        <property name="username" type="string">
          <column name="username" length="50" />
        </property>
        <property name="password" type="string">
          <column name="password" length="50" />
        </property>
        <property name="email" type="string">
          <column name="email" length="50" />
        </property>
        <property name="phone" type="string">
          <column name="phone" length="50" />
        </property>
        <property name="groupId" type="java.lang.Integer">
          <column name="groupID" />
        </property>
      </class>
    </hibernate-mapping>

    Tbluser.java:

    package hibernate;
    // Generated Oct 25, 2009 2:37:30 AM by Hibernate Tools 3.2.1.GA

    /**
     * Tbluser generated by hbm2java
     */
    public class Tbluser implements java.io.Serializable {

        private Integer userId;
        private String username;
        private String password;
        private String email;
        private String phone;
        private Integer groupId;

        public Tbluser() {
        }

        public Tbluser(String username, String password, String email, String phone, Integer groupId) {
            this.username = username;
            this.password = password;
            this.email = email;
            this.phone = phone;
            this.groupId = groupId;
        }

        public Integer getUserId() { return this.userId; }
        public void setUserId(Integer userId) { this.userId = userId; }
        public String getUsername() { return this.username; }
        public void setUsername(String username) { this.username = username; }
        public String getPassword() { return this.password; }
        public void setPassword(String password) { this.password = password; }
        public String getEmail() { return this.email; }
        public void setEmail(String email) { this.email = email; }
        public String getPhone() { return this.phone; }
        public void setPhone(String phone) { this.phone = phone; }
        public Integer getGroupId() { return this.groupId; }
        public void setGroupId(Integer groupId) { this.groupId = groupId; }
    }


  • Help using mod_jk to forward to backend app server

    - by ravun
    I had mod-jk working a while ago but after switching servers and modifying some files, it no longer works. I am using mod_jk-1.2.28 with JBoss 4.2.3 as the backend. In the JBoss server.xml file I have the AJP 1.3 connector defined on port 8009 and I am binding jboss to the server's new IP address. The app I am trying to forward to is deployed as: [TomcatDeployer] deploy, ctxPath=/ManualAlerts, warUrl=.../tmp/deploy/tmp8097651929280250028ManualAlertsApp.ear-contents/ManualAlerts-exp.war/ On the web server, I have worker.properties with a worker set for the JBoss address and port 8009. The mod-jk.conf has JkMount /ManualAlerts/* worker1. Shouldn't this forward all requests to the web server with the URL http://address/ManualAlerts/ to the backend app named ManualAlerts? The mod-jk.log shows: [Sat Oct 31 14:19:28 2009][30709:3086014224] [error] ajp_send_request::jk_ajp_common.c (1507): (worker1) connecting to backend failed. Tomcat is probably not started or is listening on the wrong port (errno=115) [Sat Oct 31 14:19:28 2009][30709:3086014224] [info] ajp_service::jk_ajp_common.c (2447): (worker1) sending request to tomcat failed (recoverable), because of error during request sending (attempt=2) [Sat Oct 31 14:19:28 2009][30709:3086014224] [error] ajp_service::jk_ajp_common.c (2466): (worker1) connecting to tomcat failed. [Sat Oct 31 14:19:28 2009][30709:3086014224] [info] service::jk_lb_worker.c (1384): service failed, worker worker1 is in error state [Sat Oct 31 14:19:28 2009][30709:3086014224] [info] service::jk_lb_worker.c (1464): All tomcat instances are busy or in error state [Sat Oct 31 14:19:28 2009][30709:3086014224] [error] service::jk_lb_worker.c (1469): All tomcat instances failed, no more workers left Running netstat -an on the app server shows jboss listening on 8009 and the local address is the app server's address. In the mod-jk.log it shows connect to (XXX.XXX.XXX.XXX:8009) failed, and the app-server address is correct here, too. I cannot figure out what's causing the issue.


  • Accessing an XML file using JavaScript and ASP.NET/VB code

    - by Bubba
    Am trying to read in data from an xml file but using javascript which is embedded into my asp.net|vb code. I am new to asp.net but coming from a programming background. so I declared the xml objects for the appropriate browsers, as well as the name of the local xml to read data from, I then start by appending the create the table tag and then append it to the div tag in hack5.aspx I declare the variable that will represent/ hold the xml returned data object. I then run a for loop , before creating a row tag and then appending it to the div tag in hack5.aspx I then create the a row tag and then appending it to the div tag in hack5.aspx | then create a TextNode which is passed to variable, then create a td and append to div . then lastly append the textnode to td this format is the same for creating another 13 td tags that are to hold the data. The main problem is when I run the script - I see nothing display on my screen . no errors are shown, but with your sample code runs smoothly. So the first file hack5.aspx is as follows: <%@ Page Language="VB" AutoEventWireup="false" CodeFile="hack5.aspx.vb" Inherits="_Default" %> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" > <head runat="server"> <title>Diplaying MessageBox from ASP.NET</title> </head> <body> <form id="form1" runat="server"> <div id="showtime" > </div> </form> </body> </html> The next file hack5.aspx.vb is as follows: Partial Class _Default Inherits System.Web.UI.Page Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load Dim scriptString as String = "<script language=JavaScript> if (window.XMLHttpRequest) " scriptString += " { " scriptString += " xhttp=new XMLHttpRequest(); " scriptString += " } " scriptString += " else " scriptString += " { " scriptString += " xhttp=new ActiveXObject('Microsoft.XMLHTTP'); " scriptString += " } " scriptString += " xhttp.open('GET','yes.xml',false); " scriptString += " xhttp.send(null);" scriptString += " xmlDoc= xhttp.responseXML; " scriptString += " var table1 = document.createElement('table'); " scriptString += " document.getElementById('showtime').appendChild(table1); " scriptString += " var x=xmlDoc.getElementsByTagName('Table'); " scriptString += " for (i=0;i<x.length;i++) " scriptString += " { " scriptString += " var assessment = document.createTextNode(x[i].getElementsByTagName('Assessment')[0].childNodes[0].nodeValue);" scriptString += " var row1 = document.createElement('tr'); " scriptString += " document.getElementById('showtime').appendChild(row1); " scriptString += " var column1 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column1); " scriptString += " column1.appendChild(assessment); " scriptString += " var Issue_Date = document.createTextNode(x[i].getElementsByTagName('Issue_Date')[0].childNodes[0].nodeValue);" scriptString += " var column2 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column2); " scriptString += " column2.appendChild(Issue_Date); " scriptString += " var Due_Date = document.createTextNode(x[i].getElementsByTagName('Due_Date')[0].childNodes[0].nodeValue);" scriptString += " var column3 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column3); " scriptString += " column3.appendChild(Due_Date); " scriptString += " var Interest = 
document.createTextNode(x[i].getElementsByTagName('Interest')[0].childNodes[0].nodeValue);" scriptString += " var column4 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column4); " scriptString += " column4.appendChild(Interest); " scriptString += " var Summary = document.createTextNode(x[i].getElementsByTagName('Summary')[0].childNodes[0].nodeValue);" scriptString += " var column5 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column5); " scriptString += " column5.appendChild(Summary);" scriptString += " var Amount_Due= document.createTextNode(x[i].getElementsByTagName('Amount_Due')[0].childNodes[0].nodeValue);" scriptString += " var column6 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column6); " scriptString += " column6.appendChild(Amount_Due);" scriptString += " var IEduty = document.createTextNode(x[i].getElementsByTagName('IEduty')[0].childNodes[0].nodeValue);" scriptString += " var column7 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column7); " scriptString += " column7.appendChild(IEduty);" scriptString += " var LEsurtax = document.createTextNode(x[i].getElementsByTagName('LEsurtax')[0].childNodes[0].nodeValue);" scriptString += " var column8 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column8); " scriptString += " column8.appendChild(LEsurtax);" scriptString += " var CEsurtax = document.createTextNode(x[i].getElementsByTagName('CEsurtax')[0].childNodes[0].nodeValue);" scriptString += " var column9 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column9); " scriptString += " column9.appendChild(CEsurtax);" scriptString += " var EXduty = document.createTextNode(x[i].getElementsByTagName('EXduty')[0].childNodes[0].nodeValue);" scriptString += " var column10 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column10); " scriptString += " column10.appendChild(EXduty);" scriptString += " var IMvat = document.createTextNode(x[i].getElementsByTagName('IMvat')[0].childNodes[0].nodeValue);" scriptString += " var column11 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column11); " scriptString += " column11.appendChild(IMvat);" scriptString += " var SYSfee = document.createTextNode(x[i].getElementsByTagName('SYSfee')[0].childNodes[0].nodeValue);" scriptString += " var column12 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column12); " scriptString += " column12.appendChild(SYSfee);" scriptString += " var AItax = document.createTextNode(x[i].getElementsByTagName('AItax')[0].childNodes[0].nodeValue);" scriptString += " var column13 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column13); " scriptString += " column13.appendChild(AItax);" scriptString += " var Cduty = document.createTextNode(x[i].getElementsByTagName('Cduty')[0].childNodes[0].nodeValue);" scriptString += " var column14 = document.createElement('td'); " scriptString += " document.getElementById('showtime').appendChild(column14); " scriptString += " column14.appendChild(Cduty);" scriptString += " } " scriptString += " <" scriptString += "/" scriptString += "script>" If(Not 
ClientScript.IsStartupScriptRegistered("clientScript")) ClientScript.RegisterClientScriptBlock(Me.GetType(),"clientScript", scriptString) End If End Sub End Class And finally the xml file is as follows: <?xml version="1.0" encoding="utf-8" ?> <DataSet xmlns="http://tempuri.org/"> <xs:schema id="NewDataSet" xmlns="" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata"> <xs:element name="NewDataSet" msdata:IsDataSet="true" msdata:UseCurrentLocale="true"> <xs:complexType> <xs:choice minOccurs="0" maxOccurs="unbounded"> <xs:element name="Table"> <xs:complexType> <xs:sequence> <xs:element name="UserName" type="xs:string" minOccurs="0" /> <xs:element name="Password" type="xs:string" minOccurs="0" /> <xs:element name="UserLevel" type="xs:string" minOccurs="0" /> <xs:element name="FName" type="xs:string" minOccurs="0" /> <xs:element name="LName" type="xs:string" minOccurs="0" /> <xs:element name="Branch" type="xs:string" minOccurs="0" /> <xs:element name="Department" type="xs:string" minOccurs="0" /> </xs:sequence> </xs:complexType> </xs:element> </xs:choice> </xs:complexType> </xs:element> </xs:schema> <diffgr:diffgram xmlns:msdata="urn:schemas-microsoft-com:xml-msdata" xmlns:diffgr="urn:schemas-microsoft-com:xml-diffgram-v1"> <NewDataSet xmlns=""> <Table diffgr:id="Table1" msdata:rowOrder="0"> <Assessment>CHR/A157/2009</Assessment> <Issue_Date>20/10/2009</Issue_Date> <Due_Date>01/11/2009</Due_Date> <Interest>2.00</Interest> <Summary>BENTLEY 2009</Summary> <Amount_Due>28000000.00</Amount_Due> <IEduty>3000000.00</IEduty> <LEsurtax>4000000.00</LEsurtax> <CEsurtax>5000000.00</CEsurtax> <EXduty>0.00</EXduty> <IMvat>5000000.00</IMvat> <SYSfee>8000000.00</SYSfee> <AItax>2000000.00</AItax> <Cduty>1000000.00</Cduty> </Table> <Table diffgr:id="Table1" msdata:rowOrder="1"> <Assessment>CHR/A167/2009</Assessment> <Issue_Date>20/10/2009</Issue_Date> <Due_Date>01/11/2009</Due_Date> <Interest>2.00</Interest> <Summary>BENTLEY 2009</Summary> <Amount_Due>24000000.00</Amount_Due> <IEduty>3000000.00</IEduty> <LEsurtax>4000000.00</LEsurtax> <CEsurtax>5000000.00</CEsurtax> <EXduty>0.00</EXduty> <IMvat>1000000.00</IMvat> <SYSfee>8000000.00</SYSfee> <AItax>2000000.00</AItax> <Cduty>1000000.00</Cduty> </Table> <Table diffgr:id="Table1" msdata:rowOrder="2"> <Assessment>CHR/A196/2009</Assessment> <Issue_Date>11/11/2009</Issue_Date> <Due_Date>21/11/2009</Due_Date> <Interest>2.00</Interest> <Summary>BENTLEY 2009</Summary> <Amount_Due>20000000.00</Amount_Due> <IEduty>3000000.00</IEduty> <LEsurtax>4000000.00</LEsurtax> <CEsurtax>5000000.00</CEsurtax> <EXduty>0.00</EXduty> <IMvat>1000000.00</IMvat> <SYSfee>4000000.00</SYSfee> <AItax>2000000.00</AItax> <Cduty>1000000.00</Cduty> </Table> </NewDataSet> </diffgr:diffgram> </DataSet>

    Read the article

  • Type of Blobs

    - by kaleidoscope
    With the release of the Windows Azure November 2009 CTP, there are now two types of blobs.
    Block Blob - available since PDC 2008, optimized for streaming workloads. [Max size allowed: 200 GB]
    Page Blob - new in the November 2009 CTP, optimized for random reads and writes. [Max size allowed: 1 TB]
    More details can be found at: http://geekswithblogs.net/IUnknown/archive/2009/11/16/azure-november-ctp-announced.aspx
    Amit, S
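
    The CTP API itself is long gone, but the block/page distinction survives in current SDKs (and the size limits have since grown). A minimal sketch with today's azure-storage-blob Python package, assuming a connection string in the AZURE_STORAGE_CONNECTION_STRING environment variable and an existing container named demo; note that page blob content must be a multiple of 512 bytes:

        import os
        from azure.storage.blob import BlobServiceClient

        service = BlobServiceClient.from_connection_string(
            os.environ["AZURE_STORAGE_CONNECTION_STRING"])
        container = service.get_container_client("demo")

        # Block blob: the default type, suited to streaming/sequential uploads.
        container.upload_blob("stream.bin", b"sequential payload",
                              blob_type="BlockBlob", overwrite=True)

        # Page blob: optimized for random reads/writes; 512-byte aligned content.
        container.upload_blob("random.vhd", b"\x00" * 512,
                              blob_type="PageBlob", overwrite=True)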

    Read the article

  • JVM throws OutOfMemory during gc though there is plenty of memory left...

    - by Shu L.
    I have my Java application configured to use 5 GB of memory, but I got an OutOfMemoryError out of the blue. I inspected the gc log and found plenty of memory left: the young generation occupies 4% of its allocated space, the tenured generation 5%, and the perm generation 43%. I am puzzled why the JVM throws an OutOfMemoryError at GC time. Does anyone know why this is happening? Your help is greatly appreciated.

    JVM memory and gc settings:

        -server -Xms5g -Xmx5g -Xss256k -XX:NewSize=2g -XX:MaxNewSize=2g -XX:+UseParallelOldGC -XX:+UseTLAB -XX:SurvivorRatio=8 -XX:TargetSurvivorRatio=90 -XX:+DisableExplicitGC

    gc.log:

        2009-09-19T03:34:59.741+0000: 92836.778: [GC Desired survivor size 152567808 bytes, new threshold 1 (max 15) [PSYoungGen: 1941492K->144057K(1947072K)] 3138022K->1340830K(5092800K), 0.1947640 secs] [Times: user=0.61 sys=0.01, real=0.19 secs]
        2009-09-19T03:35:29.918+0000: 92866.954: [GC Desired survivor size 152109056 bytes, new threshold 1 (max 15) [PSYoungGen: 1941625K->144049K(1948608K)] 3138398K->1341080K(5094336K), 0.1942000 secs] [Times: user=0.61 sys=0.01, real=0.20 secs]
        2009-09-19T03:35:56.883+0000: 92893.920: [GC Desired survivor size 156565504 bytes, new threshold 1 (max 15) [PSYoungGen: 1567994K->115427K(1915072K)] 2765026K->1312820K(5060800K), 0.1586320 secs] [Times: user=0.50 sys=0.01, real=0.16 secs]
        2009-09-19T03:35:57.042+0000: 92894.079: [GC Desired survivor size 179961856 bytes, new threshold 1 (max 15) [PSYoungGen: 115427K->0K(1898560K)] 1312820K->1313987K(5044288K), 0.0775650 secs] [Times: user=0.42 sys=0.19, real=0.08 secs]
        2009-09-19T03:35:57.120+0000: 92894.157: [Full GC [PSYoungGen: 0K->0K(1898560K)] [ParOldGen: 1313987K->159522K(3145728K)] 1313987K->159522K(5044288K) [PSPermGen: 20025K->19942K(40256K)], 0.5692300 secs] [Times: user=2.18 sys=0.05, real=0.57 secs]
        2009-09-19T03:35:57.690+0000: 92894.726: [GC Desired survivor size 197066752 bytes, new threshold 1 (max 15) [PSYoungGen: 0K->0K(1745728K)] 159522K->159522K(4891456K), 0.0072590 secs] [Times: user=0.01 sys=0.00, real=0.00 secs]
        2009-09-19T03:35:57.698+0000: 92894.734: [Full GC [PSYoungGen: 0K->0K(1745728K)] [ParOldGen: 159522K->158627K(3145728K)] 159522K->158627K(4891456K) [PSPermGen: 19942K->19934K(45504K)], 0.3280480 secs] [Times: user=1.46 sys=0.00, real=0.33 secs]

        Heap
         PSYoungGen      total 1745728K, used 87233K [0x00002aab73650000, 0x00002aabf3650000, 0x00002aabf3650000)
          eden space 1745664K, 4% used [0x00002aab73650000,0x00002aab78b80778,0x00002aabddf10000)
          from space 64K, 0% used [0x00002aabddf10000,0x00002aabddf10000,0x00002aabddf20000)
          to   space 192448K, 0% used [0x00002aabe7a60000,0x00002aabe7a60000,0x00002aabf3650000)
         ParOldGen       total 3145728K, used 158627K [0x00002aaab3650000, 0x00002aab73650000, 0x00002aab73650000)
          object space 3145728K, 5% used [0x00002aaab3650000,0x00002aaabd138d28,0x00002aab73650000)
         PSPermGen       total 45504K, used 19965K [0x00002aaaae250000, 0x00002aaab0ec0000, 0x00002aaab3650000)
          object space 45504K, 43% used [0x00002aaaae250000,0x00002aaaaf5cf668,0x00002aaab0ec0000)

    I am on 64-bit Linux and JRE 1.6.0_10:

        $ uname -a
        Linux x 2.6.24-etchnhalf.1-amd64 #1 SMP Tue Oct 14 03:11:45 UTC 2008 x86_64 GNU/Linux
        $ java -version
        java version "1.6.0_10"
        Java(TM) SE Runtime Environment (build 1.6.0_10-b33)
        Java HotSpot(TM) 64-Bit Server VM (build 11.0-b15, mixed mode)
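
    One way to sanity-check a log like this is to extract per-collection heap occupancy and see how full the heap actually was when the error hit. A minimal Python sketch, assuming a file named gc.log in exactly the ParallelOld minor-collection format shown above (Full GC lines would need a second pattern):

        import re

        # e.g. [PSYoungGen: 1941492K->144057K(1947072K)] 3138022K->1340830K(5092800K), 0.1947640 secs
        GC_LINE = re.compile(
            r"\[PSYoungGen: (\d+)K->(\d+)K\((\d+)K\)\] "
            r"(\d+)K->(\d+)K\((\d+)K\), ([\d.]+) secs")

        with open("gc.log") as log:
            for line in log:
                m = GC_LINE.search(line)
                if m:
                    heap_after, heap_cap = float(m.group(5)), float(m.group(6))
                    pause = float(m.group(7))
                    print(f"heap after GC: {heap_after / heap_cap:6.1%} "
                          f"of {heap_cap / 1024:.0f} MB, pause {pause:.3f}s")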

    Read the article

  • Repeat Customers Each Year (Retention)

    - by spazzie
    I've been working on this and I don't think I'm doing it right. |D Our database doesn't keep track of how many customers we retain, so we looked for an alternate method. It's outlined in this article. It suggests you fill in this table (the headers run out to 2012; I'm just saving space):

        Year | Number of Customers | Retained in 2009 | % Retained in 2009 | Retained in 2010 | % Retained in 2010 | ...
        2008
        2009
        2010
        2011
        2012
        Total

    It tells you to find the total number of customers you had in your starting year. To do this, I used this query, since our starting year is 2008:

        select YEAR(OrderDate) as 'Year', COUNT(distinct(billemail)) as Customers
        from dbo.tblOrder
        where OrderDate >= '2008-01-01' and OrderDate <= '2008-12-31'
        group by YEAR(OrderDate)

    At the moment we just differentiate our customers by email address. Then you have to search for the same customers who purchased again in later years (ours are 2009, 10, 11, and 12). I came up with this; it should find people who purchased in both 2008 and 2009:

        SELECT YEAR(OrderDate) as 'Year', COUNT(distinct(billemail)) as Customers
        FROM dbo.tblOrder o with (nolock)
        WHERE o.BillEmail IN (SELECT DISTINCT o1.BillEmail
                              FROM dbo.tblOrder o1 with (nolock)
                              WHERE o1.OrderDate BETWEEN '2008-1-1' AND '2009-1-1')
          AND o.BillEmail IN (SELECT DISTINCT o2.BillEmail
                              FROM dbo.tblOrder o2 with (nolock)
                              WHERE o2.OrderDate BETWEEN '2009-1-1' AND '2010-1-1')
          --AND o.OrderDate BETWEEN '2008-1-1' AND '2013-1-1'
          AND o.BillEmail NOT LIKE '%@halloweencostumes.com'
          AND o.BillEmail NOT LIKE ''
        GROUP BY YEAR(OrderDate)

    So I'm just finding the customers who purchased in both those years, and then running an independent query to find those who purchased in 2008 and 2010, then 08 and 11, and then 08 and 12. This one finds the 2008 and 2010 purchasers:

        SELECT YEAR(OrderDate) as 'Year', COUNT(distinct(billemail)) as Customers
        FROM dbo.tblOrder o with (nolock)
        WHERE o.BillEmail IN (SELECT DISTINCT o1.BillEmail
                              FROM dbo.tblOrder o1 with (nolock)
                              WHERE o1.OrderDate BETWEEN '2008-1-1' AND '2009-1-1')
          AND o.BillEmail IN (SELECT DISTINCT o2.BillEmail
                              FROM dbo.tblOrder o2 with (nolock)
                              WHERE o2.OrderDate BETWEEN '2010-1-1' AND '2011-1-1')
          --AND o.OrderDate BETWEEN '2008-1-1' AND '2013-1-1'
          AND o.BillEmail NOT LIKE '%@halloweencostumes.com'
          AND o.BillEmail NOT LIKE ''
        GROUP BY YEAR(OrderDate)

    So you see I have a different query for each year comparison, and they're all unrelated. In the end I'm just finding people who bought in 2008 and 2009, then a potentially different group that bought in 2008 and 2010, and so on. For this to be accurate, do I have to use the same group of 2008 buyers each time - that is, the ones who bought again in 2009, 2010, 2011, and 2012? This is where I'm worried and not sure how to proceed or even find such data. Any advice would be appreciated! Thanks!
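
    To the closing question: yes, retention is normally measured against a fixed base cohort, so the 2008 buyer set should be computed once and then intersected with each later year rather than re-derived per comparison. A minimal sketch of that logic in Python, with hypothetical sample rows standing in for SELECT BillEmail, YEAR(OrderDate) FROM dbo.tblOrder:

        from collections import defaultdict

        # (billing email, order year) pairs; made-up data for illustration.
        orders = [
            ("a@example.com", 2008), ("b@example.com", 2008), ("c@example.com", 2008),
            ("a@example.com", 2009), ("c@example.com", 2010), ("a@example.com", 2012),
        ]

        customers_by_year = defaultdict(set)
        for email, year in orders:
            customers_by_year[year].add(email)

        cohort = customers_by_year[2008]  # fixed 2008 base cohort
        print(f"2008 cohort: {len(cohort)} customers")
        for year in range(2009, 2013):
            retained = cohort & customers_by_year[year]
            pct = 100 * len(retained) / len(cohort) if cohort else 0
            print(f"retained in {year}: {len(retained)} ({pct:.1f}%)")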

    Read the article
