Search Results

Search found 25148 results on 1006 pages for 'distributed source contr'.


  • Automation testing tool for Regression testing of desktop application

    - by user285037
    Hi, I am working on a desktop application which uses Infragistics grids, and we need to automate its regression tests. QTP alone does not support this; we would need to buy an additional plug-in, which my company is not very interested in. Is there any open source tool for automating regression testing of a desktop application? The application is in .NET, but I do not think that makes much of a difference. Please suggest something; I had zeroed in on TestComplete, but again it is a licensed product. I need something open source.

    Read the article

  • SQLiteDataAdapter Fill exception

    - by Lirik
    I'm trying to use the OleDb CSV parser to load some data from a CSV file and insert it into a SQLite database, but I get an exception from the OleDbDataAdapter.Fill method and it's frustrating:

        An unhandled exception of type 'System.Data.ConstraintException' occurred in System.Data.dll
        Additional information: Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints.

    Here is the source code:

        public void InsertData(String csvFileName, String tableName)
        {
            String dir = Path.GetDirectoryName(csvFileName);
            String name = Path.GetFileName(csvFileName);

            using (OleDbConnection conn = new OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + dir + @";Extended Properties=""Text;HDR=No;FMT=Delimited"""))
            {
                conn.Open();
                using (OleDbDataAdapter adapter = new OleDbDataAdapter("SELECT * FROM " + name, conn))
                {
                    QuoteDataSet ds = new QuoteDataSet();
                    adapter.Fill(ds, tableName);   // <-- Exception here
                    InsertData(ds, tableName);     // <-- Inserts the data into my SQLite db
                }
            }
        }

        class Program
        {
            static void Main(string[] args)
            {
                SQLiteDatabase target = new SQLiteDatabase();
                string csvFileName = "D:\\Innovations\\Finch\\dev\\DataFeed\\YahooTagsInfo.csv";
                string tableName = "Tags";
                target.InsertData(csvFileName, tableName);
                Console.ReadKey();
            }
        }

    The "YahooTagsInfo.csv" file looks like this:

        tagId,tagName,description,colName,dataType,realTime
        1,s,Symbol,symbol,VARCHAR,FALSE
        2,c8,After Hours Change,afterhours,DOUBLE,TRUE
        3,g3,Annualized Gain,annualizedGain,DOUBLE,FALSE
        4,a,Ask,ask,DOUBLE,FALSE
        5,a5,Ask Size,askSize,DOUBLE,FALSE
        6,a2,Average Daily Volume,avgDailyVolume,DOUBLE,FALSE
        7,b,Bid,bid,DOUBLE,FALSE
        8,b6,Bid Size,bidSize,DOUBLE,FALSE
        9,b4,Book Value,bookValue,DOUBLE,FALSE

    I've tried the following:

    1. Removing the first line in the CSV file so it doesn't get mistaken for real data.
    2. Changing the TRUE/FALSE realTime flag to 1/0.
    3. 1 and 2 together (i.e. removed the first line and changed the flag).

    None of these things helped... One constraint is that the tagId is supposed to be unique. Here is what the table looks like in design view:

    Can anybody help me figure out what the problem is here?

    Update: I changed the HDR property from HDR=No to HDR=Yes and now it doesn't give me an exception:

        OleDbConnection conn = new OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + dir + @";Extended Properties=""Text;HDR=Yes;FMT=Delimited""");

    I assumed that if HDR=No and I removed the header (i.e. the first line), then it should work... strangely it didn't. In any case, I'm no longer getting the exception. The new problem is here:

        public void InsertData(QuoteDataSet data, String tableName)
        {
            using (SQLiteConnection conn = new SQLiteConnection(_connectionString))
            {
                conn.Open();
                using (SQLiteDataAdapter sqliteAdapter = new SQLiteDataAdapter("SELECT * FROM " + tableName, conn))
                {
                    Console.WriteLine("Num rows updated is " + sqliteAdapter.Update(data, tableName));
                }
            }
        }

    Now the "Num rows updated" is 0... any hints?
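    One possible explanation for the zero row count, offered only as an untested sketch and not a confirmed fix: by default DataAdapter.Fill calls AcceptChanges on the rows it loads, so they arrive in the Unchanged state and the later Update on the SQLite adapter finds nothing to insert. Setting AcceptChangesDuringFill to false on the OleDb adapter keeps the rows flagged as Added, and a SQLiteCommandBuilder can generate the missing InsertCommand. The sketch uses a plain DataSet instead of the question's typed QuoteDataSet, the SQLite connection string is a placeholder, and it assumes the CSV column names match the SQLite table's columns.

        using System;
        using System.Data;
        using System.Data.OleDb;
        using System.Data.SQLite;
        using System.IO;

        public class CsvToSqliteLoader
        {
            private readonly string _connectionString = "Data Source=quotes.db"; // placeholder

            public void InsertData(string csvFileName, string tableName)
            {
                string dir = Path.GetDirectoryName(csvFileName);
                string name = Path.GetFileName(csvFileName);

                using (var conn = new OleDbConnection(
                    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + dir +
                    @";Extended Properties=""Text;HDR=Yes;FMT=Delimited"""))
                using (var adapter = new OleDbDataAdapter("SELECT * FROM " + name, conn))
                {
                    // Keep the loaded rows in the Added state so a later Update() sees them.
                    adapter.AcceptChangesDuringFill = false;

                    var ds = new DataSet();
                    adapter.Fill(ds, tableName);
                    InsertData(ds, tableName);
                }
            }

            public void InsertData(DataSet data, string tableName)
            {
                using (var conn = new SQLiteConnection(_connectionString))
                using (var sqliteAdapter = new SQLiteDataAdapter("SELECT * FROM " + tableName, conn))
                using (var builder = new SQLiteCommandBuilder(sqliteAdapter))
                {
                    conn.Open();
                    // The command builder derives an InsertCommand from the SELECT, so rows
                    // still flagged as Added are written to the SQLite table (column names
                    // in the DataTable must match the table's columns).
                    Console.WriteLine("Num rows updated is " + sqliteAdapter.Update(data, tableName));
                }
            }
        }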

    Read the article

  • Silverlight 4 MediaElement play sound

    - by NealWalters
    I converted a local sound file to a resource, which generated this in my XAML:

        <UserControl.Resources>
            <my:Uri x:Key="SoundFiles">file:///c:/Audio/HebrewDemo/Shalom.wav</my:Uri>
        </UserControl.Resources>

    I did this by pasting a local disk mp3 filename into Source, then clicking the "dot" next to Source and choosing "Extract Value to Resource". When I run, it tells me that "Uri" is not valid, and sure enough, in IntelliSense I see other elements that start with "Uri" but not Uri by itself. In the real world I want to specify a dynamic mp3 file name. For example, I might have a database of foreign-language words used for flashcards, and I want to play a sound file from a URL. But I thought I would try to walk before running...
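    For the dynamic case described at the end, one option, sketched here assuming a Silverlight page whose root container is a Grid named LayoutRoot and using a placeholder URL, is to skip the static resource entirely and assign the MediaElement's Source in code-behind:

        using System;
        using System.Windows.Controls;

        public partial class MainPage : UserControl
        {
            public MainPage()
            {
                InitializeComponent();
            }

            // Play an arbitrary remote sound file, e.g. one pulled from a database of flashcard words.
            private void PlaySound(string url)
            {
                var player = new MediaElement
                {
                    AutoPlay = true,
                    Source = new Uri(url, UriKind.Absolute) // e.g. "http://example.com/audio/shalom.mp3" (placeholder)
                };
                LayoutRoot.Children.Add(player); // the MediaElement must be in the visual tree to play
            }
        }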

    Read the article

  • using java.util.Scanner to read a file byte by byte

    - by openidsucks
    I'm trying to read a one-line file character by character using java.util.Scanner. However, I'm getting this exception:

        Exception in thread "main" java.util.InputMismatchException: For input string: "contents of my file"
            at java.util.Scanner.nextByte(Scanner.java:1861)
            at java.util.Scanner.nextByte(Scanner.java:1814)
            at p008.main(p008.java:18) <-- line where I do scanner.nextByte()

    Here's my code:

        public static void main(String[] args) throws FileNotFoundException {
            File source = new File("file.txt");
            Scanner scanner = new Scanner(source);
            while (scanner.hasNext()) {
                System.out.println((char) scanner.nextByte());
            }
            scanner.close();
        }

    Does anyone have any ideas as to what I might be doing wrong? Edit: I realized I wrote hasNext() instead of hasNextByte(). However, if I do that it doesn't print out anything.

    Read the article

  • Grails, app-engine, jpa - TargetInvocationException

    - by John
    I'm trying to learn about grails with Google App Engine and JPA by following a few tutorials: http://www.morkeleb.com/2009/08/12/grails-and-google-appengine-beginners-guide/ http://inhouse32.appspot.com/index.html http://grails.org/plugin/app-engine I've got grails 1.3.0 RC 2, and App Engine SDK 1.3.3, and I'm using Windows 7. The steps that I try are: grails create-app appname cd appname grails install-plugin app-engine. I answer jpa when asked about jdo/jpa. It appears to install the gorm-jpa plugin automatically, although the tutorials all suggest installing gorm-jpa manually. grails install-plugin gorm-jpa (just in case) grails create-domain-class test.Person Edit the grails-app/domain/test/Person.groovy to add name and address fields: package test import javax.persistence.*; // import com.google.appengine.api.datastore.Key; @Entity class Person implements Serializable { @Id @GeneratedValue(strategy = GenerationType.IDENTITY) Long id @Basic String name @Basic String address static constraints = { id visible:false } } grails generate-all test.Person I get errors during this final step: C:\Users\John\Workspaces\STS\appname>grails generate-all test.Person Welcome to Grails 1.3.0.RC2 - http://grails.org/ Licensed under Apache Standard License 2.0 Grails home is set to: C:\Users\John\Downloads\grails-1.3.0.RC2\grails-1.3.0.RC2 Base Directory: C:\Users\John\Workspaces\STS\appname Resolving dependencies... Dependencies resolved in 493ms. Running script C:\Users\John\Downloads\grails-1.3.0.RC2\grails-1.3.0.RC2\scripts\GenerateAll.groovy Environment set to development [copy] Copied 4 empty directories to 2 empty directories under C:\Users\John\.grails\1.3.0.RC2\projects\appname\resources [copy] Copied 4 empty directories to 2 empty directories under C:\Users\John\.grails\1.3.0.RC2\projects\appname\resources [copy] Copied 1 empty directory to 1 empty directory under C:\Users\John\.grails\1.3.0.RC2\projects\appname\resources [mkdir] Created dir: C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF\classes [groovyc] Compiling 12 source files to C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF\classes Note: C:\Users\John\.grails\1.3.0.RC2\projects\appname\plugins\gorm-jpa-0.7.1\src\java\org\grails\jpa\domain\JpaGrailsDomainClass.java uses or overrides a deprecated API. Note: Recompile with -Xlint:deprecation for details. Note: Some input files use unchecked or unsafe operations. Note: Recompile with -Xlint:unchecked for details. 
[groovyc] Compiling 8 source files to C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF\classes [mkdir] Created dir: C:\Users\John\.grails\1.3.0.RC2\projects\appname\resources\grails-app\i18n [native2ascii] Converting 13 files from C:\Users\John\Workspaces\STS\appname\grails-app\i18n to C:\Users\John\.grails\1.3.0.RC2\projects\appname\resources\grails-app\i18n [mkdir] Created dir: C:\Users\John\.grails\1.3.0.RC2\projects\appname\resources\plugins\gorm-jpa-0.7.1\grails-app\i18n [mkdir] Created dir: C:\Users\John\.grails\1.3.0.RC2\projects\appname\resources\plugins\app-engine-0.8.10\grails-app\i18n [native2ascii] Converting 1 file from C:\Users\John\.grails\1.3.0.RC2\projects\appname\plugins\gorm-jpa-0.7.1\grails-app\i18n to C:\Users\John\.grails\1.3.0.RC2\projects\appname\resources\plugins\gorm -jpa-0.7.1\grails-app\i18n [native2ascii] Converting 1 file from C:\Users\John\.grails\1.3.0.RC2\projects\appname\plugins\app-engine-0.8.10\grails-app\i18n to C:\Users\John\.grails\1.3.0.RC2\projects\appname\resources\plugins\a pp-engine-0.8.10\grails-app\i18n [copy] Copying 1 file to C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF\classes [copy] Copying 2 files to C:\Users\John\.grails\1.3.0.RC2\projects\appname\resources [copy] Copied 2 empty directories to 2 empty directories under C:\Users\John\.grails\1.3.0.RC2\projects\appname\resources [copy] Copying 1 file to C:\Users\John\.grails\1.3.0.RC2\projects\appname [mkdir] Created dir: C:\Users\John\Workspaces\STS\appname\web-app\plugins\app-engine-0.8.10 [copy] Copying 1 file to C:\Users\John\Workspaces\STS\appname\web-app\plugins\app-engine-0.8.10 [copy] Copying 1 file to C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF [mkdir] Created dir: C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF\lib [copy] Copying 64 files to C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF\lib Configuring persistence for AppEngine [mkdir] Created dir: C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF\classes\META-INF [copy] Copying 1 file to C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF\classes\META-INF [mkdir] Created dir: C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF\plugins\app-engine-0.8.10 [copy] Copying 2 files to C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF\plugins\app-engine-0.8.10 [mkdir] Created dir: C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF\plugins\gorm-jpa-0.7.1 [copy] Copying 2 files to C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF\plugins\gorm-jpa-0.7.1 Packaging AppEngine jar files Enhancing JDO classes [enhance] DataNucleus Enhancer (version 1.1.4) : Enhancement of classes [enhance] DataNucleus Enhancer completed with success for 1 classes. Timings : input=589 ms, enhance=200 ms, total=789 ms. Consult the log for full details [groovyc] Compiling 1 source file to C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF\classes [copy] Copying 1 file to C:\Users\John\.grails\1.3.0.RC2\projects\appname [copy] Copying 1 file to C:\Users\John\Workspaces\STS\appname\web-app\WEB-INF Configuring persistence for AppEngine Packaging AppEngine jar files Enhancing JDO classes [enhance] DataNucleus Enhancer (version 1.1.4) : Enhancement of classes [enhance] DataNucleus Enhancer completed with success for 1 classes. Timings : input=585 ms, enhance=28 ms, total=613 ms. Consult the log for full details Generating views for domain class test.Person ... 
java.lang.reflect.InvocationTargetException at SimpleTemplateScript1.run(SimpleTemplateScript1.groovy:43) at _GrailsGenerate_groovy.generateForDomainClass(_GrailsGenerate_groovy:85) at _GrailsGenerate_groovy$_run_closure1.doCall(_GrailsGenerate_groovy:50) at GenerateAll$_run_closure1.doCall(GenerateAll.groovy:42) at gant.Gant$_dispatch_closure5.doCall(Gant.groovy:381) at gant.Gant$_dispatch_closure7.doCall(Gant.groovy:415) at gant.Gant$_dispatch_closure7.doCall(Gant.groovy) at gant.Gant.withBuildListeners(Gant.groovy:427) at gant.Gant.this$2$withBuildListeners(Gant.groovy) at gant.Gant$this$2$withBuildListeners.callCurrent(Unknown Source) at gant.Gant.dispatch(Gant.groovy:415) at gant.Gant.this$2$dispatch(Gant.groovy) at gant.Gant.invokeMethod(Gant.groovy) at gant.Gant.executeTargets(Gant.groovy:590) at gant.Gant.executeTargets(Gant.groovy:589) Caused by: java.lang.NoClassDefFoundError: org/hibernate/MappingException ... 15 more Caused by: java.lang.ClassNotFoundException: org.hibernate.MappingException at org.codehaus.groovy.tools.RootLoader.findClass(RootLoader.java:156) at java.lang.ClassLoader.loadClass(ClassLoader.java:307) at org.codehaus.groovy.tools.RootLoader.loadClass(RootLoader.java:128) at java.lang.ClassLoader.loadClass(ClassLoader.java:248) ... 15 more Error running generate-all: null What am I doing wrong?

    Read the article

  • Strange "INavigatorContent" error compiling in 4.0

    - by Stephano
    I've recently decided to try an upgrade to Flex 4.0. The only error I still can't work out is this one: "The children of Halo navigators must implement INavigatorContent". I seem to be getting it on all my ViewStacks that have validators:

        <mx:ViewStack xmlns:mx="http://www.adobe.com/2006/mxml">
            <mx:NumberValidator id="systolicValidator" source="{systolic}" required="true" property="text" minValue="10" maxValue="300" domain="int"/>
            <mx:NumberValidator id="diastolicValidator" source="{diastolic}" required="true" property="text" minValue="10" maxValue="200" domain="int"/>
            <mx:TextInput id="systolic"/>
            <mx:TextInput id="diastolic"/>
            ...

    The error gets thrown on the validator tags. My compiler is set to "Flex 3 compatibility mode" and my theme is set to Halo (the default). This seems like it should be a really straightforward fix, so I hate to spin my wheels on it for too long. Any ideas what I might be missing?

    Read the article

  • Programmatically determining which node in an XML document caused validation against its XML Schema to fail

    - by jd1212
    My input is a well-formed XML document and a corresponding XML Schema document. What I would like to do is determine the location within the XML document that causes it to fail validation against the XML Schema document. I could not figure out how to do this using the standard validation approach in Java:

        SchemaFactory schemaFactory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = schemaFactory.newSchema(... /* the .xsd source */);
        Validator validator = schema.newValidator();

        DocumentBuilderFactory ...
        DocumentBuilder ...
        Document document = DocumentBuilder.parse(... /* the .xml source */);

        try {
            validator.validate(new DOMSource(document));
            ...
        } catch (SAXParseException e) {
            ...
        }

    I have toyed with the idea of getting at least the line and column number from SAXParseException, but they're always set to -1, -1 on validation error.

    Read the article

  • Ball to Ball Collision - Detection and Handling

    - by Simucal
    With the help of the Stack Overflow community I've written a pretty basic, but fun, physics simulator. You click and drag the mouse to launch a ball. It will bounce around and eventually stop on the "floor". The next big feature I want to add is ball-to-ball collision. The ball's movement is broken up into x and y speed vectors. I have gravity (a small reduction of the y vector each step) and friction (a small reduction of both vectors on each collision with a wall). The balls honestly move around in a surprisingly realistic way. I guess my question has two parts: (1) What is the best method to detect ball-to-ball collision? Do I just have an O(n^2) loop that iterates over each ball and checks every other ball to see whether its radius overlaps? (2) What equations do I use to handle the ball-to-ball collisions (Physics 101)? How does it affect the two balls' x/y speed vectors? What is the resulting direction the two balls head off in? How do I apply this to each ball?

    Handling the collision detection of the "walls" and the resulting vector changes was easy, but I see more complications with ball-ball collisions. With walls I simply had to take the negative of the appropriate x or y vector and off it would go in the correct direction. With balls I don't think it is that way. Some quick clarifications: for simplicity I'm OK with a perfectly elastic collision for now; also, all my balls have the same mass right now, but I might change that in the future. In case anyone is interested in playing with the simulator I have made so far, I've uploaded the source here (EDIT: check the updated source below).

    Edit, resources I have found useful: 2D ball physics with vectors: 2-Dimensional Collisions Without Trigonometry.pdf; 2D ball collision detection example: Adding Collision Detection.

    Success! I have the ball collision detection and response working great! Relevant code for the collision detection:

        for (int i = 0; i < ballCount; i++) {
            for (int j = i + 1; j < ballCount; j++) {
                if (balls[i].colliding(balls[j])) {
                    balls[i].resolveCollision(balls[j]);
                }
            }
        }

    This will check for collisions between every ball but skip redundant checks (if you have to check whether ball 1 collides with ball 2, then you don't need to check whether ball 2 collides with ball 1; it also skips checking for collisions with itself).
    Then, in my ball class I have my colliding() and resolveCollision() methods:

        public boolean colliding(Ball ball) {
            float xd = position.getX() - ball.position.getX();
            float yd = position.getY() - ball.position.getY();
            float sumRadius = getRadius() + ball.getRadius();
            float sqrRadius = sumRadius * sumRadius;
            float distSqr = (xd * xd) + (yd * yd);
            if (distSqr <= sqrRadius) {
                return true;
            }
            return false;
        }

        public void resolveCollision(Ball ball) {
            // get the mtd
            Vector2d delta = (position.subtract(ball.position));
            float d = delta.getLength();
            // minimum translation distance to push balls apart after intersecting
            Vector2d mtd = delta.multiply(((getRadius() + ball.getRadius()) - d) / d);

            // resolve intersection --
            // inverse mass quantities
            float im1 = 1 / getMass();
            float im2 = 1 / ball.getMass();

            // push-pull them apart based off their mass
            position = position.add(mtd.multiply(im1 / (im1 + im2)));
            ball.position = ball.position.subtract(mtd.multiply(im2 / (im1 + im2)));

            // impact speed
            Vector2d v = (this.velocity.subtract(ball.velocity));
            float vn = v.dot(mtd.normalize());

            // sphere intersecting but moving away from each other already
            if (vn > 0.0f) return;

            // collision impulse
            float i = (-(1.0f + Constants.restitution) * vn) / (im1 + im2);
            Vector2d impulse = mtd.multiply(i);

            // change in momentum
            this.velocity = this.velocity.add(impulse.multiply(im1));
            ball.velocity = ball.velocity.subtract(impulse.multiply(im2));
        }

    Source code: complete source for the ball-to-ball collider. Binary: compiled binary in case you just want to try bouncing some balls around. If anyone has suggestions for how to improve this basic physics simulator, let me know! One thing I have yet to add is angular momentum, so the balls will roll more realistically. Any other suggestions? Leave a comment!

    Read the article

  • .Net Intermittent System.Web.Services.Protocols.SoapHeaderException

    - by ScottE
    We have a .net 3.5 web app that consumes third party web services. The proxy was created by adding a web reference to their wsdl. This proxy is not compiled. Our error logging is picking up frequent but intermittent exceptions: An exception of type 'System.Web.Services.Protocols.SoapHeaderException' occurred and was caught If I follow the url to the page that generated the exception, I can't recreate it. Edit: Here is most of the exception - where it bubbled up from Message : Internal Error Type : System.Web.Services.Protocols.SoapHeaderException, System.Web.Services, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a Source : System.Web.Services Help link : Actor : Code : http://schemas.xmlsoap.org/soap/envelope/:Client Detail : Lang : Node : Role : SubCode : Data : System.Collections.ListDictionaryInternal TargetSite : System.Object[] ReadResponse(System.Web.Services.Protocols.SoapClientMessage, System.Net.WebResponse, System.IO.Stream, Boolean) Stack Trace : at System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse(SoapClientMessage message, WebResponse response, Stream responseStream, Boolean asyncCall) at System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke(String methodName, Object[] parameters) at Vendor.getSearch(getSearchRequest getSearchRequest) in c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\root\be43c34e\b09edc7e\App_WebReferences.pww-cf-q.0.cs:line 73 Edit 2: Inner exceptions: I sometimes get the following inner exceptions logged: Message : Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host. Type : System.IO.IOException, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089 Source : System Help link : Data : System.Collections.ListDictionaryInternal TargetSite : Int32 Read(Byte[], Int32, Int32) Stack Trace : at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size) at System.Net.FixedSizeReader.ReadPacket(Byte[] buffer, Int32 offset, Int32 count) at System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest) at System.Net.Security.SslState.CheckCompletionBeforeNextReceive(ProtocolToken message, AsyncProtocolRequest asyncRequest) at System.Net.Security.SslState.StartSendBlob(Byte[] incoming, Int32 count, AsyncProtocolRequest asyncRequest) at System.Net.Security.SslState.ForceAuthentication(Boolean receiveFirst, Byte[] buffer, AsyncProtocolRequest asyncRequest) at System.Net.Security.SslState.ProcessAuthentication(LazyAsyncResult lazyResult) at System.Net.TlsStream.CallProcessAuthentication(Object state) at System.Threading.ExecutionContext.runTryCode(Object userData) at System.Runtime.CompilerServices.RuntimeHelpers.ExecuteCodeWithGuaranteedCleanup(TryCode code, CleanupCode backoutCode, Object userData) at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state) at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state) at System.Net.TlsStream.ProcessAuthentication(LazyAsyncResult result) at System.Net.TlsStream.Write(Byte[] buffer, Int32 offset, Int32 size) at System.Net.PooledStream.Write(Byte[] buffer, Int32 offset, Int32 size) at System.Net.ConnectStream.WriteHeaders(Boolean async) And/Or: Message : An existing connection was forcibly closed by the remote host Type : System.Net.Sockets.SocketException, System, Version=2.0.0.0, Culture=neutral, 
PublicKeyToken=b77a5c561934e089 Source : System Help link : ErrorCode : 10054 SocketErrorCode : ConnectionReset NativeErrorCode : 10054 Data : System.Collections.ListDictionaryInternal TargetSite : Int32 Receive(Byte[], Int32, Int32, System.Net.Sockets.SocketFlags) Stack Trace : at System.Net.Sockets.Socket.Receive(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags) at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size) Update We're still working on it. Originally there was a route issue, which was resolved. We're still getting the inner exception with socket errors. We had MS support involved today, and they looked at some traces and network captures. The web service host does round-robin DNS, and they may be responding on a different IP address for the syn syn/ack from one ip, and the next from a different ip. This is not good. This is likely quite specific to our situation, but perhaps it applies to others as well. Microsoft Network Monitor and an application trace got us the information we needed.
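    Since the update above points at the network rather than the client code, the sketch below is only a stopgap idea and not the fix that was eventually applied: a thin retry wrapper around the generated proxy call can keep the intermittent SoapHeaderException / connection-reset IOException from surfacing to users while the endpoint issue is chased down. The attempt count and delay are arbitrary, and the usage line assumes the question's getSearch proxy method.

        using System;
        using System.IO;
        using System.Threading;
        using System.Web.Services.Protocols;

        public static class SoapRetry
        {
            // Invoke a web-service proxy call, retrying a few times on transient faults.
            public static T Invoke<T>(Func<T> call, int attempts, int delayMs)
            {
                for (int i = 1; ; i++)
                {
                    try
                    {
                        return call();
                    }
                    catch (SoapHeaderException)
                    {
                        if (i >= attempts) throw; // give up after the last attempt
                    }
                    catch (IOException)
                    {
                        if (i >= attempts) throw; // e.g. connection forcibly closed by the remote host
                    }
                    Thread.Sleep(delayMs); // transient failure: wait, then try again
                }
            }
        }

        // Usage, assuming a generated proxy instance named vendorProxy:
        // var response = SoapRetry.Invoke(() => vendorProxy.getSearch(request), 3, 500);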

    Read the article

  • JSF Portlets in Liferay on JBoss

    - by JBirch
    I'm currently looking at working with and deploying JSF portlets into Liferay 6.0.5, sitting on JBoss 5.1.0. I ran into a lot of trouble trying to port some JSF-y/Seam-y/EJB-y stuff I had lying around, so I thought I'd start simple and work my way up. I could generate generic portlets using the NetBeans Maven archetype for Liferay portlets absolutely fine, but it's rather irrelevant because I wanted JSF portlets I took an example JSF portlet from http://www.liferay.com/downloads/liferay-portal/community-plugins/-/software_catalog/products/5546866 and attempted to deploy into a clean, vanilla installation of Liferay 6.0.5/JBoss 5.1.0 to no avail. The log messages are reproduced at the end of this. This particular example was actually tested for GlassFish and Tomcat, so it's not particularly helpful considering I'm deplying into JBoss. I tried ripping it apart and removing the jsf implementation contained within as there is a jsf implementation shipped with JBoss (Mojarra 1.2_12, in this case). 03:16:17,173 INFO [PortletAutoDeployListener] Copying portlets for /usr/local/[REDACTED]/liferay/liferay-portal-6.0.5/deploy/richfaces-sun-jsf1.2-facelets-portlet-1.2.war Expanding: /usr/local/[REDACTED]/liferay/liferay-portal-6.0.5/deploy/richfaces-sun-jsf1.2-facelets-portlet-1.2.war into /tmp/20110201031617188 Copying 1 file to /tmp/20110201031617188/WEB-INF Copying 1 file to /tmp/20110201031617188/WEB-INF/classes Copying 1 file to /tmp/20110201031617188/WEB-INF/classes Copying 47 files to /usr/local/[REDACTED]/liferay/liferay-portal-6.0.5/jboss-5.1.0/server/default/deploy/richfaces-sun-jsf1.2-facelets-portlet.war Copying 1 file to /usr/local/[REDACTED]/liferay/liferay-portal-6.0.5/jboss-5.1.0/server/default/deploy/richfaces-sun-jsf1.2-facelets-portlet.war Deleting directory /tmp/20110201031617188 03:16:20,075 INFO [PortletAutoDeployListener] Portlets for /usr/local/[REDACTED]/liferay/liferay-portal-6.0.5/deploy/richfaces-sun-jsf1.2-facelets-portlet-1.2.war copied successfully. Deployment will start in a few seconds. 03:16:23,632 INFO [TomcatDeployment] deploy, ctxPath=/richfaces-sun-jsf1.2-facelets-portlet 03:16:24,446 INFO [PortletHotDeployListener] Registering portlets for richfaces-sun-jsf1.2-facelets-portlet 03:16:24,492 INFO [faces] Init GenericFacesPortlet for portlet 1 03:16:24,495 INFO [faces] Bridge class name is org.jboss.portletbridge.AjaxPortletBridge 03:16:24,509 INFO [faces] The bridge does not support doHeaders method 03:16:24,510 INFO [faces] GenericFacesPortlet for portlet 1 initialized 03:16:24,555 INFO [PortletHotDeployListener] 1 portlet for richfaces-sun-jsf1.2-facelets-portlet is available for use 03:16:24,627 SEVERE [webapp] Initialization of the JSF runtime either failed or did not occurr. Review the server''s log for details. 
java.lang.InstantiationException: org.jboss.portletbridge.context.FacesContextFactoryImpl at java.lang.Class.newInstance0(Class.java:340) at java.lang.Class.newInstance(Class.java:308) at javax.faces.FactoryFinder.getImplGivenPreviousImpl(FactoryFinder.java:537) at javax.faces.FactoryFinder.getImplementationInstance(FactoryFinder.java:394) at javax.faces.FactoryFinder.access$400(FactoryFinder.java:135) at javax.faces.FactoryFinder$FactoryManager.getFactory(FactoryFinder.java:717) at javax.faces.FactoryFinder.getFactory(FactoryFinder.java:239) at javax.faces.webapp.FacesServlet.init(FacesServlet.java:164) at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1048) at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:950) at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4122) at org.apache.catalina.core.StandardContext.start(StandardContext.java:4421) at org.jboss.web.tomcat.service.deployers.TomcatDeployment.performDeployInternal(TomcatDeployment.java:310) at org.jboss.web.tomcat.service.deployers.TomcatDeployment.performDeploy(TomcatDeployment.java:142) at org.jboss.web.deployers.AbstractWarDeployment.start(AbstractWarDeployment.java:461) at org.jboss.web.deployers.WebModule.startModule(WebModule.java:118) at org.jboss.web.deployers.WebModule.start(WebModule.java:97) at sun.reflect.GeneratedMethodAccessor286.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.jboss.mx.interceptor.ReflectedDispatcher.invoke(ReflectedDispatcher.java:157) at org.jboss.mx.server.Invocation.dispatch(Invocation.java:96) at org.jboss.mx.server.Invocation.invoke(Invocation.java:88) at org.jboss.mx.server.AbstractMBeanInvoker.invoke(AbstractMBeanInvoker.java:264) at org.jboss.mx.server.MBeanServerImpl.invoke(MBeanServerImpl.java:668) at org.jboss.system.microcontainer.ServiceProxy.invoke(ServiceProxy.java:206) at $Proxy38.start(Unknown Source) at org.jboss.system.microcontainer.StartStopLifecycleAction.installAction(StartStopLifecycleAction.java:42) at org.jboss.system.microcontainer.StartStopLifecycleAction.installAction(StartStopLifecycleAction.java:37) at org.jboss.dependency.plugins.action.SimpleControllerContextAction.simpleInstallAction(SimpleControllerContextAction.java:62) at org.jboss.dependency.plugins.action.AccessControllerContextAction.install(AccessControllerContextAction.java:71) at org.jboss.dependency.plugins.AbstractControllerContextActions.install(AbstractControllerContextActions.java:51) at org.jboss.dependency.plugins.AbstractControllerContext.install(AbstractControllerContext.java:348) at org.jboss.system.microcontainer.ServiceControllerContext.install(ServiceControllerContext.java:286) at org.jboss.dependency.plugins.AbstractController.install(AbstractController.java:1631) at org.jboss.dependency.plugins.AbstractController.incrementState(AbstractController.java:934) at org.jboss.dependency.plugins.AbstractController.resolveContexts(AbstractController.java:1082) at org.jboss.dependency.plugins.AbstractController.resolveContexts(AbstractController.java:984) at org.jboss.dependency.plugins.AbstractController.change(AbstractController.java:822) at org.jboss.dependency.plugins.AbstractController.change(AbstractController.java:553) at org.jboss.system.ServiceController.doChange(ServiceController.java:688) at org.jboss.system.ServiceController.start(ServiceController.java:460) at 
org.jboss.system.deployers.ServiceDeployer.start(ServiceDeployer.java:163) at org.jboss.system.deployers.ServiceDeployer.deploy(ServiceDeployer.java:99) at org.jboss.system.deployers.ServiceDeployer.deploy(ServiceDeployer.java:46) at org.jboss.deployers.spi.deployer.helpers.AbstractSimpleRealDeployer.internalDeploy(AbstractSimpleRealDeployer.java:62) at org.jboss.deployers.spi.deployer.helpers.AbstractRealDeployer.deploy(AbstractRealDeployer.java:50) at org.jboss.deployers.plugins.deployers.DeployerWrapper.deploy(DeployerWrapper.java:171) at org.jboss.deployers.plugins.deployers.DeployersImpl.doDeploy(DeployersImpl.java:1439) at org.jboss.deployers.plugins.deployers.DeployersImpl.doInstallParentFirst(DeployersImpl.java:1157) at org.jboss.deployers.plugins.deployers.DeployersImpl.doInstallParentFirst(DeployersImpl.java:1178) at org.jboss.deployers.plugins.deployers.DeployersImpl.install(DeployersImpl.java:1098) at org.jboss.dependency.plugins.AbstractControllerContext.install(AbstractControllerContext.java:348) at org.jboss.dependency.plugins.AbstractController.install(AbstractController.java:1631) at org.jboss.dependency.plugins.AbstractController.incrementState(AbstractController.java:934) at org.jboss.dependency.plugins.AbstractController.resolveContexts(AbstractController.java:1082) at org.jboss.dependency.plugins.AbstractController.resolveContexts(AbstractController.java:984) at org.jboss.dependency.plugins.AbstractController.change(AbstractController.java:822) at org.jboss.dependency.plugins.AbstractController.change(AbstractController.java:553) at org.jboss.deployers.plugins.deployers.DeployersImpl.process(DeployersImpl.java:781) at org.jboss.deployers.plugins.main.MainDeployerImpl.process(MainDeployerImpl.java:702) at org.jboss.system.server.profileservice.repository.MainDeployerAdapter.process(MainDeployerAdapter.java:117) at org.jboss.system.server.profileservice.hotdeploy.HDScanner.scan(HDScanner.java:362) at org.jboss.system.server.profileservice.hotdeploy.HDScanner.run(HDScanner.java:255) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441) at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317) at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:181) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:205) at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:885) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:907) at java.lang.Thread.run(Thread.java:619) 03:16:24,629 INFO [2-facelets-portlet]] Marking servlet FacesServlet as unavailable 03:16:24,630 ERROR [2-facelets-portlet]] Servlet /richfaces-sun-jsf1.2-facelets-portlet threw load() exception javax.servlet.UnavailableException: Initialization of the JSF runtime either failed or did not occurr. Review the server''s log for details. 
at javax.faces.webapp.FacesServlet.init(FacesServlet.java:172) at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1048) at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:950) at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4122) at org.apache.catalina.core.StandardContext.start(StandardContext.java:4421) at org.jboss.web.tomcat.service.deployers.TomcatDeployment.performDeployInternal(TomcatDeployment.java:310) at org.jboss.web.tomcat.service.deployers.TomcatDeployment.performDeploy(TomcatDeployment.java:142) at org.jboss.web.deployers.AbstractWarDeployment.start(AbstractWarDeployment.java:461) at org.jboss.web.deployers.WebModule.startModule(WebModule.java:118) at org.jboss.web.deployers.WebModule.start(WebModule.java:97) at sun.reflect.GeneratedMethodAccessor286.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.jboss.mx.interceptor.ReflectedDispatcher.invoke(ReflectedDispatcher.java:157) at org.jboss.mx.server.Invocation.dispatch(Invocation.java:96) at org.jboss.mx.server.Invocation.invoke(Invocation.java:88) at org.jboss.mx.server.AbstractMBeanInvoker.invoke(AbstractMBeanInvoker.java:264) at org.jboss.mx.server.MBeanServerImpl.invoke(MBeanServerImpl.java:668) at org.jboss.system.microcontainer.ServiceProxy.invoke(ServiceProxy.java:206) at $Proxy38.start(Unknown Source) at org.jboss.system.microcontainer.StartStopLifecycleAction.installAction(StartStopLifecycleAction.java:42) at org.jboss.system.microcontainer.StartStopLifecycleAction.installAction(StartStopLifecycleAction.java:37) at org.jboss.dependency.plugins.action.SimpleControllerContextAction.simpleInstallAction(SimpleControllerContextAction.java:62) at org.jboss.dependency.plugins.action.AccessControllerContextAction.install(AccessControllerContextAction.java:71) at org.jboss.dependency.plugins.AbstractControllerContextActions.install(AbstractControllerContextActions.java:51) at org.jboss.dependency.plugins.AbstractControllerContext.install(AbstractControllerContext.java:348) at org.jboss.system.microcontainer.ServiceControllerContext.install(ServiceControllerContext.java:286) at org.jboss.dependency.plugins.AbstractController.install(AbstractController.java:1631) at org.jboss.dependency.plugins.AbstractController.incrementState(AbstractController.java:934) at org.jboss.dependency.plugins.AbstractController.resolveContexts(AbstractController.java:1082) at org.jboss.dependency.plugins.AbstractController.resolveContexts(AbstractController.java:984) at org.jboss.dependency.plugins.AbstractController.change(AbstractController.java:822) at org.jboss.dependency.plugins.AbstractController.change(AbstractController.java:553) at org.jboss.system.ServiceController.doChange(ServiceController.java:688) at org.jboss.system.ServiceController.start(ServiceController.java:460) at org.jboss.system.deployers.ServiceDeployer.start(ServiceDeployer.java:163) at org.jboss.system.deployers.ServiceDeployer.deploy(ServiceDeployer.java:99) at org.jboss.system.deployers.ServiceDeployer.deploy(ServiceDeployer.java:46) at org.jboss.deployers.spi.deployer.helpers.AbstractSimpleRealDeployer.internalDeploy(AbstractSimpleRealDeployer.java:62) at org.jboss.deployers.spi.deployer.helpers.AbstractRealDeployer.deploy(AbstractRealDeployer.java:50) at org.jboss.deployers.plugins.deployers.DeployerWrapper.deploy(DeployerWrapper.java:171) at 
org.jboss.deployers.plugins.deployers.DeployersImpl.doDeploy(DeployersImpl.java:1439) at org.jboss.deployers.plugins.deployers.DeployersImpl.doInstallParentFirst(DeployersImpl.java:1157) at org.jboss.deployers.plugins.deployers.DeployersImpl.doInstallParentFirst(DeployersImpl.java:1178) at org.jboss.deployers.plugins.deployers.DeployersImpl.install(DeployersImpl.java:1098) at org.jboss.dependency.plugins.AbstractControllerContext.install(AbstractControllerContext.java:348) at org.jboss.dependency.plugins.AbstractController.install(AbstractController.java:1631) at org.jboss.dependency.plugins.AbstractController.incrementState(AbstractController.java:934) at org.jboss.dependency.plugins.AbstractController.resolveContexts(AbstractController.java:1082) at org.jboss.dependency.plugins.AbstractController.resolveContexts(AbstractController.java:984) at org.jboss.dependency.plugins.AbstractController.change(AbstractController.java:822) at org.jboss.dependency.plugins.AbstractController.change(AbstractController.java:553) at org.jboss.deployers.plugins.deployers.DeployersImpl.process(DeployersImpl.java:781) at org.jboss.deployers.plugins.main.MainDeployerImpl.process(MainDeployerImpl.java:702) at org.jboss.system.server.profileservice.repository.MainDeployerAdapter.process(MainDeployerAdapter.java:117) at org.jboss.system.server.profileservice.hotdeploy.HDScanner.scan(HDScanner.java:362) at org.jboss.system.server.profileservice.hotdeploy.HDScanner.run(HDScanner.java:255) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441) at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317) at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:181) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:205) at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:885) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:907) at java.lang.Thread.run(Thread.java:619)

    Read the article

  • Ado.net performance:What does SNIReadSync do?

    - by Beatles1692
    We have a query that takes 2 seconds to run in SQL Server Management Studio, but takes 13 seconds to be shown on a client screen. I used dotTrace to profile my source code and noticed that the SNIReadSync method (part of the ADO.NET assemblies) takes a lot of time to do its job (9 seconds). I ran my code on the server itself so I could rule out network effects, and the result was the same. It doesn't matter whether I'm using an OleDbConnection or a SqlConnection, or whether I'm using a DataReader or a DataSet. Connection pooling does not solve this issue (as my results show). I googled this issue and couldn't find an answer to what this method is actually doing and how we can improve it. Here's what I found on Stack Overflow, which isn't helpful either: http://stackoverflow.com/questions/1610874/snireadsync-executing-between-120-500-ms-for-a-simple-query-what-do-i-look-for
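    As a first diagnostic step (a sketch only, with a placeholder connection string and query), it can help to time the ExecuteReader call separately from the loop that drains the reader: SNIReadSync is the managed wrapper around the native SQL Server network read, so time spent in it is typically the client waiting on the server/network for result packets rather than CPU work inside ADO.NET. Splitting the measurement shows whether the cost is in producing the first rows or in streaming and materializing the rest.

        using System;
        using System.Data.SqlClient;
        using System.Diagnostics;

        class QueryTiming
        {
            static void Main()
            {
                const string connectionString = "Server=.;Database=MyDb;Integrated Security=true"; // placeholder
                const string sql = "SELECT * FROM MyLargeTable";                                    // placeholder

                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand(sql, conn))
                {
                    conn.Open();

                    var sw = Stopwatch.StartNew();
                    using (SqlDataReader reader = cmd.ExecuteReader())
                    {
                        Console.WriteLine("Time to first result set: " + sw.ElapsedMilliseconds + " ms");

                        int rows = 0;
                        while (reader.Read())
                        {
                            rows++; // just drain the rows; materializing a DataSet would add on top of this
                        }
                        Console.WriteLine("Rows: " + rows + ", total elapsed: " + sw.ElapsedMilliseconds + " ms");
                    }
                }
            }
        }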

    Read the article

  • Automapper use in a MVVM application

    - by Echiban
    I am building an MVVM application. The model/entity (I am using NHibernate) is already done, and I am thinking of using AutoMapper to map between the ViewModel and the Model. However, this clause (from http://www.lostechies.com/blogs/jimmy_bogard/archive/2009/01/22/automapper-the-object-object-mapper.aspx) scares the jebus out of me:

        "AutoMapper enforces that for each type map (source/destination pair), all of the properties on the destination type are matched up with something on the source type."

    To me, the logical choice is to map from Model to ViewModel (and I'll let the ViewModel manually assign back to the Model), but the quote basically kills the idea, since the ViewModel will definitely have properties that don't exist on the Model. How have you been using AutoMapper in an MVVM app? Please help!
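    For what it's worth, the quoted rule only bites when AutoMapper's configuration validation runs, and individual destination members can be opted out with Ignore(), which makes the Model-to-ViewModel direction workable even when the ViewModel carries extra UI-only properties. A sketch using the static API of that era; Customer and CustomerViewModel are made-up types, not anything from the question.

        using AutoMapper;

        // Hypothetical model and view model, purely for illustration.
        public class Customer
        {
            public int Id { get; set; }
            public string Name { get; set; }
        }

        public class CustomerViewModel
        {
            public int Id { get; set; }
            public string Name { get; set; }
            public bool IsSelected { get; set; }   // UI-only state, no counterpart on the model
        }

        public static class MappingConfig
        {
            public static void Configure()
            {
                Mapper.CreateMap<Customer, CustomerViewModel>()
                      .ForMember(vm => vm.IsSelected, opt => opt.Ignore()); // opt this member out of the "must match" rule

                // Throws if any non-ignored destination member is left unmapped.
                Mapper.AssertConfigurationIsValid();
            }
        }

        // Usage: var vm = Mapper.Map<Customer, CustomerViewModel>(customer);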

    Read the article

  • libevent, windows and .NET programming

    - by Chris
    I experiment with a lot of open source software and I've noticed a fair amount of server type applications in the open source world use libevent to facilitate event-based processing rather than spawn multiple threads to handle requests. I also do a lot of .NET programming (it's my core job function), and I'm interested in learning how libevent relates to the .NET event model. Are events in .NET the equivalent of libevent for C programs? Should I try to learn libevent and attempt to use it in custom .NET server applications or is using the standard .NET event model basically the same thing?
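    They are not really equivalent: libevent is an event loop that notifies you when sockets are ready, while .NET events are multicast delegates at the language level. The closer .NET counterpart for a server is the asynchronous socket API, where the framework's I/O completion handling plays the role libevent plays in C. A minimal accept-and-read sketch along those lines (the port number is arbitrary):

        using System;
        using System.Net;
        using System.Net.Sockets;
        using System.Text;

        class AsyncEchoServer
        {
            static void Main()
            {
                var listener = new TcpListener(IPAddress.Any, 9000); // arbitrary port
                listener.Start();
                listener.BeginAcceptTcpClient(OnAccept, listener);   // callback fires when a client connects
                Console.WriteLine("Listening... press Enter to quit.");
                Console.ReadLine();
                listener.Stop();
            }

            static void OnAccept(IAsyncResult ar)
            {
                var listener = (TcpListener)ar.AsyncState;
                TcpClient client;
                try
                {
                    client = listener.EndAcceptTcpClient(ar);
                }
                catch (ObjectDisposedException)
                {
                    return; // listener was stopped
                }
                listener.BeginAcceptTcpClient(OnAccept, listener);   // keep accepting without a thread per connection

                var stream = client.GetStream();
                var buffer = new byte[1024];
                stream.BeginRead(buffer, 0, buffer.Length, readResult =>
                {
                    int n = stream.EndRead(readResult);
                    Console.WriteLine("Received: " + Encoding.UTF8.GetString(buffer, 0, n));
                    client.Close();
                }, null);
            }
        }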

    Read the article

  • Selenium Webdriver (FirefoxDriver)

    - by Samareri
    Hey all, I'm using Selenium Webdriver to do some robottesting. Since some functions appear to only work in Firefox, I'm obligated to use Firefoxdriver. Now and then, something weird happens. Starting up te driver driver = new FirefoxDriver(); driver.get(URL); gets firefox to startup but not to go to the specified url. The strange thing is that it works on another computer with the same preferences set in Firefox. I solved this problem once by changing to another version of firefox, but this time this doesn't do the trick for me, it did however worked for the other developers. Yes, the error started for all developers on the same time, same day... My first question is: is it a firefox problem or Webdriver problem. Second question: how is it possible that it works on other pc's? Any help would be very appreciated Thanks Error: org.openqa.selenium.WebDriverException: Could not parse "". System info: os.name: 'Windows XP', os.arch: 'x86', os.version: '5.1', java.version: '1.6.0_18' Driver info: driver.version: firefox at org.openqa.selenium.firefox.Response.<init>(Response.java:53) at org.openqa.selenium.firefox.internal.AbstractExtensionConnection.nextResponse(AbstractExtensionConnection.java:258) at org.openqa.selenium.firefox.internal.AbstractExtensionConnection.readLoop(AbstractExtensionConnection.java:220) at org.openqa.selenium.firefox.internal.AbstractExtensionConnection.waitForResponseFor(AbstractExtensionConnection.java:213) at org.openqa.selenium.firefox.internal.AbstractExtensionConnection.sendMessageAndWaitForResponse(AbstractExtensionConnection.java:162) at org.openqa.selenium.firefox.FirefoxDriver.executeCommand(FirefoxDriver.java:329) at org.openqa.selenium.firefox.FirefoxDriver.sendMessage(FirefoxDriver.java:312) at org.openqa.selenium.firefox.FirefoxDriver.sendMessage(FirefoxDriver.java:308) at org.openqa.selenium.firefox.FirefoxDriver.fixId(FirefoxDriver.java:350) at org.openqa.selenium.firefox.FirefoxDriver.<init>(FirefoxDriver.java:130) at org.openqa.selenium.firefox.FirefoxDriver.<init>(FirefoxDriver.java:109) at be.....MMCRobotTest.login(MMCRobotTest.java:98) at be.....MMCRobotTestAttribute.testNewAttribute(MMCRobotTestAttribute.java:12) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at junit.framework.TestCase.runTest(TestCase.java:164) at junit.framework.TestCase.runBare(TestCase.java:130) at junit.framework.TestResult$1.protect(TestResult.java:106) at junit.framework.TestResult.runProtected(TestResult.java:124) at junit.framework.TestResult.run(TestResult.java:109) at junit.framework.TestCase.run(TestCase.java:120) at junit.framework.TestSuite.runTest(TestSuite.java:230) at junit.framework.TestSuite.run(TestSuite.java:225) at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130) at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38) at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467) at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683) at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390) at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197) Caused by: org.json.JSONException: A JSONObject text must begin with '{' at character 0 at 
org.json.JSONTokener.syntaxError(JSONTokener.java:496) at org.json.JSONObject.<init>(JSONObject.java:180) at org.json.JSONObject.<init>(JSONObject.java:403) at org.openqa.selenium.firefox.Response.<init>(Response.java:41) ... 30 more

    Read the article

  • WPF RowDetailsTemplate width issue

    - by Ed Courtenay
    Apologies if this is a dupe, but I can't seem to find a rational solution for what must be a fairly simple issue. <Window x:Class="FeedTest.MainWindow" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" Title="MainWindow" Height="350" Width="525"> <Window.Resources> <XmlNamespaceMappingCollection x:Key="map"> <XmlNamespaceMapping Prefix="media" Uri="http://search.yahoo.com/mrss/" /> </XmlNamespaceMappingCollection> <XmlDataProvider x:Key="newsFeed" XPath="//item[string-length(title)>0]" Source="http://newsrss.bbc.co.uk/rss/newsonline_uk_edition/uk/rss.xml" /> <DataTemplate x:Key="rowDetailTemplate"> <Border BorderThickness="2"> <StackPanel Orientation="Horizontal"> <Image Source="{Binding XPath=media:thumbnail/@url}" Width="66" Height="49" /> <StackPanel Orientation="Vertical" Margin="5"> <TextBlock Text="{Binding XPath=description}" TextWrapping="Wrap" /> </StackPanel> </StackPanel> </Border> </DataTemplate> <Style TargetType="{x:Type DataGrid}"> <Setter Property="GridLinesVisibility" Value="None" /> </Style> </Window.Resources> <Grid Binding.XmlNamespaceManager="{StaticResource map}"> <DataGrid AutoGenerateColumns="False" ItemsSource="{Binding Source={StaticResource newsFeed}}" RowDetailsVisibilityMode="VisibleWhenSelected" RowDetailsTemplate="{StaticResource rowDetailTemplate}"> <DataGrid.Columns> <DataGridTextColumn Header="Title" Binding="{Binding XPath=title}" MinWidth="150" Width="*" /> </DataGrid.Columns> </DataGrid> </Grid> The attached XAML gets a news feed, and displays the title of each item in a DataGrid. Selecting an item shows the RowDetailsTemplate which is where my problem lies - why does the RowDetailsTemplate expand beyond the width of the containing DataGrid (thus forcing a horizontal scrollbar), and more importantly, how do I stop it doing this? Many thanks.

    Read the article

  • The new workflow management of Oracle's Hyperion Planning: Define more details with Planning Unit Hierarchies and Promotional Paths

    - by Alexandra Georgescu
    After having been almost unchanged for several years, starting with the 11.1.2 release of Oracle´s Hyperion Planning the Process Management has not only got a new name: “Approvals” now is offering the possibility to further split Planning Units (comprised of a unique Scenario-Version-Entity combination) into more detailed combinations along additional secondary dimensions, a so called Planning Unit Hierarchy, and also to pre-define a path of planners, reviewers and approvers, called Promotional Path. I´d like to introduce you to changes and enhancements in this new process management and arouse your curiosity for checking out more details on it. One reason of using the former process management in Planning was to limit data entry rights to one person at a time based on the assignment of a planning unit. So the lowest level of granularity for this assignment was, for a given Scenario-Version combination, the individual entity. Even if in many cases one person wasn´t responsible for all data being entered into that entity, but for only part of it, it was not possible to split the ownership along another additional dimension, for example by assigning ownership to different accounts at the same time. By defining a so called Planning Unit Hierarchy (PUH) in Approvals this gap is now closed. Complementing new Shared Services roles for Planning have been created in order to manage set up and use of Approvals: The Approvals Administrator consisting of the following roles: Approvals Ownership Assigner, who assigns owners and reviewers to planning units for which Write access is assigned (including Planner responsibilities). Approvals Supervisor, who stops and starts planning units and takes any action on planning units for which Write access is assigned. Approvals Process Designer, who can modify planning unit hierarchy secondary dimensions and entity members for which Write access is assigned, can also modify scenarios and versions that are assigned to planning unit hierarchies and can edit validation rules on data forms for which access is assigned. (this includes as well Planner and Ownership Assigner responsibilities) Set up of a Planning Unit Hierarchy is done under the Administration menu, by selecting Approvals, then Planning Unit Hierarchy. Here you create new PUH´s or edit existing ones. The following window displays: After providing a name and an optional description, a pre-selection of entities can be made for which the PUH will be defined. Available options are: All, which pre-selects all entities to be included for the definitions on the subsequent tabs None, manual entity selections will be made subsequently Custom, which offers the selection for an ancestor and the relative generations, that should be included for further definitions. Finally a pattern needs to be selected, which will determine the general flow of ownership: Free-form, uses the flow/assignment of ownerships according to Planning releases prior to 11.1.2 In Bottom-up, data input is done at the leaf member level. Ownership follows the hierarchy of approval along the entity dimension, including refinements using a secondary dimension in the PUH, amended by defined additional reviewers in the promotional path. Distributed, uses data input at the leaf level, while ownership starts at the top level and then is distributed down the organizational hierarchy (entities). After ownership reaches the lower levels, budgets are submitted back to the top through the approval process. 
Proceeding to the next step, now a secondary dimension and the respective members from that dimension might be selected, in order to create more detailed combinations underneath each entity. After selecting the Dimension and a Parent Member, the definition of a Relative Generation below this member assists in populating the field for Selected Members, while the Count column shows the number of selected members. For refining this list, you might click on the icon right beside the selected member field and use the check-boxes in the appearing list for deselecting members. -------------------------------------------------------------------------------------------------------- TIP: In order to reduce maintenance of the PUH due to changes in the dimensions included (members added, moved or removed) you should consider to dynamically link those dimensions in the PUH with the dimension hierarchies in the planning application. For secondary dimensions this is done using the check-boxes in the Auto Include column. For the primary dimension, the respective selection criteria is applied by right-clicking the name of an entity activated as planning unit, then selecting an item of the shown list of include or exclude options (children, descendants, etc.). Anyway in order to apply dimension changes impacting the PUH a synchronization must be run. If this is really necessary or not is shown on the first screen after selecting from the menu Administration, then Approvals, then Planning Unit Hierarchy: under Synchronized you find the statuses Yes, No or Locked, where the last one indicates, that another user is just changing or synchronizing the PUH. Select one of the not synchronized PUH´s (status No) and click the Synchronize option in order to execute. -------------------------------------------------------------------------------------------------------- In the next step owners and reviewers are assigned to the PUH. Using the icons with the magnifying glass right besides the columns for Owner and Reviewer the respective assignments can be made in the ordermthat you want them to review the planning unit. While it is possible to assign only one owner per entity or combination of entity+ member of the secondary dimension, the selection for reviewers might consist of more than one person. The complete Promotional Path, including the defined owners and reviewers for the entity parents, can be shown by clicking the icon. In addition optional users might be defined for being notified about promotions for a planning unit. -------------------------------------------------------------------------------------------------------- TIP: Reviewers cannot change data, but can only review data according to their data access permissions and reject or promote planning units. -------------------------------------------------------------------------------------------------------- In order to complete your PUH definitions click Finish - this saves the PUH and closes the window. As a final step, before starting the approvals process, you need to assign the PUH to the Scenario-Version combination for which it should be used. From the Administration menu select Approvals, then Scenario and Version Assignment. Expand the PUH in order to see already existing assignments. Under Actions click the add icon and select scenarios and versions to be assigned. If needed, click the remove icon in order to delete entries. After these steps, set up is completed for starting the approvals process. 
Start, stop and control of the approvals process is now done under the Tools menu, and then Manage Approvals. The new PUH feature is complemented by various additional settings and features; at least some of them should be mentioned here:

    - Export/import of PUHs
    - Out of Office agent
    - Validation rules changing the promotional/approval path if violated (including the use of User-Defined Attributes (UDAs))
    - Various new and helpful reviewer actions with corresponding approval states

    About the Author: Bernhard Kinkel started working for Hyperion Solutions as a Presales Consultant and Consultant in 1998 and moved to Hyperion Education Services in 1999. He joined Oracle University in 2007, where he is a Principal Education Consultant. Based on these many years of working with Hyperion products, he has detailed product knowledge across several versions. He delivers both classroom and live virtual courses. His areas of expertise are Oracle/Hyperion Essbase, Oracle Hyperion Planning and Hyperion Web Analysis.

    Read the article

  • Reading EventLog C# Errors

    - by Robert
    I have this code in my ASP.NET application, written in C#, that is trying to read the event log, but it returns an error.

        EventLog aLog = new EventLog();
        aLog.Log = "Application";
        aLog.MachineName = "."; // Local machine

        foreach (EventLogEntry entry in aLog.Entries)
        {
            if (entry.Source.Equals("tvNZB"))
                Label_log.Text += "<p>" + entry.Message;
        }

    One of the entries it returns is "The description for Event ID '0' in Source 'tvNZB' cannot be found. The local computer may not have the necessary registry information or message DLL files to display the message, or you may not have permission to access them. The following information is part of the event: 'Service started successfully.'" I only want the 'Service started successfully' part. Any ideas?
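    For context: that wrapper text is what EventLogEntry.Message produces when the event source has no message DLL registered on the machine; the raw insertion strings written by the service are still exposed through ReplacementStrings. A small sketch along those lines (the helper class and its name are illustrative, not part of the question):

        using System.Diagnostics;
        using System.Text;

        public static class EventLogReader
        {
            // Returns the raw event text for entries from the given source.
            public static string ReadMessages(string sourceName)
            {
                var sb = new StringBuilder();
                using (var aLog = new EventLog("Application", "."))
                {
                    foreach (EventLogEntry entry in aLog.Entries)
                    {
                        if (!entry.Source.Equals(sourceName))
                            continue;

                        // With no message DLL registered for the source, Message wraps the text in the
                        // "description ... cannot be found" boilerplate; the raw insertion strings are
                        // still available via ReplacementStrings.
                        string text = entry.ReplacementStrings.Length > 0
                            ? string.Join(" ", entry.ReplacementStrings)
                            : entry.Message;

                        sb.Append("<p>").Append(text);
                    }
                }
                return sb.ToString();
            }
        }

        // Usage in the page: Label_log.Text += EventLogReader.ReadMessages("tvNZB");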

    Read the article

  • Use Dojo Drag and Drop together with Dojo Moveable

    - by Select0r
    Hi, I'm using Dojo.dnd to transfer items between two areas. The problem is: the items will snap into place once I drop them, but I'd like to have them stay where I drop them, but only for one area. Here's a little code to explain this better:

        <div id="dropZone" class="dropZone">
            <div id="itemNodes"></div>
            <div id="targetZone" dojoType="dojo.dnd.Source"></div>
        </div>

    "dropZone" is a DIV that contains two dojo.dnd.Source areas, "itemNodes" (created programmatically) and "targetZone". Items (DIVs with images) should be dragged from a simple list out of "itemNodes" into "targetZone" and stay where they are dropped. As soon as they are dragged out of "targetZone" they should snap back to the list inside "itemNodes". Here's the code I use to create the items:

        var nodelist = new dojo.dnd.Source("itemNodes");
        {Smarty-Loop}
        nodelist.insertNodes(false, ['<img class="dragItem" src="{$items->info.itemtext}" alt="{$items->info.itemtext}" border="0" />']);
        {/Smarty-Loop}

    But this way I just have two lists of items; the items dropped into "targetZone" won't stay where I dropped them. I've tried a loop dojo.query(".dojoDndItem").forEach(function(node) to grab all items and change them to a "moveable" type:

    - using dojo.dnd.move.constrainedMoveable changes the items so they can always be moved around (even in "itemNodes")
    - using dojo.dnd.move.boxConstrainedMoveable and defining the "box" as the borders of "targetZone" makes it possible to move the items around only inside "targetZone", but as soon as I drop them, I can't grab and move them back out

    So here's the question: is it possible to create two dnd.Sources where I can move items back and forth and let the items be "moveable" only in one of the sources? Or is there a workaround, like making the items moveable and, if they're not dropped into "targetZone", having them moved back to the list in "itemNodes" automatically? Once the page is submitted, I have to save the position of every item that has been placed into "targetZone". (The next step will be positioning the items inside "targetZone" on page load if the grid has already been filled before, but I'd be happy to just get the thing working in the first place.) Any hint is appreciated. Greetings, Select0r

    Read the article

  • Windows Azure Evolution – Welcome to VS2012

    - by Shaun
    When Microsoft released the first preview versions of Windows 8 and Visual Studio, many people in the community were asking whether the Windows Azure tools were available for them. The answer was “NO”. Microsoft said that the Windows Azure tools would only support Visual Studio 2010, but that they would work once Visual Studio 2012 was finally released. Now, along with the newly published Windows Azure platform, we have the latest Windows Azure SDK 1.7, which is compatible with Visual Studio 2012 RC.   You can retrieve the latest version of the Windows Azure SDK through the Web Platform Installer, which I think is the easiest and simplest way to download and install it, since besides the SDK itself it also needs some other components. To download the latest Windows Azure SDK from the Web Platform Installer, just go to the Windows Azure website, navigate to Develop, .NET, and click the blue “install” button. Then you need to select which version of Visual Studio you want to use, Visual Studio 2010 or Visual Studio 2012 RC. After selecting the version you will download an EXE file. This file will lead you to install Web Platform Installer 4.0 (if you haven't installed it yet) and the latest Windows Azure SDK. You can see the version name is June 2012, 1.7. Finally the WebPI will detect the dependent components you need, download them and begin the installation. But if you want to challenge yourself you can download the components and install them manually. The standalone installations are listed on this page with instructions on how to install them and the necessary prerequisites.   Once you have finished the installation you can open Visual Studio 2012 RC which, as usual, needs to be run as administrator. If you click the New Project link on the start page and navigate to the Cloud category, you will find that there is no project template available. Is something wrong? If you change the target framework from the default .NET 4.5 to .NET 4, you will see the azure project template. This is because the Windows Azure instances currently do not support .NET 4.5. After clicking OK you will see the role creation window, which is similar to what you have seen before, but with some new role templates in this SDK. Firstly, an ASP.NET MVC 4 web role is available, which means you can create ASP.NET MVC 4 applications for internet, intranet, mobile and Web API on the cloud. Then there are two new worker role templates, “Cache Worker Role” and “Worker Role with Service Bus Queue”. “Worker Role with Service Bus Queue” is a worker role with the necessary references added to access a Windows Azure Service Bus queue. It also has some basic sample code in the worker role class which reads messages from the queue when started. The “Cache Worker Role” is a worker role with the in-memory distributed cache feature enabled by default. This feature is different from Windows Azure Caching: it allows the role instances to use their memory as an in-memory distributed cache cluster. Using this feature you can have one or more worker roles act as a dedicated cache cluster. Alternatively, you can dedicate part of your web roles' and worker roles' memory to the cache cluster as well. Let's just create an ASP.NET MVC 4 Web Role, and click F5 to run it under the local emulator. If you have been working with Azure for a while you will know that I need to set up the local storage emulator before running locally on a fresh Azure SDK installation. 
But in this version, when we start our Azure project, Visual Studio checks whether the storage emulator has been initialized. If not, it runs the initializer automatically. As you can see, in this version the storage emulator relies on the SQL Server 2012 LocalDB feature. It will create the emulator database and tables in the default local database. You can set the storage emulator to use a standard SQL Server default instance by using the command “dsinit /instance:.”. The “dsinit” tool is now located at %PROGRAM FILES%\Microsoft SDKs\Windows Azure\Emulator\devstore. After Visual Studio has compiled and deployed the package our website should be shown in the browser. This is the MVC 4 web role home page on my Windows 8 machine in IE10. Another thing you might notice is that in this version the compute emulator uses IIS Express to host the web roles instead of the full IIS. You can add breakpoints in the code and debug, and you can use the local storage emulator to test your code that accesses the storage service. All of this works the same as on SDK 1.6. You can switch to using IIS to run your web role in the local emulator: just open the Windows Azure project property window and, on the Web page, select “Use IIS Web Server”. For more information about this please have a look at Nuno's blog post. The role property page in Visual Studio has no massive changes. You can configure your role settings such as endpoints, certificates, local storage, etc. One thing that was added is the Caching tab, where you can specify whether to enable the caching feature and how much memory you want to use for the cache cluster. I will go into more detail about it in future posts. The publish and package features are also unchanged. You can publish your project to Azure directly through Visual Studio 2012, or you can create the package and upload it manually. Below is the SDK version of my deployment, which is 1.7.30602.1703 in the developer portal.   Summary: In this post I introduced the new Windows Azure SDK 1.7, especially how it works with the latest Visual Studio 2012 RC. There are no significant changes in the Visual Studio tooling in this version, but there are some small enhancements such as ASP.NET MVC 4, the Cache Worker Role, and the use of SQL Server 2012 LocalDB and IIS Express.   Hope this helps, Shaun. All documents, related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.
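A rough illustration of the “Worker Role with Service Bus Queue” template mentioned above (a hedged sketch, not the exact code the template generates; the queue name and the configuration setting name are assumptions): the generated worker role essentially opens a QueueClient in OnStart and then loops in Run, receiving and completing BrokeredMessages.

    using System.Diagnostics;
    using Microsoft.ServiceBus.Messaging;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class WorkerRole : RoleEntryPoint
    {
        // Illustrative names; the real template reads its own setting name.
        const string QueueName = "ProcessingQueue";
        QueueClient client;

        public override bool OnStart()
        {
            string connectionString =
                CloudConfigurationManager.GetSetting("Microsoft.ServiceBus.ConnectionString");
            client = QueueClient.CreateFromConnectionString(connectionString, QueueName);
            return base.OnStart();
        }

        public override void Run()
        {
            while (true)
            {
                // Receive blocks until a message arrives or the server timeout elapses.
                BrokeredMessage msg = client.Receive();
                if (msg != null)
                {
                    Trace.WriteLine("Processing message " + msg.SequenceNumber);
                    msg.Complete();   // remove the message from the queue
                }
            }
        }
    }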

    Read the article

  • Database design for summarized data

    - by holden
    I have a new table I'm going to add to a bunch of other summarized data, basically to take some of the load off by calculating weekly averages. My question is whether I would be better off with one model over the other: one model with a day-of-week column plus an additional column for the price, or another model with a series of fields, one per day of the week, each holding a price. I'd like to know which would save me in speed and/or headaches, or at least what the trade-off is. I.e. ID OBJECT_ID MON TUE WED THU FRI SAT SUN SOURCE or ID OBJECT_ID DAYOFWEEK PRICE SOURCE

    Read the article

  • nhibernate activerecord linq Contains problem

    - by Robert Ivanc
    Hi, I am having problems with the following query in Castle ActiveRecord 2.12: var q = from o in SodisceFMClientVAR.Queryable where taxnos2.Contains(o.TaxFileNo) select o; taxNos2 is an array of strings. When run I get an exception: + InnerException {"Index was out of range. Must be non-negative and less than the size of the collection.\r\nParameter name: index"} System.Exception {System.ArgumentOutOfRangeException} StackTrace " at Castle.ActiveRecord.ActiveRecordBase.ExecuteQuery(IActiveRecordQuery query)\r\n at Castle.ActiveRecord.Linq.LinqResultWrapper`1.Populate()\r\n at Castle.ActiveRecord.Linq.LinqResultWrapper`1.GetEnumerator()\r\n at NHibernate.Linq.Query`1.GetEnumerator()\r\n at System.Linq.Buffer`1..ctor(IEnumerable`1 source)\r\n at System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)\r\n at prosoft.skb.insolventnostDataAccess.InsolventnostDataAccAR.GetOurUsersListLS(ICollection`1 taxNos) in C:\\svn\\skb\\insolventnostWithAR\\prosoft.skb.insolventnostDataAccess\\InsolventnostDataAR.cs:line 214\r\n at prosoft.skb.insolventnostDataFromWS.InsolventnostFromWS.filterByOurUsers(IEnumerable`1 odprtiPostopki) in C:\\svn\\skb\\insolventnostWithAR\\prosoft.skb.insolventnostDataFromWS\\InsolventnostFromWS.cs:line 237\r\n at prosoft.skb.insolventnostDataFromWS.InsolventnostFromWS.SyncData() in C:\\svn\\skb\\insolventnostWithAR\\prosoft.skb.insolventnostDataFromWS\\InsolventnostFromWS.cs:line 53" string Does Contains even work in linq for nhibernate? I couldn't find anything via google... Is there a workaround? Thanks!
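    If LINQ's Contains keeps failing with this provider, one workaround that is sometimes suggested (a hedged sketch, assuming SodisceFMClientVAR inherits from ActiveRecordBase and the property is mapped as TaxFileNo) is to bypass the LINQ provider and use the criteria API, which turns the collection into a SQL IN clause:

        using NHibernate.Criterion;

        // Criteria-based equivalent of "where taxnos2.Contains(o.TaxFileNo)".
        // Note: if taxnos2 can be empty, guard against that case first,
        // since an empty IN list can also cause problems.
        var result = SodisceFMClientVAR.FindAll(
            Restrictions.In("TaxFileNo", taxnos2));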

    Read the article

  • Problem with Google Analytics for Android : "Dispatcher thinks it finished, but there were 543 failed events"

    - by PHP_Jedi
    Anyone know how to solve this problem? 03-23 13:03:20.585: WARN/googleanalytics(3430): Problem with socket or streams. 03-23 13:03:20.585: WARN/googleanalytics(3430): java.net.SocketException: Broken pipe 03-23 13:03:20.585: WARN/googleanalytics(3430): at org.apache.harmony.luni.platform.OSNetworkSystem.sendStreamImpl(Native Method) 03-23 13:03:20.585: WARN/googleanalytics(3430): at org.apache.harmony.luni.platform.OSNetworkSystem.sendStream(OSNetworkSystem.java:498) 03-23 13:03:20.585: WARN/googleanalytics(3430): at org.apache.harmony.luni.net.PlainSocketImpl.write(PlainSocketImpl.java:585) 03-23 13:03:20.585: WARN/googleanalytics(3430): at org.apache.harmony.luni.net.SocketOutputStream.write(SocketOutputStream.java:59) 03-23 13:03:20.585: WARN/googleanalytics(3430): at org.apache.http.impl.io.AbstractSessionOutputBuffer.flushBuffer(AbstractSessionOutputBuffer.java:87) 03-23 13:03:20.585: WARN/googleanalytics(3430): at org.apache.http.impl.io.AbstractSessionOutputBuffer.flush(AbstractSessionOutputBuffer.java:94) 03-23 13:03:20.585: WARN/googleanalytics(3430): at org.apache.http.impl.AbstractHttpClientConnection.doFlush(AbstractHttpClientConnection.java:168) 03-23 13:03:20.585: WARN/googleanalytics(3430): at org.apache.http.impl.AbstractHttpClientConnection.flush(AbstractHttpClientConnection.java:173) 03-23 13:03:20.585: WARN/googleanalytics(3430): at com.google.android.apps.analytics.PipelinedRequester.sendRequests(Unknown Source) 03-23 13:03:20.585: WARN/googleanalytics(3430): at com.google.android.apps.analytics.NetworkDispatcher$DispatcherThread$AsyncDispatchTask.dispatchSomePendingEvents(Unknown Source) 03-23 13:03:20.585: WARN/googleanalytics(3430): at com.google.android.apps.analytics.NetworkDispatcher$DispatcherThread$AsyncDispatchTask.run(Unknown Source) 03-23 13:03:20.585: WARN/googleanalytics(3430): at android.os.Handler.handleCallback(Handler.java:587) 03-23 13:03:20.585: WARN/googleanalytics(3430): at android.os.Handler.dispatchMessage(Handler.java:92) 03-23 13:03:20.585: WARN/googleanalytics(3430): at android.os.Looper.loop(Looper.java:123) 03-23 13:03:20.585: WARN/googleanalytics(3430): at android.os.HandlerThread.run(HandlerThread.java:60) 03-23 13:03:21.088: WARN/googleanalytics(3430): Dispatcher thinks it finished, but there were 543 failed events Especially the last line explains why so much data is lost, as the dispatcher thinks it is done but has 543 events that were not dispatched... The application has a good internet connection and there is no problem reaching the app's server-side API. I see in Analytics that lots of startups and click-events from the past few days are lost, even though I know the traffic is normal since I can see statistics from the server API. In the Analytics reports I see day-by-day under-reporting, so the problem seems to be spreading/growing to all the devices using this application. I'm wondering why Google does not answer this in their mailing groups - several people have complained about this... well, well... I found this thread relevant: http://stackoverflow.com/questions/682560/java-net-socketexception-broken-pipe But I'm still not sure whether there is anything I can do to fix it. If there is nothing I can do to fix it, I guess it's not my fault that it got broken. But I have a feeling it is, since the problem got dramatically worse on the last deploy to the Android Market. Anyone else with experience of Google Analytics for Android?

    Read the article

  • Overlay WriteableBitmap with color

    - by rajenk
    I'm trying to overlay a WriteableBitmap with a certain color in Silverlight. I have a black and white base image, which I'm using to create smaller WriteableBitmap images for a new composite image, and I want to overlay either the black or white part of the source image cut-out with a certain color before adding it to the composite image. What I'm doing now is: var cutOut = new WriteableBitmap(8, 14); /* cut out the image here */ cutOut.Render(sourceImage, transform); // sourceImage is the base image cutOutImage.Source = cutOut; // cutOutImage is an Image element in XAML compositeImage.Render(cutOutImage, transform2); // compositeImage is the final WriteableBitmap that is shown on screen I tried the methods on http://blogs.silverarcade.com/silverlight-games-101/15/silverlight-blitting-and-blending-with-silverlights-writeablebitmap/ and using the extension methods from http://writeablebitmapex.codeplex.com/, but I cannot seem to get a color overlay on the cutOut image before rendering it to the compositeImage. Does anyone know of a good method to do this? Thanks in advance.
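    One possible approach (a rough sketch under the assumption that recoloring the raw pixels before rendering is acceptable; the near-black threshold is an assumption): a Silverlight WriteableBitmap exposes its ARGB32 pixels through the Pixels array, so you can walk the cut-out and replace only the black (or only the white) pixels with the overlay color, then render it into the composite as before.

        using System.Windows.Media;
        using System.Windows.Media.Imaging;

        static class BitmapOverlay
        {
            // Replaces the (near-)black pixels of the cut-out with the overlay color.
            public static void OverlayBlackPixels(WriteableBitmap cutOut, Color overlay)
            {
                int overlayArgb = (overlay.A << 24) | (overlay.R << 16) | (overlay.G << 8) | overlay.B;

                for (int i = 0; i < cutOut.Pixels.Length; i++)
                {
                    int pixel = cutOut.Pixels[i];
                    byte r = (byte)((pixel >> 16) & 0xFF);
                    byte g = (byte)((pixel >> 8) & 0xFF);
                    byte b = (byte)(pixel & 0xFF);

                    // Treat near-black pixels as the "black" part of the source image.
                    if (r < 16 && g < 16 && b < 16)
                    {
                        cutOut.Pixels[i] = overlayArgb;
                    }
                }
                cutOut.Invalidate();   // push the modified pixels to the rendered surface
            }
        }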

    Read the article

  • Cloud Based Load Testing Using TF Service &amp; VS 2013

    - by Tarun Arora [Microsoft MVP]
    Originally posted on: http://geekswithblogs.net/TarunArora/archive/2013/06/30/cloud-based-load-testing-using-tf-service-amp-vs-2013.aspx One of the new features announced as part of the Visual Studio 2013 Ultimate Preview is ‘Cloud Based Load Testing’. In this blog post I’ll walk you through: what is Cloud Based Load Testing? How have I been using this feature? – a success story! And where can you find more resources on this feature? What is Cloud Based Load Testing? It goes without saying that performance testing your application not only gives you the confidence that the application will work under heavy levels of stress, but also gives you the ability to test how scalable the architecture of your application is. It is important to know how much is too much for your application! Working with various clients in the industry I have realized that the biggest barriers to Load Testing & Performance Testing adoption are: the high infrastructure and administration cost that comes with this phase of testing; the time taken to procure & set up the test infrastructure; and finding a use for this infrastructure investment after completion of testing. Is cloud the answer? 100% Visual Studio compatible; scalable and realistic; start testing in < 2 minutes; intuitive; pay only for what you need; use existing on-premise tests on the cloud. There are a lot of vendors out there offering Cloud Based Load Testing, to name a few: Load Storm, Soasta, Blaze Meter, Blitz and others… The question you may want to ask is, why should you go with Microsoft’s cloud based Load Test offering? If you are a Microsoft shop or already have investments in Microsoft technologies, you’ll see great benefit in the natural integration this offers with existing Microsoft products such as Visual Studio and Windows Azure. For example, your existing web tests authored in Visual Studio 2010 or Visual Studio 2012 will run on the cloud without requiring any modifications whatsoever. Microsoft’s cloud test rig also supports API based testing: for example, if you are building a WPF application which consumes WCF services, you can write unit tests to invoke the WCF service, and these tests can be run on the cloud test rig and loaded with ‘N’ concurrent users for performance testing. If your assets are already hosted in Azure, possibly in the same data centre as the cloud test rig, your Azure app will not incur a usage cost because of the generated traffic, since the traffic is coming from the same data centre. The licensing or pricing information on Microsoft’s cloud based Load Test service is yet to be announced, but I would expect this to be priced attractively to match the market competition. The only additional configuration required for running load tests on the Microsoft Cloud based Load Test service is to select the test run location as “Run tests using Visual Studio Team Foundation Service”. How have I been using Microsoft’s Cloud based Load Test Service? I have been part of the Microsoft Cloud Based Load Test Service advisory council for the last 7 months. This gave me the opportunity to see the product shape up from concept to working solution. I was also the first person outside of Microsoft to try this offering out, which gave me the opportunity to test real world applications at various clients using the Microsoft Load Test Service and provide real world feedback to the Microsoft product team. One of the most recent systems I tested using the Load Test Service has been an insurance quote generation engine. 
This insurance quote generation engine is hosted in Windows Azure, expected to get quote requests from across the globe, and expected to handle 5 million quote requests in a day (it is not clear how this load will be distributed across the day). There was no way I could simulate this kind of load from on premise without standing up additional hardware, but Microsoft’s Cloud based Load Test service allowed me to test my key performance testing scenarios, i.e. simulating expected load, endurance testing, threshold testing and testing for latency. Simulating expected load: approach to devising a load pattern. My approach to devising a load test pattern has been to run the test scenario with 1 user to figure out the response time, then work out how many users are required to reach the target load. So, for example, invoking 1 quote from the quote engine software takes 0.5 seconds. Now if you do the math: 1 quote request by 1 user = 0.5 seconds; quotes generated by 1 user in 24 hours = (2 * 60 * 60) * 24 = 172,800; quotes generated by 30 users in 24 hours = 172,800 * 30 = 5,184,000. This was a very simple example; if your application requires more concurrent users to test scenarios such as caching, etc., you can devise your own load pattern; some examples of load test patterns can be found here. Endurance Testing: to test for endurance, I loaded the quote generation engine with an expected fixed user load and ran the test for a very long duration, such as over 48 hours, and observed the effect of the long running test on the Azure infrastructure. Currently the Microsoft Load Test service does not support metrics from the machine under test. I used Azure diagnostics to begin with, but later started using Cerebrata Azure Diagnostics Manager to capture the metrics of the machine under test. Threshold Testing: to figure out how much user load the application could cope with before falling on its belly, I opted to step load the quote generation engine by incrementing user load, with different variations of incremental user load per minute, until the application crashed out and forced an IIS reset. Testing for Latency: currently the Microsoft Load Test service does not support generating geographically distributed load; I did, however, deploy the insurance quote generation engine in different Azure data centres and ran the same set of performance tests to measure latency. Because I could compare load test results from different runs by exporting the results to Excel (this feature is provided out of the box right from Visual Studio 2010) I could see the difference in response times. More resources on Microsoft Cloud based Load Test Service: a few important links to get you started: Download Visual Studio Ultimate 2013 Preview; Getting started guide for load testing using Team Foundation Service; Troubleshooting guide for FAQs and known issues; Team Foundation Service forum for questions and support; Detailed demo and presentation (link to Tech-Ed session recording); Detailed demo and presentation (link to Build session recording). There are a few limits on the usage of the Microsoft Cloud based Load Test service that you can read about here. If you have any feedback on the Microsoft Cloud based Load Test service, feel free to share it with the product team via the Visual Studio User Voice forum. I hope you found this useful. Thank you for taking the time out to read this blog post. If you enjoyed the post, remember to subscribe to http://feeds.feedburner.com/TarunArora. Stay tuned!
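For readers who prefer code to recorded .webtest files, the same kind of scenario can also be expressed as a coded web performance test and then attached to a load test whose run location is set to “Run tests using Visual Studio Team Foundation Service”. A minimal sketch (the endpoint URL and class name are illustrative, not taken from the engagement described above):

    using System.Collections.Generic;
    using Microsoft.VisualStudio.TestTools.WebTesting;

    // A coded web performance test that requests a single quote; add it to a
    // load test with the desired load pattern to drive it from the cloud rig.
    public class QuoteRequestWebTest : WebTest
    {
        public override IEnumerator<WebTestRequest> GetRequestEnumerator()
        {
            // Illustrative endpoint; replace with the service under test.
            WebTestRequest request = new WebTestRequest("http://quoteengine.example.com/api/quote");
            request.Method = "GET";
            yield return request;
        }
    }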

    Read the article

  • The underlying provider failed on Open

    - by senzacionale
    I was using an .mdf file to connect to the DB with entityClient. Now I want to change the connection string so that there is no .mdf file. Is this connection string correct? <connectionStrings> <!--<add name="conString" connectionString="metadata=res://*/conString.csdl|res://*/conString.ssdl|res://*/conString.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.\SQL2008;AttachDbFilename=|DataDirectory|\NData.mdf;Integrated Security=True;Connect Timeout=30;User Instance=True;MultipleActiveResultSets=True&quot;" providerName="System.Data.EntityClient" />--> <add name="conString" connectionString="metadata=res://*/conString.csdl|res://*/conString.ssdl|res://*/conString.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.\SQL2008;Initial Catalog=NData;Integrated Security=True;Connect Timeout=30;User Instance=True;MultipleActiveResultSets=True&quot;" providerName="System.Data.EntityClient" /> </connectionStrings> I ask because I always get the error "The underlying provider failed on Open".
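    A hedged diagnostic sketch (not from the original post): the generic "underlying provider failed on Open" message usually hides a more specific SqlException, which you can surface by opening the named EntityConnection directly and inspecting the inner exception. One thing worth checking in the connection string above is the leftover User Instance=True, since user instances are normally only used together with AttachDbFilename.

        using System;
        using System.Data.EntityClient;

        class ConnectionCheck
        {
            static void Main()
            {
                try
                {
                    // "name=conString" refers to the named connection string in the config above.
                    using (var conn = new EntityConnection("name=conString"))
                    {
                        conn.Open();
                        Console.WriteLine("Opened successfully.");
                    }
                }
                catch (Exception ex)
                {
                    // The inner exception usually names the actual problem.
                    Console.WriteLine(ex.InnerException != null ? ex.InnerException.Message : ex.Message);
                }
            }
        }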

    Read the article

< Previous Page | 309 310 311 312 313 314 315 316 317 318 319 320  | Next Page >