Search Results

Search found 8429 results on 338 pages for 'batch processing'.

Page 50/338 | < Previous Page | 46 47 48 49 50 51 52 53 54 55 56 57  | Next Page >

  • Does LINQ require significantly more processing cycles and memory than lower-level data iteration techniques?

    - by Matthew Patrick Cashatt
    Background: I am currently enduring grueling tech interviews for positions that use the .NET stack, some of which include silly questions like this one, and some that are more valid. I recently came across an issue that may be valid, but I want to check with the community here to be sure. When asked by an interviewer how I would count the frequency of words in a text document and rank the results, I answered that I would: 1) use a stream object to put the text file in memory as a string; 2) split the string into an array on spaces while ignoring punctuation; 3) use LINQ against the array to .GroupBy() and .Count(), then OrderBy() said count. I got this answer wrong for two reasons: 1) streaming an entire text file into memory could be disastrous (what if it was an entire encyclopedia?); instead I should stream one block at a time and begin building a hash table; 2) LINQ is too expensive and requires too many processing cycles; I should have built a hash table instead and, on each iteration, only added a word to the hash table if it didn't already exist, then incremented its count. The first reason seems, well, reasonable. But the second gives me more pause. I thought that one of the selling points of LINQ is that it simply abstracts away lower-level operations like hash tables but that, under the veil, it is still the same implementation. Question: Aside from a few additional processing cycles to call the abstracted methods, does LINQ require significantly more processing cycles to accomplish a given data iteration task than a lower-level approach (such as building a hash table) would?
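    For reference, a minimal Python sketch of the streaming hash-table approach the interviewer described (the file name and tokenizing regex are assumptions, not from the question): read the input a line at a time so the whole document never sits in memory, count words in a dict-backed Counter, then rank.

        from collections import Counter
        import re

        counts = Counter()
        with open("encyclopedia.txt", "r") as f:          # hypothetical input file
            for line in f:                                # stream line by line, never the whole file
                for word in re.findall(r"[a-z']+", line.lower()):
                    counts[word] += 1                     # hash-table lookup + increment

        for word, n in counts.most_common(20):            # rank by frequency
            print(word, n)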

    Read the article

  • WCF GZip Compression Request/Response Processing

    - by IanT8
    How do I get a WCF client to process server responses which have been gzipped or deflated by IIS? On IIS, I've followed the instructions here on how to make IIS 6 gzip all responses (where the request contained "Accept-Encoding: gzip, deflate") emitted by .svc WCF services. On the client, I've followed the instructions here and here on how to inject this header into the web request: "Accept-Encoding: gzip, deflate". Fiddler2 shows the response is binary and not plain old XML. The client crashes with an exception which basically says there's no XML header, which of course is true. In my IClientMessageInspector, the app crashes before AfterReceiveReply is called. Some further notes: (1) I can't change the WCF service or client as they are supplied by a third party. I can, however, attach behaviors and/or message inspectors via configuration if this is the right direction to take. (2) I don't want to compress/uncompress just the SOAP body, but the entire message. Any ideas/solutions?
    * SOLVED * It was not possible to write a WCF extension to achieve these goals. Instead I followed this CodeProject article which advocates a helper class:
        public class CompressibleHttpRequestCreator : IWebRequestCreate
        {
            public CompressibleHttpRequestCreator() { }

            WebRequest IWebRequestCreate.Create(Uri uri)
            {
                HttpWebRequest httpWebRequest = Activator.CreateInstance(typeof(HttpWebRequest),
                    BindingFlags.CreateInstance | BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Instance,
                    null, new object[] { uri, null }, null) as HttpWebRequest;
                if (httpWebRequest == null) { return null; }
                httpWebRequest.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
                return httpWebRequest;
            }
        }
    and also an addition to the application configuration file:
        <configuration>
          <system.net>
            <webRequestModules>
              <remove prefix="http:"/>
              <add prefix="http:" type="Pajocomo.Net.CompressibleHttpRequestCreator, Pajocomo" />
            </webRequestModules>
          </system.net>
        </configuration>
    What seems to be happening is that WCF eventually asks some factory deep down in System.Net to provide an HttpWebRequest instance, and we provide the helper that will be asked to create the required instance. In the WCF client configuration file, a simple basicHttpBinding is all that is required, without the need for any custom extensions. When the application runs, the client HTTP request contains the header "Accept-Encoding: gzip, deflate", the server returns a gzipped web response, and the client transparently decompresses the HTTP response before handing it over to WCF.
    When I tried to apply this technique to Web Services I found that it did NOT work. Although the helper class was executed in the same way as when used by the WCF client, the HTTP request did not contain the "Accept-Encoding: ..." header. To make this work for Web Services, I had to edit the Web Proxy class and add this method:
        protected override System.Net.WebRequest GetWebRequest(Uri uri)
        {
            System.Net.HttpWebRequest rq = (System.Net.HttpWebRequest)base.GetWebRequest(uri);
            rq.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
            return rq;
        }
    Note that it did not matter whether the CompressibleHttpRequestCreator and the webRequestModules block from the application config file were present or not. For Web Services, only overriding GetWebRequest in the Web Service proxy worked.

    Read the article

  • Handling nmake errorlevel/return codes

    - by tlianza
    Hi all, I have an nmake-based project which in turn calls the ASP compiler, which can throw an error that nmake seems to recognize:
        NMAKE : fatal error U1077: 'C:\Windows\Microsoft.NET\Framework\v2.0.50727\aspnet_compiler.exe' : return code '0x1'
    However, when I call nmake from within a batch file, the environment variable %ERRORLEVEL% remains set at zero:
        nmake /NOLOGO
        echo BUILD RETURNING: %ERRORLEVEL%
    If I Ctrl-C the nmake task, I do end up getting a non-zero ERRORLEVEL (it's set to 2), so my assumption is that I'm able to catch errors okay, but nmake isn't bubbling up the non-zero exit code from its task. Or, at least, I'm mis-trapping it. Any help would be appreciated.

    Read the article

  • again about JPA/Hibernate bulk(batch) insert

    - by abovesun
    Here is a simple example I've created after reading several topics about JPA bulk inserts. I have two persistent objects, User and Site. One user can have many sites, so we have a one-to-many relation here. Suppose I want to create a user and create/link several sites to that user account. Here is what the code looks like, given that I want to use bulk insert for the Site objects:
        User user = new User("John Doe");
        user.getSites().add(new Site("google.com", user));
        user.getSites().add(new Site("yahoo.com", user));
        EntityTransaction tx = entityManager.getTransaction();
        tx.begin();
        entityManager.persist(user);
        tx.commit();
    But when I run this code (I'm using Hibernate as the JPA implementation provider) I see the following SQL output:
        Hibernate: insert into User (id, name) values (null, ?)
        Hibernate: call identity()
        Hibernate: insert into Site (id, url, user_id) values (null, ?, ?)
        Hibernate: call identity()
        Hibernate: insert into Site (id, url, user_id) values (null, ?, ?)
        Hibernate: call identity()
    So, does this mean that a "real" bulk insert doesn't work, or am I confused? Here is the source code for this example project; it is a Maven project, so you only have to download it and run mvn install to check the output.

    Read the article

  • jQuery DataTables server side processing and ASP.Net

    - by Chad
    I'm trying to use the server side functionality of the jQuery Datatables plugin with ASP.Net. The ajax request is returning valid JSON, but nothing is showing up in the table. I originally had problems with the data I was sending in the ajax request. I was getting a "Invalid JSON primative" error. I discovered that the data needs to be in a string instead of JSON serialized, as described in this post: http://encosia.com/2008/06/05/3-mistakes-to-avoid-when-using-jquery-with-aspnet-ajax/. I wasn't quite sure how to fix that, so I tried adding this in the ajax request: "data": "{'sEcho': '" + aoData.sEcho + "'}" If the aboves eventually works I'll add the other parameters later. Right now I'm just trying to get something to show up in my table. The returning JSON looks ok and validates, but the sEcho in the post is undefined, and I think thats why no data is being loaded into the table. So, what am I doing wrong? Am I even on the right track or am I being stupid? Does anyone ran into this before or have any suggestions? Here's my jQuery: $(document).ready(function() { $("#grid").dataTable({ "bJQueryUI": true, "sPaginationType": "full_numbers", "bServerSide":true, "sAjaxSource": "GridTest.asmx/ServerSideTest", "fnServerData": function(sSource, aoData, fnCallback) { $.ajax({ "type": "POST", "dataType": 'json', "contentType": "application/json; charset=utf-8", "url": sSource, "data": "{'sEcho': '" + aoData.sEcho + "'}", "success": fnCallback }); } }); }); HTML: <table id="grid"> <thead> <tr> <th>Last Name</th> <th>First Name</th> <th>UserID</th> </tr> </thead> <tbody> <tr> <td colspan="5" class="dataTables_empty">Loading data from server</td> </tr> </tbody> </table> Webmethod: <WebMethod()> _ Public Function ServerSideTest() As Data Dim list As New List(Of String) list.Add("testing") list.Add("chad") list.Add("testing") Dim container As New List(Of List(Of String)) container.Add(list) list = New List(Of String) list.Add("testing2") list.Add("chad") list.Add("testing") container.Add(list) HttpContext.Current.Response.ContentType = "application/json" Return New Data(HttpContext.Current.Request("sEcho"), 2, 2, container) End Function Public Class Data Private _iTotalRecords As Integer Private _iTotalDisplayRecords As Integer Private _sEcho As Integer Private _sColumns As String Private _aaData As List(Of List(Of String)) Public Property sEcho() As Integer Get Return _sEcho End Get Set(ByVal value As Integer) _sEcho = value End Set End Property Public Property iTotalRecords() As Integer Get Return _iTotalRecords End Get Set(ByVal value As Integer) _iTotalRecords = value End Set End Property Public Property iTotalDisplayRecords() As Integer Get Return _iTotalDisplayRecords End Get Set(ByVal value As Integer) _iTotalDisplayRecords = value End Set End Property Public Property aaData() As List(Of List(Of String)) Get Return _aaData End Get Set(ByVal value As List(Of List(Of String))) _aaData = value End Set End Property Public Sub New(ByVal sEcho As Integer, ByVal iTotalRecords As Integer, ByVal iTotalDisplayRecords As Integer, ByVal aaData As List(Of List(Of String))) If sEcho <> 0 Then Me.sEcho = sEcho Me.iTotalRecords = iTotalRecords Me.iTotalDisplayRecords = iTotalDisplayRecords Me.aaData = aaData End Sub Returned JSON: {"__type":"Data","sEcho":0,"iTotalRecords":2,"iTotalDisplayRecords":2,"aaData":[["testing","chad","testing"],["testing2","chad","testing"]]}

    Read the article

  • Batch select with SQLAlchemy

    - by muckabout
    I have a large set of values V, some of which are likely to exist in a table T. I would like to insert into the table those which are not yet inserted. So far I have the code:
        for value in values:
            s = self.conn.execute(mytable.__table__.select(mytable.value == value)).first()
            if not s:
                to_insert.append(value)
    I feel like this is running slower than it should. I have a few related questions: Is there a way to construct a select statement such that you provide a list (in this case, 'values') to which SQLAlchemy responds with records which match that list? Is this code overly expensive in constructing select objects? Is there a way to construct a single select statement, then parameterize it at execution time?
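    A sketch of a single-round-trip version (assuming the column is literally named value; a very large list may need to be chunked into several IN clauses): let the database report which values already exist, then diff in Python.

        stmt = mytable.__table__.select(mytable.__table__.c.value.in_(values))
        existing = set(row.value for row in self.conn.execute(stmt))   # values already in the table
        to_insert = [v for v in values if v not in existing]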

    Read the article

  • iPhone Image Processing--matrix convolution

    - by James
    I am implementing a matrix convolution blur on the iPhone. The following code converts the UIImage supplied as an argument of the blur function into a CGImageRef, and then stores the RGBA values in a standard C char array:
        CGImageRef imageRef = imgRef.CGImage;
        int width = imgRef.size.width;
        int height = imgRef.size.height;
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        unsigned char *pixels = malloc((height) * (width) * 4);
        NSUInteger bytesPerPixel = 4;
        NSUInteger bytesPerRow = bytesPerPixel * (width);
        NSUInteger bitsPerComponent = 8;
        CGContextRef context = CGBitmapContextCreate(pixels, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
        CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
        CGContextRelease(context);
    Then the pixel values stored in the pixels array are convolved and stored in another array:
        unsigned char *results = malloc((height) * (width) * 4);
    Finally, these augmented pixel values are changed back into a CGImageRef, converted to a UIImage, and returned at the end of the function with the following code:
        context = CGBitmapContextCreate(results, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
        CGImageRef finalImage = CGBitmapContextCreateImage(context);
        UIImage *newImage = [UIImage imageWithCGImage:CGBitmapContextCreateImage(context)];
        CGImageRelease(finalImage);
        NSLog(@"edges found");
        free(results);
        free(pixels);
        CGColorSpaceRelease(colorSpace);
        return newImage;
    This works perfectly once. But when the image is put through the filter a second time, very odd pixel values are returned, values that don't correspond to anything in the input. Is there any reason why this should work the first time, but not afterward? Below is the entire function.
-(UIImage*) blur:(UIImage*)imgRef { CGImageRef imageRef = imgRef.CGImage; int width = imgRef.size.width; int height = imgRef.size.height; CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); unsigned char *pixels = malloc((height) * (width) * 4); NSUInteger bytesPerPixel = 4; NSUInteger bytesPerRow = bytesPerPixel * (width); NSUInteger bitsPerComponent = 8; CGContextRef context = CGBitmapContextCreate(pixels, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big); CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef); CGContextRelease(context); height = imgRef.size.height; width = imgRef.size.width; float matrix[] = {0,0,0,0,1,0,0,0,0}; float divisor = 1; float shift = 0; unsigned char *results = malloc((height) * (width) * 4); for(int y = 1; y < height; y++){ for(int x = 1; x < width; x++){ float red = 0; float green = 0; float blue = 0; int multiplier=1; if(y>0 && x>0){ int index = (y-1)*width + x; red = matrix[0]*multiplier*(float)pixels[4*(index-1)] + matrix[1]*multiplier*(float)pixels[4*(index)] + matrix[2]*multiplier*(float)pixels[4*(index+1)]; green = matrix[0]*multiplier*(float)pixels[4*(index-1)+1] + matrix[1]*multiplier*(float)pixels[4*(index)+1] + matrix[2]*multiplier*(float)pixels[4*(index+1)+1]; blue = matrix[0]*multiplier*(float)pixels[4*(index-1)+2] + matrix[1]*multiplier*(float)pixels[4*(index)+2] + matrix[2]*multiplier*(float)pixels[4*(index+1)+2]; index = (y)*width + x; red = red+ matrix[3]*multiplier*(float)pixels[4*(index-1)] + matrix[4]*multiplier*(float)pixels[4*(index)] + matrix[5]*multiplier*(float)pixels[4*(index+1)]; green = green + matrix[3]*multiplier*(float)pixels[4*(index-1)+1] + matrix[4]*multiplier*(float)pixels[4*(index)+1] + matrix[5]*multiplier*(float)pixels[4*(index+1)+1]; blue = blue + matrix[3]*multiplier*(float)pixels[4*(index-1)+2] + matrix[4]*multiplier*(float)pixels[4*(index)+2] + matrix[5]*multiplier*(float)pixels[4*(index+1)+2]; index = (y+1)*width + x; red = red+ matrix[6]*multiplier*(float)pixels[4*(index-1)] + matrix[7]*multiplier*(float)pixels[4*(index)] + matrix[8]*multiplier*(float)pixels[4*(index+1)]; green = green + matrix[6]*multiplier*(float)pixels[4*(index-1)+1] + matrix[7]*multiplier*(float)pixels[4*(index)+1] + matrix[8]*multiplier*(float)pixels[4*(index+1)+1]; blue = blue + matrix[6]*multiplier*(float)pixels[4*(index-1)+2] + matrix[7]*multiplier*(float)pixels[4*(index)+2] + matrix[8]*multiplier*(float)pixels[4*(index+1)+2]; red = red/divisor+shift; green = green/divisor+shift; blue = blue/divisor+shift; if(red<0){ red=0; } if(green<0){ green=0; } if(blue<0){ blue=0; } if(red>255){ red=255; } if(green>255){ green=255; } if(blue>255){ blue=255; } int realPos = 4*(y*imgRef.size.width + x); results[realPos] = red; results[realPos + 1] = green; results[realPos + 2] = blue; results[realPos + 3] = 1; }else { int realPos = 4*((y)*(imgRef.size.width) + (x)); results[realPos] = 0; results[realPos + 1] = 0; results[realPos + 2] = 0; results[realPos + 3] = 1; } } } context = CGBitmapContextCreate(results, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big); CGImageRef finalImage = CGBitmapContextCreateImage(context); UIImage *newImage = [UIImage imageWithCGImage:CGBitmapContextCreateImage(context)]; CGImageRelease(finalImage); free(results); free(pixels); CGColorSpaceRelease(colorSpace); return newImage;} THANKS!!!

    Read the article

  • OperationalError "unable to open database file" processing query results with SQLAlchemy and SQLite3

    - by Peter
    I'm running into this little problem that I hope is just a dumb user error. It looks like some sort of a size limit with a query to a SQLite database. I managed to reproduce the issue with an in-memory DB and a simple script shown below. I can make it work by either reducing the number of records in the DB; or by reducing the size of each record; or by dropping the order_by() call. I am using Python 2.5.5 and SQLAlchemy 0.6.0 in a Cygwin environment. Thanks! #!/usr/bin/python from sqlalchemy.orm import sessionmaker import sqlalchemy import sqlalchemy.orm class Person(object): def __init__(self, name): self.name = name engine = sqlalchemy.create_engine('sqlite:///:memory:') Session = sessionmaker(bind=engine) metadata = sqlalchemy.schema.MetaData(bind=engine) person_table = sqlalchemy.Table('person', metadata, sqlalchemy.Column('id', sqlalchemy.types.Integer, primary_key=True), sqlalchemy.Column('name', sqlalchemy.types.String)) metadata.create_all(engine) sqlalchemy.orm.mapper(Person, person_table) session = Session() session.add_all([Person("012345678901234567890123456789012") for i in range(5000)]) session.commit() persons = session.query(Person).order_by(Person.name).all() print "count =", len(persons) session.close() The all() call to the query result fails with the OperationalError exception: Traceback (most recent call last): File "./stress.py", line 27, in <module> persons = session.query(Person).order_by(Person.name).all() File "/usr/lib/python2.5/site-packages/sqlalchemy/orm/query.py", line 1343, in all return list(self) File "/usr/lib/python2.5/site-packages/sqlalchemy/orm/query.py", line 1451, in __iter__ return self._execute_and_instances(context) File "/usr/lib/python2.5/site-packages/sqlalchemy/orm/query.py", line 1456, in _execute_and_instances mapper=self._mapper_zero_or_none()) File "/usr/lib/python2.5/site-packages/sqlalchemy/orm/session.py", line 737, in execute clause, params or {}) File "/usr/lib/python2.5/site-packages/sqlalchemy/engine/base.py", line 1109, in execute return Connection.executors[c](self, object, multiparams, params) File "/usr/lib/python2.5/site-packages/sqlalchemy/engine/base.py", line 1186, in _execute_clauseelement return self.__execute_context(context) File "/usr/lib/python2.5/site-packages/sqlalchemy/engine/base.py", line 1215, in __execute_context context.parameters[0], context=context) File "/usr/lib/python2.5/site-packages/sqlalchemy/engine/base.py", line 1284, in _cursor_execute self._handle_dbapi_exception(e, statement, parameters, cursor, context) File "/usr/lib/python2.5/site-packages/sqlalchemy/engine/base.py", line 1282, in _cursor_execute self.dialect.do_execute(cursor, statement, parameters, context=context) File "/usr/lib/python2.5/site-packages/sqlalchemy/engine/default.py", line 277, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.OperationalError: (OperationalError) unable to open database file u'SELECT person.id AS person_id, person.name AS person_name \nFROM person ORDER BY person.name' ()
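    One thing that may be worth trying, on the assumption that SQLite is failing to create the temporary file it needs to sort the un-indexed ORDER BY under Cygwin (this diagnosis is a guess, not from the question): keep temporary storage in memory before running the query.

        engine.execute("PRAGMA temp_store = MEMORY")   # assumption: the failure is in SQLite's temp-file handling
        persons = session.query(Person).order_by(Person.name).all()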

    Read the article

  • MDB 2 CSV batch.

    - by darhipo
    Does anybody know what I could use to write a script to convert all MDB files in a directory to CSV files? I'm working on Windows but have been using Cygwin for some work.
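    If mdbtools is available in your environment (an assumption; it is a Unix-side tool that can be installed or built under Cygwin), one approach is a short Python wrapper around its mdb-tables and mdb-export utilities, roughly like this:

        import glob, os, subprocess

        for mdb in glob.glob("*.mdb"):
            tables = subprocess.check_output(["mdb-tables", "-1", mdb]).decode().splitlines()
            for table in tables:
                csv_name = "%s_%s.csv" % (os.path.splitext(mdb)[0], table)
                with open(csv_name, "w") as out:
                    subprocess.check_call(["mdb-export", mdb, table], stdout=out)   # one CSV per table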

    Read the article

  • details on the following Natural Language Processing terms ?

    - by wefwgeweg
    - Named entity extraction (extract people, cities, organizations)
    - Content tagging (extract topic tags by scanning a document)
    - Structured data extraction
    - Topic categorization (taxonomy classification by scanning a document, e.g. Bayesian)
    - Text extraction (HTML page cleaning)
    Are there libraries that I can use to do any of the above NLP functions? I don't really feel like forking out cash to AlchemyAPI.
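    For the named-entity piece specifically, NLTK is one free option. A rough sketch with a recent NLTK (the sample sentence is made up; older NLTK versions expose .node instead of .label(), and the listed downloads are one-time setup):

        import nltk
        # one-time: nltk.download('punkt'); nltk.download('averaged_perceptron_tagger')
        #           nltk.download('maxent_ne_chunker'); nltk.download('words')

        text = "Barack Obama visited Chicago to meet with Boeing executives."
        tree = nltk.ne_chunk(nltk.pos_tag(nltk.word_tokenize(text)))
        for subtree in tree.subtrees():
            if subtree.label() in ("PERSON", "GPE", "ORGANIZATION"):
                print(subtree.label(), " ".join(word for word, tag in subtree.leaves()))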

    Read the article

  • Spring-security not processing pre/post annotations

    - by wuntee
    Trying to get pre/post annotations working with a web application, but for some reason nothing is happening with spring-security. Can anyone see what im missing? web.xml contextConfigLocation /WEB-INF/rvaContext-business.xml /WEB-INF/rvaContext-security.xml <context-param> <param-name>log4jConfigLocation</param-name> <param-value>/WEB-INF/log4j.properties</param-value> </context-param> <!-- Spring security filter --> <filter> <filter-name>springSecurityFilterChain</filter-name> <filter-class>org.springframework.web.filter.DelegatingFilterProxy</filter-class> </filter> <filter-mapping> <filter-name>springSecurityFilterChain</filter-name> <url-pattern>/*</url-pattern> </filter-mapping> <listener> <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class> </listener> <!-- - Publishes events for session creation and destruction through the application - context. Optional unless concurrent session control is being used. --> <listener> <listener-class>org.springframework.security.web.session.HttpSessionEventPublisher</listener-class> </listener> <listener> <listener-class>org.springframework.web.util.Log4jConfigListener</listener-class> </listener> <servlet> <servlet-name>rva</servlet-name> <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class> <load-on-startup>1</load-on-startup> </servlet> <servlet-mapping> <servlet-name>rva</servlet-name> <url-pattern>/rva/*</url-pattern> </servlet-mapping> rvaContext-secuity.xml: <?xml version="1.0" encoding="UTF-8"?> <beans:beans xmlns="http://www.springframework.org/schema/security" xmlns:beans="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd http://www.springframework.org/schema/security http://www.springframework.org/schema/security/spring-security-3.0.xsd"> <global-method-security pre-post-annotations="enabled"/> <http use-expressions="true"> <form-login /> <logout /> <remember-me /> <!-- Uncomment to limit the number of sessions a user can have --> <session-management invalid-session-url="/timeout.jsp"> <concurrency-control max-sessions="1" error-if-maximum-exceeded="true" /> </session-management> <form-login login-page="rva/login" /> </http> ... 
LoginController class: @Controller @RequestMapping("/login") public class LoginController { @RequestMapping(method = RequestMethod.GET) public String login(ModelMap map){ map.addAttribute("title", "Login: AD Credentials"); return("login"); } @RequestMapping("/secure") @PreAuthorize("hasRole('ROLE_USER')") public String secure(ModelMap map){ return("secure"); } } In the logs, there is nothing even related to spring-security: logs: INFO: Initializing Spring FrameworkServlet 'rva' INFO [org.springframework.web.servlet.DispatcherServlet] - FrameworkServlet 'rva': initialization started INFO [org.springframework.web.context.support.XmlWebApplicationContext] - Refreshing WebApplicationContext for namespace 'rva-servlet': startup date [Fri Mar 26 10:28:51 MDT 2010]; parent: Root WebApplicationContext INFO [org.springframework.beans.factory.xml.XmlBeanDefinitionReader] - Loading XML bean definitions from ServletContext resource [/WEB-INF/rva-servlet.xml] INFO [org.springframework.beans.factory.support.DefaultListableBeanFactory] - Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@a2fc31: defining beans [loginController,org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,freemarkerConfig,viewResolver]; parent: org.springframework.beans.factory.support.DefaultListableBeanFactory@cc74e7 INFO [org.springframework.web.servlet.view.freemarker.FreeMarkerConfigurer] - ClassTemplateLoader for Spring macros added to FreeMarker configuration INFO [org.springframework.web.servlet.mvc.annotation.DefaultAnnotationHandlerMapping] - Mapped URL path [/login/secure] onto handler [com.cable.comcast.neto.nse.rva.controller.LoginController@79b32a] INFO [org.springframework.web.servlet.mvc.annotation.DefaultAnnotationHandlerMapping] - Mapped URL path [/login/secure.*] onto handler [com.cable.comcast.neto.nse.rva.controller.LoginController@79b32a] INFO [org.springframework.web.servlet.mvc.annotation.DefaultAnnotationHandlerMapping] - Mapped URL path [/login/secure/] onto handler [com.cable.comcast.neto.nse.rva.controller.LoginController@79b32a] INFO [org.springframework.web.servlet.mvc.annotation.DefaultAnnotationHandlerMapping] - Mapped URL path [/login] onto handler [com.cable.comcast.neto.nse.rva.controller.LoginController@79b32a] INFO [org.springframework.web.servlet.mvc.annotation.DefaultAnnotationHandlerMapping] - Mapped URL path [/login.*] onto handler [com.cable.comcast.neto.nse.rva.controller.LoginController@79b32a] INFO [org.springframework.web.servlet.mvc.annotation.DefaultAnnotationHandlerMapping] - Mapped URL path [/login/] onto handler [com.cable.comcast.neto.nse.rva.controller.LoginController@79b32a] INFO [org.springframework.web.servlet.DispatcherServlet] - FrameworkServlet 'rva': initialization completed in 417 ms Mar 26, 2010 10:28:52 AM org.apache.coyote.http11.Http11Protocol start INFO: Starting Coyote HTTP/1.1 on http-8080 Mar 26, 2010 10:28:52 AM org.apache.jk.common.ChannelSocket init INFO: JK: ajp13 listening on /0.0.0.0:8009 Mar 26, 2010 10:28:52 AM org.apache.jk.server.JkMain start INFO: Jk running ID=0 time=0/31 config=null Mar 26, 2010 10:28:52 AM org.apache.catalina.startup.Catalina start INFO: Server startup in 1873 ms WARN [org.springframework.web.servlet.PageNotFound] - No mapping found for 
HTTP request with URI [/rva-web/] in DispatcherServlet with name 'rva'

    Read the article

  • Processing an Integer Buried in a String in XSL

    - by justkt
    I have a stored time value which is of the format H:mm:ss. The hours may be any value from 0 up through several days. This data is sent in an XML tag and processed by XSL to be displayed. The display that I want is of the format: D days, HH:mm:ss (hours/minutes) Where the last tag shows hours if HH is greater than 0, minutes if it is 0. Given the original HH, which may be more than 24, I know I need the floor of HH / 24 to get the days value. Then the original HH % 24 gives me the leftover hours. I have also handled the minutes and hours question using xsl:when and xsl:if. It's getting days and hours from the hours value that has me stumped. EDIT So far, I'm looking at doing the following: Variable declaration <xsl:variable name="time"><xsl:value-of select="time" /><xsl:variable> <xsl:variable name="days"><xsl:value-of select="floor(substring-before(time, ':') / 24)" /></xsl:variable> <xsl:variable name="hours"><xsl:value-of select="substring-before(time, ':') mod 24" /></xsl:variable> <xsl:variable name="minutes"><xsl:value-of select="substring-after(time, ':')" /></xsl:variable> Use <xsl:if test="$days > 0"> <xsl:value-of select="$days" /> days </xsl:if> <xsl:value-of select="$hours" />:<xsl:value-of select="$minutes" /> <xsl:choose> <xsl:when test="$hours > 0"> hour<xsl:if test="$hours > 1">s</xsl:if> </xsl:when> <xsl:otherwise> minute<xsl:if test="$minute != '01:00'">s</xsl:if> </xsl:otherwise> </xsl:choose> And for clarification, a sample time would be <time>26:15:00</time> for 1 day 2:15 hours.

    Read the article

  • Dashboard for collaborative science / data processing projects

    - by rescdsk
    Hi, Continuous Integration servers like Hudson are a pretty amazing addition to software development. I work in an academic research lab, and I'd love to apply similar principles to scientific data analysis. I want a dashboard-like view of which collections of data are fine, which ones are failing their tests (simple shell scripts, mostly), and so on. A lot like the Chromium dashboard (WARNING: page takes a long time to load). It takes work from at least 4 people, and maybe 10 or 12 hours of computer time, to bring our data (from behavioral studies) from its raw form to its final, easily-analyzed form. I've tried Hudson and buildbot, but neither is really appropriate to our workflow. We just want to run a bunch of tests on maybe fifty independent collections of subject data, and display the results nicely. SO! Does anyone have a recommendation of a way to generate this kind of report easily? Or, can you think of a good way to shoehorn this kind of workflow into a continuous integration server? Or, can you recommend a unit testing dashboard that could deal with tests that are little shell scripts rather than little functions? Thank you!
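    In case it helps as a starting point, here is a bare-bones Python sketch of the "run every check against every collection and render a grid" idea (all paths and the directory layout are hypothetical); a nightly cron run of something like this plus a static HTML page may be enough before reaching for a full CI server:

        import glob, subprocess

        rows = []
        for check in sorted(glob.glob("checks/*.sh")):            # hypothetical location of the test scripts
            for dataset in sorted(glob.glob("data/subject_*")):   # hypothetical per-subject data directories
                ok = subprocess.call(["sh", check, dataset]) == 0
                rows.append((dataset, check, "PASS" if ok else "FAIL"))

        with open("dashboard.html", "w") as out:
            out.write("<table border='1'>\n")
            for dataset, check, status in rows:
                out.write("<tr><td>%s</td><td>%s</td><td>%s</td></tr>\n" % (dataset, check, status))
            out.write("</table>\n")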

    Read the article

  • Background processing in rails

    - by hashpipe
    Hi, this might seem like a FAQ on Stack Overflow, but my requirements are a little different. While I have previously used BackgroundRB and DJ for running background processes in Ruby, my requirement this time is to run some heavy analytics and mathematical computations on a huge set of data, and I need to do this only during the first 15 days of the month. Given that, I am tempted to use cron and run a Ruby script to accomplish this goal. What I would like to know / understand is: 1) Is using cron a good idea? (I'm not a system admin, so while I have a basic idea of cron, I'm not overly confident of doing it perfectly.) 2) Can we somehow modify DJ to run only on the first 15 days of the month (with or without cron), and then just stop and exit once all the jobs in the queue for the day are done? (I don't want it to ping the DB every time for a new job; whatever jobs are in the queue when DJ starts, that will be all.) I'm not sure if I have put the question in the right manner, but any help in this direction will be much appreciated. Thanks
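    On the cron side, the day-of-month field already expresses "first 15 days of the month"; a crontab entry along these lines (the script path and time of day are hypothetical) runs the job at 02:00 on days 1 through 15 only:

        # min hour day-of-month month day-of-week  command
        0 2 1-15 * * /usr/bin/ruby /path/to/monthly_analytics.rb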

    Read the article

  • Image Application in WPF and Performance.

    - by Harsha
    Hello all, I am planning to build an image processing application using WPF. Brightness/contrast and histogram are the main operations of this application. I have downloaded the application "Foundations: Bitmaps and Pixel Bits" from http://msdn.microsoft.com/en-us/magazine/cc534995.aspx. But when I try to open images larger than 1200x1600, it is very slow. How can I increase the performance? Has anyone worked on image processing in WPF? Please suggest how to solve this performance issue in WPF for operations on images larger than 1600x1200. Thank you, Harsha

    Read the article

  • Oracle performance problems with large batch of XSL operations

    - by FrustratedWithFormsDesigner
    I have a system that is performing many XSL transformations on XMLType objects. The problem is that the system gradually slows down over time, and sometimes crashes when it runs out of memory. It seems that the slow down (and possibly memory crash) is around the dbms_xslprocessor.processXSL function call, which gradually takes longer and longer to complete. The code looks like this: v_doc dbms_xmldom.DOMDocument; v_transformer dbms_xmldom.DOMDocument; v_XSLprocessor dbms_xslprocessor.Processor; v_stylesheet dbms_xslprocessor.Stylesheet; v_clob clob; ... transformer := PKG_STUFF.getXSL(); v_transformer := dbms_xmldom.newDOMDocument(transformer); v_XSLprocessor := Dbms_Xslprocessor.newProcessor; v_stylesheet := dbms_xslprocessor.newStylesheet(v_transformer, ''); ... for source_data in (select id in source_tbl) loop begin v_doc := PKG_CONVERT.convert(in_id => source_data.id); --start time of operation v_begin_op_time := dbms_utility.get_time; --reset the CLOB v_clob := ' '; --Apply XSL Transform dbms_xslprocessor.processXSL(p => v_XSLprocessor, ss => v_stylesheet, xmldoc => v_Doc, cl => v_clob); v_doc := dbms_xmldom.newDOMDocument(XMLType(v_clob)); --end time v_end_op_time := dbms_utility.get_time; --calculate duration v_time_taken := (((v_end_op_time - v_begin_op_time))); --log the duration PKG_LOG.log_message('Time taken to transform XML: '||v_time_taken); ... ... DBMS_XMLDOM.freeDocument(v_Doc); DBMS_LOB.freetemporary(lob_loc => v_clob); end loop; The time taken to transform the XML is slowly creeping up (I suppose it might also be the call to dbms_xmldom.newDOMDocument, but I had thought that to be fairly straightforward). I have no idea why.... :( (Oracle 10g)

    Read the article

  • Complex String Processing - well complex to me

    - by percent20
    I am calling a web service and all I get back is a giant blob of text, which I am left to process myself. The problem is that not all lines are necessarily the same. They each have 2 or 3 sections, and the sections are similar. Here are the most common examples:
        text1 [text2] /text3/
        text1/test3
        text1[text2]/text3
        text1 [text2] /text /3 here/
    I am not exactly sure how to approach this problem. I am not too good at doing anything advanced as far as manipulating strings goes. I was thinking a regular expression might work, but I'm not too sure of that either. If I can get each of these 3 sections broken up, it is easy enough from there to do the rest. It's just that there doesn't seem to be any uniformity to the main 3 sections that I know how to work with.
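    A regular expression is a reasonable starting point here. A rough Python sketch (the pattern is a guess at the three-section structure, not a full solution) that captures an optional [bracketed] section and an optional /slashed/ section:

        import re

        pattern = re.compile(
            r"^\s*(?P<one>[^\[/]*?)\s*"        # section 1: text before the first '[' or '/'
            r"(?:\[(?P<two>[^\]]*)\])?\s*"     # optional section 2 in square brackets
            r"(?:/(?P<three>.*?)/?\s*)?$"      # optional section 3 introduced by '/'
        )

        samples = ["text1 [text2] /text3/", "text1/test3",
                   "text1[text2]/text3", "text1 [text2] /text /3 here/"]
        for line in samples:
            m = pattern.match(line)
            print(m.group("one"), "|", m.group("two"), "|", m.group("three"))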

    Read the article

  • Rogue black-box java application not responding to standard input redirect

    - by Stefan Kendall
    I have an external java application (blackbox), which requires authentication. I need to run this application in a batch setting, but it seems to be reading from standard input in some nonstandard way. That is, if I set the calling of the program to redirect STDIN to a file (... <password.txt) or pipe data to it (echo mypasword | ...), it does not recognize the input. As I run it, also, it seems to intercept Cntrl+c and Cntrl+d and Cntrl+z as legitimate password characters, so it must be doing something odd and not just reading from standard in. Any idea what this application could be doing to read in input? I need to be able to send it information programmatically, and I'm stumped for the moment.
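    If the application is reading the password straight from the controlling terminal rather than from standard input (which would explain why redirection and pipes are ignored; this is an assumption, as are the command line and prompt text below), driving it through a pseudo-terminal is one way around it. A Python sketch with pexpect, which needs a Unix-like environment such as Cygwin's Python:

        import pexpect

        child = pexpect.spawn("java -jar blackbox.jar")   # hypothetical command line
        child.expect("Password:")                          # hypothetical prompt text
        child.sendline("s3cret")
        child.expect(pexpect.EOF)
        print(child.before)                                # whatever the program printed after login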

    Read the article

  • $ajaxForm reply back from processing page using jquery and php

    - by Jean
    Hello, I have a page called guestbook.php which contains:
        $('#guest_form').ajaxForm({});
    When the form is submitted it goes to a save.php page, which contains the following, and the values are inserted:
        if($_POST['x']){
            $xx = $_POST['x'];
            $yy = $_POST['y'];
            $zz = $_POST['z'];
            $query_one = "INSERT INTO xxx (x1,yl,z1,z2) values ('$xx','$yy','$zz','00000')";
            mysql_select_db($database_1, $1);
            $Result = mysql_query($query_guest_one, $1) or die(mysql_error());
    So far so good. Now I want to run a select query based on the insert and display the result in a div on the guestbook.php page. That is where I am stuck. All help appreciated. Thanks, Jean

    Read the article

  • Windows Batch Script Question

    - by Scott
    I need a Windows script that I can point at a directory; it should walk all of the sub-directories and, in each one, archive all files with a certain file extension, keep the archive in that same sub-directory, then move on to the next one. What's the best way to go about this? Perl, automation scripting, AutoIt? Any sample code you guys can give me?
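    Since any scripting language seems acceptable here, a rough Python sketch of the walk-and-archive idea (the starting directory, extension, and archive name are hypothetical):

        import os, zipfile

        root = r"C:\target"   # hypothetical starting directory
        ext = ".log"          # hypothetical extension to archive

        for dirpath, dirnames, filenames in os.walk(root):
            to_zip = [f for f in filenames if f.lower().endswith(ext)]
            if not to_zip:
                continue
            archive = os.path.join(dirpath, "archive.zip")
            with zipfile.ZipFile(archive, "a", zipfile.ZIP_DEFLATED) as zf:   # archive stays in the same subdir
                for name in to_zip:
                    zf.write(os.path.join(dirpath, name), arcname=name)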

    Read the article

  • Worker process reached its allowed processing time

    - by Confused
    We are experiencing this issue approximately once a month. It is very hard to pinpoint the cause, so any help would be appreciated. The error causes the application pool to stop, which brings the site down. We have gone through all the log files and found nothing conclusive. We are using version 2.0.3 on IIS 6.

    Read the article

  • Asynchronous message queues and processing like Amazon Simple Queue service in django

    - by becomingGuru
    There are many activities in an application that need things like:
    - Send email
    - Post to Twitter
    - Thumbnail an image into several sizes
    - Call a cron job to find connected relationships
    A good way to do these tasks is to write them into an asynchronous queue on which operations are performed. What Django application can be used to implement such functionality locally, like the one Amazon Simple Queue Service offers? I have come across Celery. Is that the right thing? Is there anything else like this?
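    Celery is a common answer for this in the Django world; a minimal sketch of how a task looks with a recent Celery (the broker URL and task body are assumptions):

        from celery import Celery

        app = Celery("tasks", broker="amqp://guest@localhost//")   # RabbitMQ broker, as an example

        @app.task
        def send_welcome_email(user_id):
            # look up the user and send the mail; runs in a worker process, not the web request
            pass

        # from a Django view (or anywhere): returns immediately, a worker picks the job up
        # send_welcome_email.delay(42)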

    Read the article

  • ADO.NET Batch Insert with over 2000 parameters

    - by Liming
    Hello all, I'm using the Enterprise Library, but the idea is the same. I have a SqlStringCommand, and the SQL is constructed using a StringBuilder in the form of "insert into table (column1, column2, column3) values (@param1-X, @param2-X, @parm3-X)" + " ", where "X" represents a for loop over about 700 rows:
        StringBuilder sb = new StringBuilder();
        for(int i=0; i<700; i++)
        {
            sb.Append("insert into table (column1, column2, column3) values (@param1-"+i+", @param2-"+i+", @parm3-"+i+") ");
        }
    This is followed by constructing a command object and injecting all the parameters with their values into it. Essentially, with 700 rows and 3 parameters each, I ended up with 2100 parameters for this one SQL statement. It ran fine for a few days and then suddenly I got this error:
        ===============================================================
        A severe error occurred on the current command. The results, if any, should be discarded.
        at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
        at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
        at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
        at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
        at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)
        at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async)
        at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result)
        at System.Data.SqlClient.SqlCommand.InternalExecuteNon
    Any pointers are greatly appreciated.

    Read the article

  • Batch rename file extensions, including subdirectories

    - by Alan
    I'm renaming empty file extensions with this command: rename *. *.bla However, I have a folder with hundreds of such subfolders, and this command requires me to manually navigate to each subfolder and run it. Is there a command that I can run from just one upper level folder that will include all the files in the subfolders?

    Read the article

  • AD Stopping a Script and Writing a Value to a User's AD Account PPT Presentation

    - by Steven Maxon
    ' This will launch the PPT in a GPO:
        Dim ppt
        Set ppt = CreateObject("PowerPoint.Application")
        ppt.Visible = True
        ppt.Presentations.Open "C:\Scripts\Test.pptx"
    ' This is the batch file at the end of the PPT that records the date, time, computer name, and username:
        echo "Logon Date:%date%,Logon Time:%time%,Computer Name:%computername%,User Name:%username%" >> \\servertest\g$\Tracking\LOGON.TXT
    This is what I need but can't find: I need the script to check a value in the Active Directory user's account in the "Web page:" attribute and shut off the script if the user has already completed reading the presentation. It could be as simple as writing XXXX. I need the value XXXX written to the "Web page:" attribute of the Active Directory user's account when they finish reading the presentation (after they click on the .bat file), so the script will not run again when they log in.

    Read the article

< Previous Page | 46 47 48 49 50 51 52 53 54 55 56 57  | Next Page >