Search Results

Search found 25180 results on 1008 pages for 'post processing'.


  • Scriptom (groovy) leaves Excel process running - am I doing something wrong?

    - by Alex Stoddard
    I am using the Scriptom extension to Groovy 1.7.0 to automate some processing with Excel 2007 under Windows XP. This always seems to leave an Excel process running despite my calling Quit on the Excel ActiveX object. (There is a passing reference to this phenomenon in the Scriptom example documentation too.) The code looks like: import org.codehaus.groovy.scriptom.ActiveXObject; def xls = new ActiveXObject("Excel.Application") xls.Visible = true // do xls stuff xls.Quit() The visible Excel window does disappear, but an EXCEL process is left in Task Manager (and more processes pile up with each run of the script). There are no error messages or exceptions. Can anyone explain why the Excel process is left behind, and is there any way to prevent it from happening?
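
    Not Groovy, but a minimal sketch of the usual explanation, written in Python with pywin32 (assumes Windows, Excel and the pywin32 package): EXCEL.EXE stays alive for as long as any COM reference to it (workbooks, ranges, or the application object itself) is still reachable, so Quit only takes effect once every proxy has been released.

      # Sketch only: explicit cleanup of Excel COM references (Python/pywin32).
      import gc
      import win32com.client

      xl = win32com.client.Dispatch("Excel.Application")
      xl.Visible = True
      wb = xl.Workbooks.Add()      # do some work with the workbook here
      wb.Close(False)              # close without saving
      xl.Quit()

      del wb, xl                   # drop every COM reference we still hold
      gc.collect()                 # finalize the proxies so EXCEL.EXE can exit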

    Read the article

  • asp.net Configuration Error on host

    - by zey
    I uploaded my ASP.NET site to the hosting site, and the site browses correctly. But when I go to the login URL, it shows me this error: Configuration Error Description: An error occurred during the processing of a configuration file required to service this request. Please review the specific error details below and modify your configuration file appropriately. Parser Error Message: It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level. This error can be caused by a virtual directory not being configured as an application in IIS. Source Error: Line 23: </assemblies> Line 24: </compilation> Line 25: <authentication mode="Forms"> Line 26: <forms loginUrl="~/Account/Login.aspx" timeout="2880" /> Line 27: </authentication> How can I fix it?

    Read the article

  • Errors detected when loading a vim plugin from .vimrc

    - by Tejinder
    I have installed Vim 7.3 on a Debian system, along with a vimrc I downloaded from the internet. It used to work fine on my other Debian machine, but here I get these error messages when I start the editor. Here are the errors: Error detected while processing /home/tejinder/.vim/plugin/gundo.vim: line 196: E319: Sorry, the command is not available in this version: python << ENDPYTHON line 197: E492: Not an editor command: def asciiedges(seen, rev, parents): line 199: E121: Undefined variable: rev E15: Invalid expression: rev not in seen: line 221: E133: :return not inside a function line 231: E133: :return not inside a function line 233: E133: :return not inside a function line 235: E133: :return not inside a function line 238: E690: Missing "in" after :for line 347: E690: Missing "in" after :for line 356: E690: Missing "in" after :for line 453: E690: Missing "in" after :for line 464: E690: Missing "in" after :for line 469: E133: :return not inside a function line 795: E170: Missing :endfor Press ENTER or type command to continue If anyone could figure out what's going on, please guide me. Thanks a lot. Here is the vimrc source: https://github.com/mitsuhiko/dotfiles/tree/master/vim

    Read the article

  • PowerShell: Read text, regex sort, write output to file and formatting

    - by Bill Hunter
    I am a PowerShell novice and have run into a challenge in reading, sorting, and outputting a CSV file. The input CSV has no headers; the data is as follows: 05/25/2010,18:48:33,Stop,a1usak,10.128.212.212 05/25/2010,18:48:36,Start,q2uhal,10.136.198.231 05/25/2010,18:48:09,Stop,s0upxb,10.136.198.231 I use the following piping construct to read the file, sort and output to a file: (Get-Content d:\vpnData\u62gvpn2.csv) | %{,[regex]::Split($_, ",")} | sort @{Expression={$_[3]}},@{Expression={$_[1]}} | out-file d:\vpnData\u62gvpn3.csv The new file is written in the following format: 05/25/2010 07:41:57 Stop a0uaar 10.128.196.160 05/25/2010 12:24:24 Start a0uaar 10.136.199.51 05/25/2010 20:00:56 Stop a0uaar 10.136.199.51 What I would like to see in the output file is a format similar to the original input file, with comma delimiters: 05/25/2010,07:41:57,Stop,a0uaar,10.128.196.160 05/25/2010,12:24:24,Start,a0uaar,10.136.199.51 05/25/2010,20:00:56,Stop,a0uaar,10.136.199.51 But I can't quite seem to get there. I'm almost of the mind that I'll have to write another segment to read the newly produced file and reset its contents to the preferred format for further processing. Thoughts?
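
    Not an answer in PowerShell, but the underlying read / sort-by-field-3-then-field-1 / rejoin-with-commas pipeline may be easier to see in a short Python sketch (file names taken from the question, purely for illustration):

      import csv

      # read the header-less CSV into a list of field lists
      with open(r"d:\vpnData\u62gvpn2.csv", newline="") as f:
          rows = list(csv.reader(f))

      # sort by user id (index 3), then by time (index 1)
      rows.sort(key=lambda r: (r[3], r[1]))

      # write the rows back out, which restores the comma delimiters
      with open(r"d:\vpnData\u62gvpn3.csv", "w", newline="") as f:
          csv.writer(f).writerows(rows)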

    Read the article

  • A good approach to DB planning for a reporting service

    - by Itay Moav
    The scenario: a big system (~200 tables), 60,000 users, and complex reports that will require me to do multiple queries for each report, and even those will be complex queries with inner queries all over the place, plus some processing in PHP. I have seen an approach which I am not sure about: having one centralized, de-normalized table that registers any activity in the system which is reportable. This table will hold mostly foreign keys, so it should be fairly compact and fast. So, for example (my system is a virtual learning management system), a user enrolls in a course, and the table stores the user id, date, course id, organization id, and activity type (enrollment). Of course I also store this data in a normalized DB, which the actual application uses. Pros I see: easy, maintainable queries and code to process the data, and fast retrieval. Cons: there is a danger of the de-normalized table getting out of sync with the real DB. Is this approach worth considering, or (preferably from experience) is it total $#%#%t?

    Read the article

  • "Streaming" MJPG using python.

    - by tyler
    I have a webcam that I want to do some image processing on using Python. It's coming through as Motion-JPEG. I want to try to process the stuff "live," but really what I want to do is this: (1) open the URL and start streaming data into a buffer; (2) read x bytes (where x is the image size) into an image; (3) process that image; (4) display it in the result panel; (5) return to step 2. The problem is that, while I do have the resolution, I have no idea how many bytes to read. I've tried googling the M-JPEG specification but can't find anything on whether the images are separated by some header or not. Anybody have any ideas?
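
    One common approach (sketched below in Python with a placeholder URL) is not to count bytes at all, but to scan the stream for the JPEG start-of-image (FF D8) and end-of-image (FF D9) markers and slice out one complete frame at a time:

      import urllib.request

      stream = urllib.request.urlopen("http://camera.example.com/video.mjpg")  # placeholder URL
      buffer = b""
      while True:
          buffer += stream.read(4096)                 # pull in the next chunk
          start = buffer.find(b"\xff\xd8")            # JPEG start-of-image marker
          end = buffer.find(b"\xff\xd9", start + 2)   # matching end-of-image marker
          if start != -1 and end != -1:
              frame = buffer[start:end + 2]           # one complete JPEG image
              buffer = buffer[end + 2:]               # keep the tail for the next frame
              print(f"got frame of {len(frame)} bytes")  # replace with real processing/display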

    Read the article

  • action method is not called in JSF

    - by gurupriyan.e
    This is my phase listener: public class AdminPhaseListener implements PhaseListener { private static final long serialVersionUID = -1161541721597667238L; public void afterPhase(PhaseEvent e) { System.out.println("after Phase " + e.getPhaseId()); } public void beforePhase(PhaseEvent e) { System.out.println("before Phase " + e.getPhaseId()); if(e.getPhaseId()== PhaseId.RESTORE_VIEW) { } } public PhaseId getPhaseId() { return PhaseId.ANY_PHASE; }} When I click a command button on my page, I call an action method to do some processing, but the action method is not called at all; in the server log, however, I can see the messages printed by my PhaseListener for all the phases. If my view had not changed, it would have stopped after the RESTORE_VIEW phase, right? Any thoughts?

    Read the article

  • Using delayed_job to process file uploads across multiple servers

    - by Steve Klabnik
    Does anyone have any good resources on how to do this? Basically, I'm working on a project (in Rails) where people can upload files. They might be big. I'd like to process them using delayed_job before sending them to S3. I'd also like to do this processing on a separate job queue server, rather than on the webserver itself. I'd rather not have to upload the files to the webserver, then transfer them to the job queue server, and then upload them to S3 if I don't have to. Thanks.

    Read the article

  • Inlines in Django Admin

    - by Oli
    I have two models, Order and UserProfile. Each Order has a ForeignKey to UserProfile, to associate it with that user. On the django admin page for each Order, I'd like to display the UserProfile associated with it, for easy processing of information. I have tried inlines: class UserInline(admin.TabularInline): model = UserProfile class ValuationRequestAdmin(admin.ModelAdmin): list_display = ('address1', 'address2', 'town', 'date_added') list_filter = ('town', 'date_added') ordering = ('-date_updated',) inlines = [ UserInline, ] But it complains that UserProfile "has no ForeignKey to" Order - which it doesn't, it's the other way around. Is there a way to do what I want?
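
    Caveat: the field and model names below are assumptions (the FK on Order is taken to be called "user_profile"), but since the ForeignKey lives on Order rather than on UserProfile, one workaround is to skip the inline entirely and surface the related profile as a read-only column/field on the Order admin:

      from django.contrib import admin

      class OrderAdmin(admin.ModelAdmin):
          list_display = ('address1', 'address2', 'town', 'date_added', 'profile')
          readonly_fields = ('user_profile',)          # assumed FK field name on Order

          def profile(self, obj):
              return obj.user_profile                  # follow the FK from Order to UserProfile
          profile.short_description = 'User profile'

      # admin.site.register(Order, OrderAdmin)         # Order import omitted; names assumed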

    Read the article

  • Obtaining the server IP address in WCF?

    - by chris
    How can I obtain the server IP address that was used to connect to a service? The server has multiple IP addresses and I need to know which one the client connected to. So far I have only found that OperationContext.Current.EndpointDispatcher.EndpointAddress and OperationContext.Current.Channel.LocalAddress contain the address from .config (e.g. localhost), and that OperationContext.Current.IncomingMessageProperties.Via contains the URL that the client used to connect to the server (but this might just be a name from the client's hosts file). EDIT - Sorry, I wasn't being clear enough: the server needs to know which of its IP addresses was used by the client. E.g. the server has the addresses 10.0.0.1 and 10.0.0.2; when processing the request, the server service needs to know whether 10.0.0.1 or 10.0.0.2 was used by the client to connect to it.

    Read the article

  • Reading from a file not line-by-line

    - by MadH
    Assigning a QTextStream to a QFile and reading it line by line is easy and works fine, but I wonder if the performance can be increased by first storing the file in memory and then processing it line by line. Using FileMon from Sysinternals, I've found that the file is read in chunks of 16KB, and since the files I have to process are not that big (~2MB, but many!), loading them into memory would be a nice thing to try. Any ideas how I can do so? QFile inherits from QIODevice, which allows me to readAll() it into a QByteArray, but how do I proceed then and divide it into lines?

    Read the article

  • Fetch only the first row from Oracle - is it faster?

    - by john
    My main goal with this question is optimization and faster run time. After doing a lot of processing in the stored proc, I finally return a count like below: OPEN cv_1 FOR SELECT COUNT(*) num_of_members FROM HOUSEHOLD_MEMBER a, HOUSEHOLD b WHERE RTRIM(LTRIM(a.mbr_last_name)) LIKE v_MBR_LAST_NAME || '%' AND a.number = '01' AND a.code = v_CODE AND a.ssn_head = v_SSN_HEAD AND TO_CHAR( a.mbr_dob, 'MM/DD/YYYY') = v_DOB; But the code that calls the SP does not need the actual count; it just cares whether the count is greater than 1. Question: how can I change this to return just 1 or 0 (1 when the count is 0, and 0 when the count is 1)? Will it be faster to do this rather than returning the whole count?

    Read the article

  • CImg compile problems in Codegear 2009

    - by Seth
    I wish to use the CImg library for image processing in my current project. I am using Codegear C++ Builder 2009. I include CImg.h in the source file and put in the following code: int rows =5; int cols = 5; CImg<double> img(rows,cols); I get the following error: [BCC32 Error] CImg.h(39159): E2285 Could not find a match for 'CImg<unsigned char>::move_to<t>(const CImg<unsigned char>)' Does anyone know if there is a #define I should be using when building in Codegear C++ Builder 2009. Or is it simply not compatible?

    Read the article

  • Algorithm to match natural text in mail

    - by snøreven
    I need to separate natural, coherent text/sentences in emails from lists, signatures, greetings and so on before further processing. Example: Hi tom, last monday we did bla bla, lore Lorem ipsum dolor sit amet, consectetur adipisici elit, sed eiusmod tempor incidunt ut labore et dolore magna aliqua. list item 2 list item 3 list item 3 Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquid x ea commodi consequat. Quis aute iure reprehenderit in voluptate velit regards, K. ---line-of-funny-characters-####### example inc. 33 evil street, london mobile: 00 234534/234345 Ideally the algorithm would match only the running prose sentences, not the greeting, the list items or the signature block. Is there any recommended approach, or are there even existing algorithms for this problem? Should I try approximate regular expressions or more statistical stuff based on the number of punctuation marks, length and so on?
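
    A simple statistical starting point (sketched in Python, with thresholds picked arbitrarily for illustration) is to score each line by its length, the share of alphabetic words, and the density of digits/symbols, keeping only the lines that look like sentences:

      import re

      def looks_like_prose(line: str) -> bool:
          """Crude heuristic: long enough, mostly alphabetic words, few digits/symbols."""
          stripped = line.strip()
          if len(stripped) < 40:          # short lines: greetings, list items, phone numbers
              return False
          words = stripped.split()
          alpha_ratio = sum(w.strip(",.;:").isalpha() for w in words) / len(words)
          symbol_ratio = len(re.findall(r"[\d@#/\\|_*-]", stripped)) / len(stripped)
          return alpha_ratio > 0.7 and symbol_ratio < 0.1

      mail = """Hi tom,
      last monday we did bla bla, lore Lorem ipsum dolor sit amet, consectetur adipisici elit,
      sed eiusmod tempor incidunt ut labore et dolore magna aliqua.
      list item 2
      mobile: 00 234534/234345"""

      for line in mail.splitlines():
          if looks_like_prose(line):
              print(line.strip())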

    Read the article

  • using .htaccess to redirect from friendly url to actual file

    - by Kohalza
    I have the following RewriteRule in my .htaccess to redirect from a friendly URL to my main application file: RewriteRule ^\/(.*).html$ home/www/page.php?p=$1 [L] This should send any URL that points to an html page to page.php, with the URL as a parameter that will be parsed by the app. This works for URLs that look like http://www.example.com/hello.html The problem is that I get a 404 error when the URL contains a directory path, for example: http://www.example.com/category/hello.html The error reads: "File does not exist: /home/www/category" It seems the server is first looking for the 'category' path instead of processing the .htaccess rule. Any ideas how to solve this?

    Read the article

  • SSI not producing output, not giving error either.

    - by bstullkid
    In the HTML file: <!--#exec cgi="/cgi-bin/test.pl"--> The Perl script: #!/usr/bin/perl print "Content-Type: text/html\n\n"; print "<input type=\"hidden\" name=\"aname\" value=\"avalue\">\n"; print "<img src=\"/cgi-bin/script.pl\" />"; This does not give me an 'error processing directive' error, nor does it output my HTML in place of the tag. I'll also add that the SSI tag gets replaced with nothing.

    Read the article

  • Excluding directories in Exuberant CTags

    - by DeepYellow
    I'm working with a very large code base and I find it useful to be selective about which directories are included for use with Exuberant Ctags. The --exclude option works well to eliminate individual file and directory names (with globing wildcards), but I can't figure out how to get it to exclude path patterns containing more than one directory. For example, I may want to exclude a directory tests, but only when processing thirdparty\tests (under Windows). The problem is if I just use --exclude=tests I exclude too many directories, including a test directory in the code I'm actively working on. Here are some things I've tried: --exclude=thirdparty\tests --exclude=thirdparty\\tests --exclude=*\thirdparty\tests --exclude=*\\thirdparty\\tests --exclude=thirdparty/tests Ctags silently ignores all these as evidenced by an examination of the tags file. How can I exclude a directory only when it is preceded by a given parent directory?

    Read the article

  • SmartGWT Server File Browser

    - by jluzwick
    Hi all: I have been looking for a SmartGWT example that would show me how to build a file browser widget that takes the files from the local server's root directory. The user would be shown the files through the browser, and could then select files to perform some processing operations on. So far I have thought of using SmartGWT's Tree - Data Binding - Load from Local Data widget and then grabbing a list of the directories using: new File("\\").listFiles(); My question is: is there a better way to do this? Has someone already thought of this and has an example of their code that I can see? PS: I'm fairly new to GWT and web services but fairly competent with Java. If you believe there is a better way to do this (while still doing it through the web and not using applets), please tell me. Thanks

    Read the article

  • Should XML be used server-side, and JSON client-side?

    - by Michel Carroll
    As a personal project, I'm making an AJAX chatroom application using XML as server-side storage and JSON for client-side processing. Here's how it works: an AJAX request gets sent to PHP using GET (chat messages/logins/logouts); PHP fetches/modifies the XML file on the server; PHP encodes the XML into JSON and sends back a JSON response; JavaScript handles the JSON information (chat messages/logins/logouts). I want to eventually make this a larger-scale chatroom application, so I want to make sure it's fast and efficient. Was this a bad design choice? In this case, is switching between XML and JSON OK, or is there a better way?
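
    For what it's worth, the XML-to-JSON re-encoding step looks roughly like this (sketched in Python rather than PHP, with a made-up chat document; the converter is deliberately naive and ignores attributes and repeated tags):

      import json
      import xml.etree.ElementTree as ET

      def element_to_dict(elem):
          # naive XML -> dict conversion: leaf elements become their text,
          # repeated child tags overwrite each other (fine for a sketch only)
          children = list(elem)
          if not children:
              return elem.text
          return {child.tag: element_to_dict(child) for child in children}

      xml_doc = "<chat><message><user>tom</user><text>hello</text></message></chat>"
      root = ET.fromstring(xml_doc)
      print(json.dumps({root.tag: element_to_dict(root)}))
      # {"chat": {"message": {"user": "tom", "text": "hello"}}}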

    Read the article

  • Ridiculously Simple Erlang Question

    - by Aiden Bell
    I have read (skimmed enough to get coding) through Erlang Programming and Programming Erlang. One question, which is as simple as it sounds: if you have a process Pid1 on machine m1 and a billion million messages are sent to Pid1, are the messages handled in parallel by that process (I get the impression no), and is there any guarantee of order when processing the messages, i.e. are they received in the order they were sent? Coming from the whole C/thread pools/shared state background ... I want to get this concrete. I understand distributing an application, but want to ensure the 'raw bones' are what I expect before building processes and distributing workload. Also, am I right in thinking the whole world is currently flicking through Erlang texts ;)

    Read the article

  • Flex RemoteObject Intermittently Failing to Invoke CFC

    - by Justin
    I have a Flex app that uses Flash Remoting and RemoteObject to pull data from a ColdFusion CFC. About 75% of the time it works, but the other times I get a message in Charles (a debugging tool) that says faultString = "Unable to Invoke CFC", faultCode = "Server.Processing". Here's my RemoteObject: The setup on our web farm uses load balancing. I'm not sure if this is causing the problem or not. Probably not, but it's a thought. Any help is appreciated!!

    Read the article

  • Good XMPP Java Libraries for server side?

    - by Taylor Gautier
    I was hoping to implement a simple XMPP server in Java. What I need is a library that can parse and understand XMPP requests from a client. I have looked at Smack (mentioned below) and JSO. Smack appears to be client-only, so while it might help with parsing packets, it doesn't know how to respond to clients. Is JSO maintained? It looks very old. The only promising avenue is to pull apart Openfire, which is an entire commercial (OSS) XMPP server. I was just hoping for a few lines of code on top of Netty or Mina, so I could get started processing some messages off the wire.

    Read the article

  • MySQL and the LIMIT clause

    - by Lizard
    I was wondering if adding LIMIT 1 to a query would speed up the processing. For example... I have a query that will most of the time return 1 result, but will occasionally return tens, hundreds or even thousands of records. But I will only ever want the first record. Would the LIMIT 1 speed things up or make no difference? I know I could use GROUP BY to return 1 result, but that would just add more computation. Any thoughts gladly accepted! Thanks

    Read the article

  • Using a Resource as an Attribute to a HTML Element in ASP.net

    - by Michael Stum
    I would like to have this piece of code in my .aspx file: <input class="ms-ButtonHeightWidth" type="button" name="BtnOK" id="Button2" value="Close" onclick="javascript:HandleOKButtonClick()" accesskey="<%$Resources:wss,okbutton_accesskey%>" /> Unfortunately, ASP.net doesn't seem to like that: An error occurred during the processing of /_layouts/MyPage/Info.aspx. Literal expressions like '<%$Resources:wss,okbutton_accesskey%>' are not allowed. Use <asp:Literal runat="server" Text="<%$Resources:wss,okbutton_accesskey%>" /> instead That doesn't work in this situation as that would mean nesting the Literal between the quotes of the accesskey attribute, which causes a "The tag contains duplicate 'ID' attributes" error. Is there a way to use a string from a resource without having to change the input to an asp:Button? I guess there has to be a way using <%=, but I don't know how I would address the resource itself?

    Read the article

  • How to print element values when iterating through XML document in PHP

    - by pharma_joe
    I am iterating through the results of a service call to Yahoo News thus: //Send service request if (!$yahooResults = file_get_contents($yahooRequest)) { echo 'Error processing service request'; } //Read result into xml document $yahooResultXml = new DOMDocument('1.0', 'UTF-8'); $yahooResultXml->loadXML($yahooResults); //Build page include_once('components/pageHeader.php'); echo 'Search Results'; //echo $yahooResultXml->saveHTML(); //Iterate over each Result node $stories = $yahooResultXml->getElementsByTagName('Result'); foreach ($stories as $story) { //Title //Summary //Url //Source //Language //Publish Date //Modification Date } include_once('components/pageFooter.php'); Each Title is in a Title node within a Result node. I cannot figure out how to simply echo the content of the Title node!
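
    Not PHP, but the lookup itself is the same idea in any DOM-style API: for each Result element, fetch its Title child and read that node's text content. A tiny Python/ElementTree sketch with made-up data, purely for illustration:

      import xml.etree.ElementTree as ET

      yahoo_xml = """<ResultSet>
        <Result><Title>First headline</Title><Url>http://example.com/1</Url></Result>
        <Result><Title>Second headline</Title><Url>http://example.com/2</Url></Result>
      </ResultSet>"""

      root = ET.fromstring(yahoo_xml)
      for story in root.iter("Result"):
          print(story.findtext("Title"))   # text content of the Title child node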

    Read the article
