Search Results

Search found 12039 results on 482 pages for 'job searching'.


  • I'm searching for a messaging platform (like XMPP) that allows tight integration with a web application

    - by loxs
    Hi. At the company I work for, we are building a cluster of web applications for collaboration: accounting, billing, CRM, etc. We use a RESTful approach: CouchDB as the database, and the applications talk to one another and to the database over HTTP. We also have a single sign-on solution, so logging in to one application automatically logs you in to the others. All apps are written in Python (Pylons). Now we need to add instant messaging to the stack, supporting both web and desktop clients. Just being able to chat is not enough; we need to achieve all of the following (and more things like them). When somebody gets assigned to a task, they must receive a message; I guess this is possible with some system daemon. There must be an option to automatically group people by many different properties, for example by geographical location, by company division, and by job type (all the programmers from different cities and different company divisions must form a group), so that one can send mass messages to a group of choice. Rooms should be created and destroyed automatically: when several people view the same invoice, a room must be created for them (and they must auto-join), and when they all leave the invoice the room must be destroyed. Authentication and authorization must come from our applications. I could implement this with custom solutions like Hookbox (http://hookbox.org/docs/intro.html), but then I would have a lot of trouble supporting desktop clients. I have no prior experience with instant messaging; I've been reading about it lately and looking mostly at things like ejabberd, but it has been hard going and I can't tell whether what I want is possible at all. So I'd be happy if people with experience in this field could help me with advice, articles, or accounts of what is possible.
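
    For what it's worth, ejabberd can delegate authentication to an external program over stdin/stdout ("extauth"), which is one way to wire an existing single sign-on into XMPP. Below is a minimal Python 2-era sketch of such a bridge; the HTTP endpoint it calls and its 200-on-success behaviour are assumptions, not something from the post.

        #!/usr/bin/env python
        # Hypothetical ejabberd extauth bridge that delegates credential checks
        # to the existing web stack over HTTP (endpoint is made up).
        import struct
        import sys
        import urllib
        import urllib2

        AUTH_URL = "http://localhost:5000/api/check_credentials"  # hypothetical endpoint

        def check_password(user, server, password):
            data = urllib.urlencode({"user": user, "server": server, "password": password})
            try:
                return urllib2.urlopen(AUTH_URL, data).getcode() == 200
            except urllib2.URLError:
                return False

        def read_request():
            # ejabberd sends a 2-byte big-endian length, then "op:arg1:arg2[:arg3]"
            header = sys.stdin.read(2)
            if len(header) != 2:
                return None
            (length,) = struct.unpack(">h", header)
            return sys.stdin.read(length).split(":")

        def write_result(ok):
            # reply is a 2-byte length (always 2) followed by 0x0001 (ok) or 0x0000 (fail)
            sys.stdout.write(struct.pack(">hh", 2, 1 if ok else 0))
            sys.stdout.flush()

        while True:
            request = read_request()
            if request is None:
                break
            op = request[0]
            if op == "auth":
                write_result(check_password(request[1], request[2], request[3]))
            elif op == "isuser":
                write_result(True)  # assume every SSO user exists; adjust as needed
            else:
                write_result(False)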

    Read the article

  • SharePoint job queued, but never starts.

    - by Eric Uniacke
    I'm attempting to retract a solution from SharePoint. I've started the job via the Central Admin site as well as from stsadm. The job is queued; however, it stays in a pending state for days. The SharePoint services - Admin and Timer - are started. Any suggestions as to why the retraction jobs are queued but never start? I'm really at a loss for what to try next, so anything will be helpful. Thank you!

    Read the article

  • Part-time Programming Job

    - by Bluechip Solutions
    I am a student at Middlesex University, London, studying Information Technology. I really love software development and I have taught myself how to write HTML + CSS, JavaScript (I use jQuery and AngularJS) and Java (I learnt this in school). I have developed a few apps (a desktop app in Java and a mobile app with AngularJS and PhoneGap). I am looking at applying for a part-time programming job to develop myself. Are there part-time jobs available for someone like me, and is my skill set enough to get me a job? I understand this topic may not be ideal here, but this is the only place I know that can provide me answers. Thank you!

    Read the article

  • jqgrid with asp.net webmethod and json working with sorting, paging, searching and LINQ

    - by aimlessWonderer
    THIS WORKS! Most topics covering jqgrid and asp.net seem to relate to just receiving JSON, or working in the MVC framework, or utilizing other handlers or web services... but not many dealt with actually passing parameters back to an actual webmethod in the codebehind. Furthermore, scarce are the examples that contain successful implementation the AJAX paging, sorting, or searching along with LINQ to SQL for asp.net jqGrid. Below is a working example that may help others who need help to pass parameters to jqGrid in order to have correct paging, sorting, filtering.. it uses pieces from here and there... ================================================== First, THE JAVASCRIPT <script type="text/javascript"> $(document).ready(function() { var grid = $("#list"); $("#list").jqGrid({ // setup custom parameter names to pass to server prmNames: { search: "isSearch", nd: null, rows: "numRows", page: "page", sort: "sortField", order: "sortOrder" }, // add by default to avoid webmethod parameter conflicts postData: { searchString: '', searchField: '', searchOper: '' }, // setup ajax call to webmethod datatype: function(postdata) { mtype: "GET", $.ajax({ url: 'PageName.aspx/getGridData', type: "POST", contentType: "application/json; charset=utf-8", data: JSON.stringify(postdata), dataType: "json", success: function(data, st) { if (st == "success") { var grid = jQuery("#list")[0]; grid.addJSONData(JSON.parse(data.d)); } }, error: function() { alert("Error with AJAX callback"); } }); }, // this is what jqGrid is looking for in json callback jsonReader: { root: "rows", page: "page", total: "totalpages", records: "totalrecords", cell: "cell", id: "id", //index of the column with the PK in it userdata: "userdata", repeatitems: true }, colNames: ['Id', 'First Name', 'Last Name'], colModel: [ { name: 'id', index: 'id', width: 55, search: false }, { name: 'fname', index: 'fname', width: 200, searchoptions: { sopt: ['eq', 'ne', 'cn']} }, { name: 'lname', index: 'lname', width: 200, searchoptions: { sopt: ['eq', 'ne', 'cn']} } ], rowNum: 10, rowList: [10, 20, 30], pager: jQuery("#pager"), sortname: "fname", sortorder: "asc", viewrecords: true, caption: "Grid Title Here" }).jqGrid('navGrid', '#pager', { edit: false, add: false, del: false }, {}, // default settings for edit {}, // add {}, // delete { closeOnEscape: true, closeAfterSearch: true}, //search {} ) }); </script> ================================================== Second, THE C# WEBMETHOD [WebMethod] public static string getGridData(int? numRows, int? page, string sortField, string sortOrder, bool isSearch, string searchField, string searchString, string searchOper) { string result = null; MyDataContext db = null; try { //--- retrieve the data db = new MyDataContext("my connection string path"); var query = from u in db.TBL_USERs select u; //--- determine if this is a search filter if (isSearch) { searchOper = getOperator(searchOper); // need to associate correct operator to value sent from jqGrid string whereClause = String.Format("{0} {1} {2}", searchField, searchOper, "@" + searchField); //--- associate value to field parameter Dictionary<string, object> param = new Dictionary<string, object>(); param.Add("@" + searchField, searchString); query = query.Where(whereClause, new object[1] { param }); } //--- setup calculations int pageIndex = page ?? 1; //--- current page int pageSize = numRows ?? 
10; //--- number of rows to show per page int totalRecords = query.Count(); //--- number of total items from query int totalPages = (int)Math.Ceiling((decimal)totalRecords / (decimal)pageSize); //--- number of pages //--- filter dataset for paging and sorting IQueryable<TBL_USER> orderedRecords = query.OrderBy(sortfield); IEnumerable<TBL_USER> sortedRecords = orderedRecords.ToList(); if (sortorder == "desc") sortedRecords= sortedRecords.Reverse(); sortedRecords= sortedRecords .Skip((pageIndex - 1) * pageSize) //--- page the data .Take(pageSize); //--- format json var jsonData = new { totalpages = totalPages, //--- number of pages page = pageIndex, //--- current page totalrecords = totalRecords, //--- total items rows = ( from row in sortedRecords select new { i = row.USER_ID, cell = new string[] { row.USER_ID.ToString(), row.FNAME.ToString(), row.LNAME } } ).ToArray() }; result = Newtonsoft.Json.JsonConvert.SerializeObject(jsonData); } catch (Exception ex) { Debug.WriteLine(ex); } finally { if (db != null) db.Dispose(); } return result; } ================================================== Third, NECESSITIES In order to have dynamic OrderBy clauses in the LINQ, I had to pull in a class to my AppCode folder called 'Dynamic.cs'. You can retrieve the file from downloading here. You will find the file in the "DynamicQuery" folder. That file will give you the ability to utilized dynamic ORDERBY clause since we don't know what column we're filtering by except on the initial load. To serialize the JSON back from the C-sharp to the JS, I incorporated the James Newton-King JSON.net DLL found here : http://json.codeplex.com/releases/view/37810. After downloading, there is a "Newtonsoft.Json.Compact.dll" which you can add in your Bin folder as a reference Here's my USING's block using System; using System.Collections; using System.Collections.Generic; using System.Linq; using System.Web.UI.WebControls; using System.Web.Services; using System.Linq.Dynamic; For the Javascript references, I'm using the following scripts in respective order in case that helps some folks: 1) jquery-1.3.2.min.js ... 2) jquery-ui-1.7.2.custom.min.js ... 3) json.min.js ... 4) i18n/grid.locale-en.js ... 5) jquery.jqGrid.min.js For the CSS, I'm using jqGrid's necessities as well as the jQuery UI Theme: 1) jquery_theme/jquery-ui-1.7.2.custom.css ... 2) ui.jqgrid.css The key to getting the parameters from the JS to the WebMethod without having to parse an unserialized string on the backend or having to setup some JS logic to switch methods for different numbers of parameters was this block postData: { searchString: '', searchField: '', searchOper: '' }, Those parameters will still be set correctly when you actually do a search and then reset to empty when you "reset" or want the grid to not do any filtering Hope this helps some others!!!! Please reply if you find major issues or ways of refactoring or doing better that I haven't considered.

    Read the article

  • MySQL equivalent to .pgpass, or automatic authentication in a cron job for MySQL

    - by Ibrahim
    I'm writing a bash script to back up my databases. Most are postgresql, and in postgres there's a way to avoid having to authenticate by creating a ~/.pgpass file which contains the postgres password. I put this in root's home directory and made it chmod 0600, so that root could dump the postgres databases without having to authenticate. Now I want to do something similar for mysql, although I only have one mysql database. How can I do this? I don't want to specify the password on the command line for mysqldump because this is part of a script that might be somewhat visible to other users. Is there a better way (i.e. built in to mysql) to do this than make a file that only root can read and then read that to get the mysql password, and then use that in the bash script as a variable?
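
    For reference, MySQL's closest analogue to ~/.pgpass is a per-user option file: client programs such as mysqldump read ~/.my.cnf, so a root-owned, chmod 600 file avoids putting the password on the command line. A minimal sketch (the backup path is illustrative):

        # /root/.my.cnf  (chown root:root, chmod 600)
        [client]
        user     = root
        password = s3cret

        # the backup script can then call mysqldump with no password argument:
        mysqldump --all-databases > /root/backups/mysql-$(date +%F).sql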

    Read the article

  • mdadm cron job sends email that cron has run

    - by Andrew
    I've got an Ubuntu 8.04 server using mdadm to create several RAID1 arrays. I created /etc/cron.hourly/mdadm as follows: #! /bin/sh set -e mdadm --monitor /dev/md0 /dev/md3 /dev/md4 --oneshot (Yes, the array numbers are not sequential, and I'm not using --scan because I have a degraded array that may or may not have been used as swap and that I can't delete, but I think that's a separate issue. If it's the underlying cause of this, I need to fix it.) mdadm sends me email (configured in /etc/mdadm/mdadm.conf) on DegradedArray etc. events. This is the desired behaviour. What is not desired, and what I can't work out, is why cron is sending me (relatively pointless) emails, via an alias in /etc/aliases: From: root@<hostname> (Cron Daemon) To: root@<hostname> Subject: Cron <root@<hostname>> cd / && run-parts --report /etc/cron.hourly Content-Type: text/plain; charset=ANSI_X3.4-1968 X-Cron-Env: <SHELL=/bin/sh> X-Cron-Env: <PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin> X-Cron-Env: <HOME=/root> X-Cron-Env: <LOGNAME=root> Message-Id: <id@hostname> Date: Fri, 7 May 2010 13:17:01 +0930 (CST) /etc/cron.hourly/mdadm: mdadm: Monitor using email address "<root_alias@domain>" from config file I've got a dozen other servers behaving correctly (mdadm sends email, cron doesn't) with identical /etc/crontab files: # /etc/crontab: system-wide crontab # <snip comments> SHELL=/bin/sh PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin # m h dom mon dow user command 17 * * * * root cd / && run-parts --report /etc/cron.hourly <snip anacron jobs> Should I simply remove the --report, or is there something else in my cron config somewhere causing this?
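
    For context on the mechanism: run-parts --report mails whatever a job writes to its output, and mdadm --monitor prints that informational "Monitor using email address ..." line when it starts, so the cron mail is about the script's chatter rather than the array state. A hedged sketch of the script with that output discarded (mdadm's own DegradedArray mail via MAILADDR is unaffected); if the line arrives on stderr instead, 2>&1 would be needed:

        #! /bin/sh
        set -e
        # discard mdadm's informational stdout so run-parts --report has nothing to mail
        mdadm --monitor /dev/md0 /dev/md3 /dev/md4 --oneshot > /dev/null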

    Read the article

  • sqlldr job not running through Autosys

    - by Frank
    I have a shell script that, when run manually or via cron, executes fine and successfully loads a delimited file into the database using sqlldr. However, when run via Autosys the script executes and sqlldr reports success, but the data is never actually loaded into the database. Has anyone ever experienced this with the sqlldr/Autosys combination, and if so, do you know of a workaround or fix?

    Read the article

  • "pull" process/job into the background

    - by Mustafa Ismail Mustafa
    I know you can start a command in the background with &, or move a running one to the background by pressing Ctrl-Z and then running bg, and I also know of nohup. But say you started a process that turned out to take much longer than expected: is there a way of pulling, so to speak, this process into the background from another terminal, so that the process would continue even if I log off from the server?
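
    For reference, bash's built-in job control covers the log-off part of this, though it has to be run from the shell that owns the process rather than from a second terminal; a minimal sketch:

        # in the shell where the long-running command is in the foreground:
        # press Ctrl-Z to stop it, then
        bg %1           # resume job 1 in the background
        disown -h %1    # mark it so the shell's SIGHUP at logout won't kill it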

    Read the article

  • What would you think of a job based on mostly doing the proof of concept?

    - by davsan
    I'm working as a developer in a small software company whose main business is interfacing between separate applications - for example between a telephony system and an environment control system, or between IP TVs and hospitality systems. It seems I am the candidate for a new job title in the company, as the person who does the proof of concept for new interfacing projects and does some R&D for prototyping. What do you think the pros and cons of such a job would be, considering mainly the individual progress (or regress) of a person as a software engineer? And what aspects would you consider essential in a person to put him/her in such a position?

    Read the article

  • BITS http download job fails to connect for owner Local SYSTEM account

    - by MikeT
    A service I have written that uses BITS (Background Intelligent Transfer Service) to auto-update itself is having a problem on some machines (Windows 7 so far). I have been investigating and have discovered that some of the jobs my service adds to the BITS queue fail immediately with the error code 0x80072efd (a connection with this server could not be established). There is no problem connecting to the server for the download: it works fine on the same machine using IE (or any other web browser), and other clients can connect and update from the same server. I tried using the BITSADMIN.exe tool to add the jobs manually and they worked OK. I then changed the account my service was running under to the Network Service account so the BITS jobs would be created with a different owner, and the jobs completed successfully. The thing is, I don't want to run my service as this account because it won't have the required local permissions. So how do I change the permissions of the Local System user to allow it to download from the HTTP source? I'm not aware of any way this could be restricted for that account, but it obviously is.

    Read the article

  • What do you need to know to get a job as a web developer [closed]

    - by Alex Foster
    What do you need to know to at least get your foot in the door? We're assuming someone who doesn't have a college degree (yet) but will eventually get one. My guess is HTML, CSS, JavaScript, and PHP, plus Photoshop, Dreamweaver, and SQL, and being familiar with using a web host to put sites live, like knowing how to use cPanel. It's probably a very inaccurate and narrow guess, but that's what I think right now. I don't know exactly.

    Read the article

  • In cPanel, cron job is not executing and not sending any mail

    - by Abhishek
    Though many of us have asked questions related to cron jobs, let me ask mine... I want to execute a PHP script periodically. As the cron command I'm using: php -q http://www.example.com/cron.php?action=getA I also tried this one: php -q /home/myuser/www.example.com/cron.php?action=getA It is not getting executed and not sending any mail. I set the mail ID to my Gmail address. What am I doing wrong?
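
    For context, the PHP CLI binary expects a filesystem path and cannot take a ?action=... query string; when run from cron, arguments reach the script through $argv (or the URL can be fetched with wget/curl instead). A sketch under that assumption, reusing the path from the question with an invented schedule and address:

        # crontab entry (every 15 minutes is just an example)
        */15 * * * * /usr/bin/php -q /home/myuser/www.example.com/cron.php getA

        <?php
        // cron.php - read the action from the command line instead of $_GET
        $action = isset($argv[1]) ? $argv[1] : null;
        if ($action === 'getA') {
            // ... do the periodic work ...
            mail('you@example.com', 'cron ran', 'getA finished at ' . date('c'));
        }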

    Read the article

  • Cron Job that Boots Screen at Start

    - by Pez Cuckow
    I am trying to set up a number of processes that start during boot (game servers) with the command below as the cron item: @reboot /usr/bin/screen -fa -d -m -S NAME COMMAND However, if the server crashes for whatever reason, screen closes and the server doesn't get a chance to run its auto restart (as far as I understand, screen sees no processes in the socket and so closes). Is there a way I can get around this so screen will sit there even if nothing is running in it?
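
    One common pattern for what's described is to have screen run a small restart loop instead of the server binary directly, so the screen window stays open and relaunches the process if it exits; a sketch with placeholder names:

        # crontab:
        @reboot /usr/bin/screen -fa -d -m -S NAME /usr/local/bin/keepalive.sh

        # /usr/local/bin/keepalive.sh
        #!/bin/sh
        while true; do
            COMMAND                                    # the actual game server command
            echo "server exited; restarting in 5s" >&2
            sleep 5
        done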

    Read the article

  • Unix printing a banner page on every print job

    - by yum_tacos4u
    I have a Data General Unix server that is printing a banner page on every print job. I originally thought the banner page was coming from the printer. As this is an HP printer, I used telnet to get to JetAdmin and then proceeded to disable the banner page, but this did not solve the issue. I then went into the sysadm program to see if TCP/IP printing was set to print a banner page on print jobs, but I did not see any options for it. Any help or ideas on how to disable the banner page from printing in Unix? (An example banner page print was attached here.)

    Read the article

  • Will taking a job that's web and database related limit my software development opportunities later on?

    - by someone
    I love programming, particularly OOP. My school experience was mostly in Java/OOP, and I had a job for a limited time in Java, Python, and other OOP kind of languages. However, a move necessitated a change in jobs, and what I've ended up with now is a web-development and database intensive job. I may possibly hold this job for several years. My question is, will this limit my choices later on? Will I be able to find another Java / software-development kind of job, or will I be rejected because my experience will be mostly in a different area?

    Read the article

  • Searching a set of data with multiple terms using Linq

    - by Cj Anderson
    I'm in the process of moving from ADO.NET to Linq. The application is a directory search program to look people up. The users are allowed to type the search criteria into a single textbox. They can separate each term with a space, or wrap a phrase in quotes such as "park place" to indicate that it is one term. Behind the scenes the data comes from a XML file that has about 90,000 records in it and is about 65 megs. I load the data into a DataTable and then use the .Select method with a SQL query to perform the searches. The query I pass is built from the search terms the user passed. I split the string from the textbox into an array using a regular expression that will split everything into a separate element that has a space in it. However if there are quotes around a phrase, that becomes it's own element in the array. I then end up with a single dimension array with x number of elements, which I iterate over to build a long query. I then build the search expression below: query = query & _ "((userid LIKE '" & tempstr & "%') OR " & _ "(nickname LIKE '" & tempstr & "%') OR " & _ "(lastname LIKE '" & tempstr & "%') OR " & _ "(firstname LIKE '" & tempstr & "%') OR " & _ "(department LIKE '" & tempstr & "%') OR " & _ "(telephoneNumber LIKE '" & tempstr & "%') OR " & _ "(email LIKE '" & tempstr & "%') OR " & _ "(Office LIKE '" & tempstr & "%'))" Each term will have a set of the above query. If there is more than one term, I put an AND in between, and build another query like above with the next term. I'm not sure how to do this in Linq. So far, I've got the XML file loading correctly. I'm able to search it with specific criteria, but I'm not sure how to best implement the search over multiple terms. 'this works but far too simple to get the job done Dim results = From c In m_DataSet...<Users> _ Where c.<userid>.Value = "XXXX" _ Select c The above code also doesn't use the LIKE operator either. So partial matches don't work. It looks like what I'd want to use is the .Startswith but that appears to be only in Linq2SQL. Any guidance would be appreciated. I'm new to Linq, so I might be missing a simple way to do this. The XML file looks like so: <?xml version="1.0" standalone="yes"?> <theusers> <Users> <userid>person1</userid> <nickname></nickname> <lastname></lastname> <firstname></firstname> <department></department> <telephoneNumber></telephoneNumber> <email></email> </Users> <Users> <userid>person2</userid> <nickname></nickname> <lastname></lastname> <firstname></firstname> <department></department> <telephoneNumber></telephoneNumber> <email></email> </Users>
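
    For what it's worth, String.StartsWith is plain .NET and works fine in LINQ to Objects/LINQ to XML, not only in LINQ to SQL; the AND-across-terms, OR-across-fields shape can be expressed with All and Any. A sketch of the idea in C# (the same methods are callable from VB; the field names are taken from the query above, and the sample terms are made up):

        using System;
        using System.Linq;
        using System.Xml.Linq;

        // terms = parsed search input; one entry per word or quoted phrase
        string[] terms  = { "park place", "smith" };
        string[] fields = { "userid", "nickname", "lastname", "firstname",
                            "department", "telephoneNumber", "email", "Office" };

        XDocument doc = XDocument.Load("users.xml");   // the 90,000-record file

        // a user matches when every term prefix-matches at least one field
        var results = doc.Descendants("Users")
            .Where(u => terms.All(term =>
                fields.Any(f =>
                    ((string)u.Element(f) ?? string.Empty)
                        .StartsWith(term, StringComparison.OrdinalIgnoreCase))));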

    Read the article

  • Lucene.Net memory consumption and slow search when too many clauses used

    - by Umer
    I have a DB holding text file attributes and text file primary key IDs, and I have indexed around 1 million text files along with their IDs (primary keys in the DB). Now, I am searching at two levels. First is a straightforward DB search, where I get primary keys as the result (roughly 2 or 3 million IDs). Then I make a Boolean query, for instance as follows: +Text:"test*" +(pkID:1 pkID:4 pkID:100 pkID:115 pkID:1041 .... ) and search it in my index file. The problem is that such a query (having 2 million clauses) takes too much time to return results and consumes really too much memory. Is there any optimization solution for this problem?

    Read the article

  • How to schedule a dynamic function with a cron job?

    - by iBrazilian2
    I want to know how I can schedule a dynamic(auto populated data) function to auto run everyday at saved time? Let's say I have a form that once the button is clicked it sends the data to the function, which the posts the data. I simply want to automate that so that I don't have to press the button. <ul> <?php foreach($Class->retrieveData as $data) { <form method="post" action=""> <li> <input type="hidden" name="name">'.$data['name'].'<br/> <input type="hidden" name="description">'.$data['description'].'<br/> <input type="submit" name="post_data" value="Post"> </li> </form> } ?> </ul> Now, the form will pass the data to the function. if(isset($_POST['post_data'])) // if post_data button is clicked then it runs myFunction() { myFunction(); } myFunction() { $name = $_POST['name']; $description = $_POST['description']; } I tried doing the following but the problem is that Cron Job can only run the whole .php file, and I am retrieving the saved time to run from MySQL. foreach($Class->getTime() as $timeData) { $timeHour = $timeData['timeHour']; $timeMinute = $timeData['timeMinute']; $hourMin = date('H:i'); $timeData = ''.$timeHour.':'.$timeMinute.''; if($hourMin == $timeData) { run myFunction. } } $hourMin is the current hour:minute which is being matched against a saved time to auto run from Mysql. So if $hourMin == $timeData then the function will run. How can I run Cron Job to auto run myFunction() if the $hourMin equals $timeData? So... List 1 = is to be runned at 10am List 2 = is to be runned at 12pm List 3 = is to be runned at 2pm The 10am, 12pm, 2pm is the $timeHour and $timeMinute that is retrieved from MySQL but based on each list id's. EDIT @randomSeed, 1) I can schedule cron jobs. 2) $name and $description will all be arrays, so the following is what I am trying to accomplish. $name = array( 'Jon', 'Steven', 'Carter' ); $description = array( 'Jon is a great person.', 'Steven has an outgoing character.', 'Carter is a horrible person.' ); I want to parse the first arrays from both $name and $description if the scheduled time is correct. In database I have the following postDataTime table +----+---------+----------+------------+--------+ | iD | timeDay | timeHour | timeMinute | postiD | +--------------------------------------+--------+ | 1 | * | 9 | 0 | 21 | |----|---------|----------|------------|--------| | 2 | * | 10 | 30 | 22 | |----|---------|----------|------------|--------| | 3 | * | 11 | 0 | 23 | +----|---------+----------+------------+--------+ iD = auto incremented on upload. timeDay = * is everyday (cron job style) timeHour = Hour of the day to run the script timeMinute = minute of the hour to run script postiD = this is the id of the post that is located in another table (n+1 relationship) If it's difficult to understand.. if(time() == 10:30(time from MySQL postiD = 22)) { // run myFunction with the data that is retrieved for that time ex: $postiD = '22'; $name = 'Steven'; $description = 'Steven has an outgoing character.'; // the above is what will be in the $_POST from the form and will be // sent to the myFunction() } I simply want to schedule everything according to the time that is saved to MySQL as I showed at the very top(postDataTime table). (I'd show what I have tried, but I have searched for countless hours for an example of what I am trying to accomplish but I cannot find anything and what I tried doesn't work.). I thought I could use the exec() function but from what it seems that does not allow me to run functions, otherwise I would do the following.. 
$time = '10:30'; if($time == time()) { exec(myFunction()); }
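
    One way to read this setup: let cron run a single dispatcher script every minute, and have that script query postDataTime for rows whose hour and minute match the current time, then call myFunction() with the matching post's data, so nothing has to exec() a PHP function by name. A sketch under that assumption (the posts table join and the connection details are invented):

        # crontab: run the dispatcher every minute
        * * * * * /usr/bin/php /home/user/cron.php

        <?php
        // cron.php - hypothetical dispatcher
        $db = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

        $sql = "SELECT p.name, p.description
                  FROM postDataTime t
                  JOIN posts p ON p.id = t.postiD
                 WHERE t.timeHour = :h AND t.timeMinute = :m";
        $stmt = $db->prepare($sql);
        $stmt->execute(array(':h' => (int)date('G'), ':m' => (int)date('i')));

        foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
            myFunction($row['name'], $row['description']);   // no $_POST needed
        }

        function myFunction($name, $description) {
            // post the data exactly as the button-driven version did
        }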

    Read the article

  • Are there any Rails Plugins that use JS to poll the server for completion of a background process?

    - by btelles
    I know it might be simple to write some polling JS that asks the Rails application if it has completed a background job. But I wonder if anyone has already packaged up this functionality as a plugin. Basically, I have a few delayed_jobs running, and am looking for a plugin that will poll the server for completion of a particular job, then activate a callback when it finds that the job is complete. Any ideas anyone? Berns
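
    For reference, the hand-rolled version of the polling piece is only a few lines of jQuery; the /jobs/:id/status route and the { complete: true } JSON shape below are assumptions about the Rails side, not an existing plugin API:

        // poll every 5 seconds until the delayed_job is reported complete
        function pollJob(jobId, onDone) {
          var timer = setInterval(function () {
            $.getJSON('/jobs/' + jobId + '/status', function (data) {
              if (data.complete) {
                clearInterval(timer);
                onDone(data);
              }
            });
          }, 5000);
        }

        pollJob(42, function (data) { alert('Job finished!'); });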

    Read the article

  • Writing an efficient cron job script utilizing Zend_Mail_Storage_Imap.

    - by fireeyedboy
    I'm new to the IMAP protocol and Zend_Mail_Storage, and I'm writing a small PHP script for a cron job that should regularly poll an IMAP account, check for new messages, and send an e-mail if new messages have arrived. As you can imagine, I only want to poll the IMAP account for relevant messages, and I only want to send a new e-mail if new messages have arrived since the last polled message. So I thought of keeping track of the last message I polled with some unique identifier for a message. But I'm a bit uncertain about whether the methods I want to utilize for this do what I expect them to do. So my questions are: Does the iterator position of Zend_Mail_Storage_Imap actually correspond to some IMAP unique identifier for messages, or is it simply an internal position of Zend_Mail_Storage_Abstract? For instance, if I tell it to seek() to message 5 (which I stored from an earlier session), will it indeed seek to the appropriate message on the IMAP server, even if, for instance, messages have been deleted since the last session? Would keeping track of this latest polled message id in a file suffice for a cron job that, say, polls the account every 5 or 10 minutes? Or is this too naive, and should I be using a database, for instance? Or is there maybe a much easier way to keep track of such state with Zend_Mail_Storage_Abstract? Also, do I need to poll every IMAP folder, or is everything accumulated when I poll INBOX? If you could shed some light on any of these matters, I'd appreciate it. Thanks in advance.
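
    For what it's worth, Zend_Mail_Storage exposes getUniqueId(), which returns the stable IMAP unique id rather than the message number, so remembering the last seen unique id in a small state file is a common approach for a 5-10 minute cron. A sketch along those lines (the connection details and state-file path are made up):

        <?php
        $mail = new Zend_Mail_Storage_Imap(array(
            'host' => 'imap.example.com', 'user' => 'test', 'password' => 'secret'));

        $stateFile = '/var/run/poll-imap.last';
        $lastSeen  = is_file($stateFile) ? trim(file_get_contents($stateFile)) : null;

        $new = array();
        for ($i = $mail->countMessages(); $i >= 1; $i--) {
            $uid = $mail->getUniqueId($i);      // stable even if other messages are deleted
            if ($uid === $lastSeen) {
                break;                          // everything older was handled on a previous run
            }
            $new[] = $mail->getMessage($i);     // collect for the notification e-mail
        }
        if ($mail->countMessages() > 0) {
            file_put_contents($stateFile, $mail->getUniqueId($mail->countMessages()));
        }
        // ... send one e-mail summarising $new if it is not empty ...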

    Read the article

  • How to maintain a job history using Quartz scheduler

    - by rwwilden
    I'd like to maintain a history of jobs that were scheduled by a Quartz scheduler containing the following properties: 'start time', 'end time', 'success', 'error'. There are two interfaces available for this: ITriggerListener and IJobListener (I'm using the C# naming convention for interfaces because I'm using Quartz.NET but the same question could be asked for the Java version). IJobListener has a JobToBeExecuted and a JobWasExecuted method. The latter provides a JobExecutionException so that you know when something went wrong. However, there is no way to correlate JobToBeExecuted and JobWasExecuted. Suppose my job runs for ten minutes. I start it at t0 and t0+2 (so they overlap). I get two calls to JobToBeExecuted and insert two start times into my history table. When both jobs finish at t1 and t1+2 I get two calls to JobWasExecuted. How do I know what database record to update in each call (to store an end time with its corresponding start time)? ITriggerListener has another problem. There is no way to get any errors inside the TriggerComplete method when a job failed. How do I get the desired behavior?
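
    For what it's worth, both IJobListener callbacks receive the same per-execution JobExecutionContext, so a row id created in JobToBeExecuted can be stashed on the context and read back in JobWasExecuted. A sketch against Quartz.NET 1.x-style signatures; the HistoryStore class is hypothetical:

        using System;
        using Quartz;

        public class HistoryJobListener : IJobListener
        {
            public string Name { get { return "historyListener"; } }

            public void JobToBeExecuted(JobExecutionContext context)
            {
                // insert a row with the start time; remember its key on this execution's context
                long id = HistoryStore.InsertStart(context.JobDetail.FullName, DateTime.UtcNow);
                context.Put("historyId", id);
            }

            public void JobWasExecuted(JobExecutionContext context, JobExecutionException ex)
            {
                // same context instance as above, so the key comes back out
                long id = (long)context.Get("historyId");
                HistoryStore.Finish(id, DateTime.UtcNow, ex == null, ex == null ? null : ex.Message);
            }

            public void JobExecutionVetoed(JobExecutionContext context) { }
        }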

    Read the article

  • Kick off a MapReduce job from my Java/MySQL webapp

    - by Brian
    Hi guys, I need a bit of architecture advice. I have a Java-based webapp, with a JPA-based ORM backed onto a MySQL relational database. Now, as part of the application I have a batch job that compares thousands of database records with each other. This job has become too time-consuming and needs to be parallelized, and I'm looking at using MapReduce and Hadoop to do this. However, I'm not too sure how to integrate this into my current architecture. I think the easiest initial solution is to find a way to push data from MySQL into Hadoop jobs. I have done some initial research and found the following relevant information and possibilities: 1) https://issues.apache.org/jira/browse/HADOOP-2536 gives an interesting overview of some built-in JDBC support; 2) this article http://architects.dzone.com/articles/tools-moving-sql-database describes some third-party tools to move data from MySQL to Hadoop. To be honest, I'm just starting out learning about HBase and Hadoop, and I really don't know how to integrate this into my webapp. Any advice is greatly appreciated. Cheers, Brian
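
    For reference, the JDBC support mentioned in HADOOP-2536 surfaces as DBInputFormat/DBConfiguration in the old org.apache.hadoop.mapred API, which lets a job read its map inputs straight out of MySQL. A configuration-only sketch (the table, columns, and the UserRecord DBWritable class are made up for illustration; mapper/reducer setup is omitted):

        import org.apache.hadoop.mapred.JobClient;
        import org.apache.hadoop.mapred.JobConf;
        import org.apache.hadoop.mapred.lib.db.DBConfiguration;
        import org.apache.hadoop.mapred.lib.db.DBInputFormat;

        public class RecordCompareJob {
            public static void main(String[] args) throws Exception {
                JobConf conf = new JobConf(RecordCompareJob.class);
                conf.setInputFormat(DBInputFormat.class);

                DBConfiguration.configureDB(conf,
                    "com.mysql.jdbc.Driver",
                    "jdbc:mysql://dbhost/mydb", "hadoop", "secret");

                // UserRecord must implement Writable and DBWritable
                // (readFields(ResultSet) / write(PreparedStatement))
                DBInputFormat.setInput(conf, UserRecord.class,
                    "users",                 // table
                    null,                    // conditions
                    "id",                    // order by
                    "id", "name", "email");  // columns

                // conf.setMapperClass(...); conf.setReducerClass(...); etc.
                JobClient.runJob(conf);
            }
        }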

    Read the article
