Search Results

Search found 720 results on 29 pages for 'aaron b russell'.

Page 11/29 | < Previous Page | 7 8 9 10 11 12 13 14 15 16 17 18  | Next Page >

  • Add all lines multiplied by another line in another table

    - by russell
    Hi, I hope I can explain this well enough. I have 3 tables: wo_parts, workorders and part2vendor. I am trying to get the cost price of all parts sold in a month. I have this script: $scoreCostQuery = "SELECT SUM(part2vendor.cost*wo_parts.qty) as total_score FROM part2vendor INNER JOIN wo_parts ON (wo_parts.pn=part2vendor.pn) WHERE workorder=$workorder"; Each part is in wo_parts (under part number [pn]) and the cost of that item is in part2vendor (also under part number [pn]). I need each part price in part2vendor to be multiplied by the quantity sold in wo_parts. The way all 3 tables tie up is workorders.ident=wo_parts.workorder and part2vendor.pn=wo_parts.pn. I hope someone can assist. The above script does not give me the same total as when I add the figures up by calculator. A sketch of one possible query layout follows this entry.

    Read the article
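
    For reference, a hedged sketch of the query described above, written only from the stated relationships (workorders.ident = wo_parts.workorder and part2vendor.pn = wo_parts.pn). Note that if part2vendor holds more than one row per pn, each extra vendor row multiplies into the sum, which is a common reason a joined total drifts away from a hand-added one:

      -- Sketch only: cost * qty summed for a single workorder through the stated keys.
      SELECT SUM(p2v.cost * wp.qty) AS total_score
      FROM workorders w
      INNER JOIN wo_parts    wp  ON wp.workorder = w.ident
      INNER JOIN part2vendor p2v ON p2v.pn       = wp.pn
      WHERE w.ident = 123;   -- placeholder workorder id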

  • Why does Net::Twitter complain "HTTP::Message content not bytes"?

    - by Russell C.
    We have been using Perl's Net::Twitter CPAN module (version 3.12) and basic authentication (not OAuth) for almost a year now to syndicate updates from our site to our Twitter account. We just migrated to a new server last week, and since the move our updates to Twitter have stopped and the following error is reported whenever we try to post an update: HTTP::Message content not bytes at /usr/lib/perl5/site_perl/5.8.8/HTTP/Request/Common.pm line 90 Here is the code we're using to update our Twitter account: use Net::Twitter; my $twitter = Net::Twitter->new( traits => [qw/API::REST/], username => $username, password => $password, source => 'twitterfeed' ); my $result = $twitter->update($status); I have no idea what the issue is and was hoping that someone else has run into it and can provide a quick solution. Thanks in advance for your help! A commonly suggested workaround is sketched after this entry.

    Read the article
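
    The "content not bytes" message from HTTP::Request::Common usually means the string handed to it contains wide (decoded Unicode) characters. Assuming that is what changed with the server move, one commonly suggested workaround is to encode the status to UTF-8 bytes before passing it to Net::Twitter. A sketch, not a confirmed fix for this particular setup:

      use Encode qw(encode);
      use Net::Twitter;

      my $twitter = Net::Twitter->new(
          traits   => [qw/API::REST/],
          username => $username,
          password => $password,
          source   => 'twitterfeed',
      );

      # Pass raw UTF-8 bytes so HTTP::Request::Common does not see wide characters.
      my $result = $twitter->update( encode('UTF-8', $status) );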

  • Adding Postgres table cells based on same value

    - by russell kinch
    I have a table called expenses. There are numerous columns, but the ones involved in my PHP page are date, supplierinv and amount. I have created a page that lists all the expenses in a given month and totals them at the end. However, each row has a value, and many rows might be on the same supplier invoice. This means adding up every row with the same supplierinv to get a total that matches my bank statement. Is there any way I can get a total for the rows based on the supplierinv? I mean, say I have 10 rows: 5 on supplierinv 4, two on supplierinv 5 and 3 on supplierinv 12. How can I get 3 figures (inv 4, 5 and 12) and the grand total at the bottom? Many thanks. One grouping approach is sketched after this entry.

    Read the article
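
    A minimal sketch of the grouping described above, assuming the columns really are named date, supplierinv and amount (the month boundaries are placeholders). The ROLLUP line adds the grand-total row but needs PostgreSQL 9.5 or later; on older versions, drop it and keep computing the grand total in PHP as the page already does:

      -- Sketch: one subtotal per supplier invoice for the chosen month.
      SELECT supplierinv, SUM(amount) AS invoice_total
      FROM expenses
      WHERE date >= DATE '2010-04-01'
        AND date <  DATE '2010-05-01'
      GROUP BY ROLLUP (supplierinv)   -- the row with NULL supplierinv is the grand total
      ORDER BY supplierinv;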

  • Why does my regex fail when the number ends in 0?

    - by Russell C.
    This is a really basic regex question, but since I can't seem to figure out why the match is failing in certain circumstances I figured I'd post it to see if anyone else can point out what I'm missing. I'm trying to pull out the 2 sets of digits from strings of the form: 12309123098_102938120938120938 1321312_103810312032123 123123123_10983094854905490 38293827_1293120938129308 I'm using the following code to process each string: if($string && $string =~ /^(\d)+_(\d)+$/) { if(IsInteger($1) && IsInteger($2)) { print "success ('$1','$2')"; } else { print "fail"; } } Where the IsInteger() function is as follows: sub IsInteger { my $integer = shift; if($integer && $integer =~ /^\d+$/) { return 1; } return; } This function seems to work most of the time but fails on the following for some reason: 1287123437_1268098784380 1287123437_1267589971660 Any ideas on why these fail while others succeed? Thanks in advance for your help! A sketch of the usual explanation follows this entry.

    Read the article
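
    The usual explanation, assuming the intent was to capture the full digit runs: a repeated capturing group like (\d)+ keeps only the last repetition, so $1 and $2 end up holding a single digit each, and when that digit is "0" the truthiness test ($integer &&) inside IsInteger treats it as false, hence the failures only on numbers ending in 0. A hedged sketch of both fixes:

      # Capture whole runs with (\d+), and test definedness/length rather than truth,
      # so a legitimate value of "0" still counts as an integer.
      if ( defined $string && $string =~ /^(\d+)_(\d+)$/ ) {
          print "success ('$1','$2')";
      } else {
          print "fail";
      }

      sub IsInteger {
          my $integer = shift;
          return ( defined $integer && length $integer && $integer =~ /^\d+$/ ) ? 1 : undef;
      }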

  • What to return when making an Ajax request

    - by Russell
    When we return data from an Ajax call, is it better to return a document containing HTML to display on the page, or to return XML/JSON data which can then be processed? I know different circumstances may determine what 'better' means, but I would like to know which is more appropriate in which circumstances. I am working on the framework for a large ASP.NET application, using jQuery Ajax (forms plugin). My initial thought was to return the data as XML and then process it accordingly, but this increases the processing required in JavaScript to populate the page. I am trying to balance flexibility, clarity and simplicity. Thanks in advance for your knowledge and information. A small sketch of the JSON approach is included after this entry.

    Read the article
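
    For comparison, a minimal sketch of the JSON route with jQuery (the /Reports/List URL and the Name property are placeholders, not part of the original question). Returning an HTML fragment instead would shrink the success callback to a single .html() call, at the cost of tying the server to the page's markup:

      // Sketch: fetch JSON and build the markup on the client.
      $.ajax({
          url: '/Reports/List',          // placeholder endpoint
          dataType: 'json',
          success: function (items) {
              var $list = $('#reportList').empty();   // placeholder container
              $.each(items, function (i, item) {
                  $list.append($('<li/>').text(item.Name));
              });
          }
      });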

  • Undesired Output of Crontab Job Using CURL

    - by Russell C.
    I have written a Perl script that runs as a daily crontab job and uploads files to Amazon S3 via curl. I want the output of the cron job emailed to me, which works fine, but I don't want that email to include messages related to the curl upload (only the messages my script is outputting). Here are the curl-related messages I'm seeing in the daily email right now: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 230M 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 230M 0 0 0 544k 0 1519k 0:02:35 --:--:-- 0:02:35 1807k 0 230M 0 0 0 1744k 0 1286k 0:03:03 0:00:01 0:03:02 1342k 1 230M 0 0 1 2880k 0 1219k 0:03:13 0:00:02 0:03:11 1250k 1 230M 0 0 1 4016k 0 1198k 0:03:17 0:00:03 0:03:14 1218k 2 230M 0 0 2 5168k 0 1186k 0:03:19 0:00:04 0:03:15 1202k 2 230M 0 0 2 6336k 0 1181k 0:03:19 0:00:05 0:03:14 1157k 3 230M 0 0 3 7488k 0 1177k 0:03:20 0:00:06 0:03:14 1147k 3 230M 0 0 3 8592k 0 1167k 0:03:22 0:00:07 0:03:15 1142k 4 230M 0 0 4 9744k 0 1166k 0:03:22 0:00:08 0:03:14 1145k 4 230M 0 0 4 10.6M 0 1163k 0:03:23 0:00:09 0:03:14 1142k 5 230M 0 0 5 11.7M 0 1161k 0:03:23 0:00:10 0:03:13 1140k 5 230M 0 0 5 12.8M 0 1158k 0:03:23 0:00:11 0:03:12 1133k 6 230M 0 0 6 13.9M 0 1155k 0:03:24 0:00:12 0:03:12 1138k 6 230M 0 0 6 15.0M 0 1155k 0:03:24 0:00:13 0:03:11 1138k 7 230M 0 0 7 16.1M 0 1152k 0:03:25 0:00:14 0:03:11 1131k 7 230M 0 0 7 17.2M 0 1152k 0:03:25 0:00:15 0:03:10 1132k 7 230M 0 0 7 18.4M 0 1152k 0:03:24 0:00:16 0:03:08 1140k I am using a simple Perl system() call to invoke curl. Does anyone know what command-line argument I can pass to curl to turn off the reporting of the upload progress? Thanks in advance for your help! The relevant flags are sketched after this entry.

    Read the article
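
    Assuming the noise in the mail is curl's own progress meter, its -s/--silent option suppresses it, and adding -S/--show-error keeps genuine errors visible. A sketch of the system() call, where the file and URL variables are placeholders for whatever the real script passes:

      # -s silences the progress meter, -S still prints real errors to STDERR.
      my $exit = system('curl', '-s', '-S', '-T', $local_file, $s3_url);
      die 'curl failed with exit code ' . ($exit >> 8) . "\n" if $exit != 0;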

  • Inserting an element into a sorted list

    - by Russell Cargill
    OK, I'm using getSharedPreferences to store my high scores, but before I fill it up I want to sort the scores into order via an array. The problem is that if the loop finds a score less than mScore in the first position, it overwrites it and won't check the rest of the array for the right place: //function to add score to array and sort it public void addscoretoarray(int mScore){ for(int pos = 0; pos< score.length; pos++){ if(score[pos] > mScore){ //do nothing }else { //Add the score into that position score[pos] = mScore; break; } } sortArray(score); } Should I call sortArray() before and after the loop to fix this problem, or is there a better method to achieve the same result? I should also mention that the sortArray(score) function just calls Arrays.sort(score), where score is the int array holding the scores. A shift-based insert is sketched after this entry.

    Read the article
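
    A sketch of the shift-based insert the loop above seems to be aiming for, assuming score is kept highest-first (as the existing comparison implies); inserting this way keeps the array ordered, so no follow-up Arrays.sort() call is needed:

      // Sketch: insert newScore into a descending-ordered high-score array,
      // shifting the tail right and dropping the lowest value off the end.
      public void addScore(int[] score, int newScore) {
          int pos = 0;
          while (pos < score.length && score[pos] > newScore) {
              pos++;                                   // find where the new score belongs
          }
          if (pos < score.length) {
              System.arraycopy(score, pos, score, pos + 1, score.length - pos - 1);
              score[pos] = newScore;
          }
      }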

  • Connecting form to database errors

    - by Russell Ehrnsberger
    Hello, I am trying to connect a page to a MySQL database for a newsletter signup. I have a database with 3 fields: id, name, email. The database is named newsletter and the table is named newsletter. Everything seems to be fine, but I am getting this error: Notice: Undefined index: Name in C:\wamp\www\insert.php on line 12 Notice: Undefined index: Name in C:\wamp\www\insert.php on line 13 Here is my form code. <form action="insert.php" method="post"> <input type="text" value="Name" name="Name" id="Name" class="txtfield" onblur="javascript:if(this.value==''){this.value=this.defaultValue;}" onfocus="javascript:if(this.value==this.defaultValue){this.value='';}" /> <input type="text" value="Enter Email Address" name="Email" id="Email" class="txtfield" onblur="javascript:if(this.value==''){this.value=this.defaultValue;}" onfocus="javascript:if(this.value==this.defaultValue){this.value='';}" /> <input type="submit" value="" class="button" /> </form> Here is my insert.php file. <?php $host="localhost"; // Host name $username="root"; // Mysql username $password=""; // Mysql password $db_name="newsletter"; // Database name $tbl_name="newsletter"; // Table name // Connect to server and select database. mysql_connect("$host", "$username", "$password")or die("cannot connect"); mysql_select_db("$db_name")or die("cannot select DB"); // Get values from form $name=$_POST['Name']; $email=$_POST['Email']; // Insert data into mysql $sql="INSERT INTO $tbl_name(name, email)VALUES('$name', '$email')"; $result=mysql_query($sql); // if successfully insert data into database, displays message "Successful". if($result){ echo "Successful"; echo "<BR>"; echo "<a href='index.html'>Back to main page</a>"; } else { echo "ERROR"; } ?> <?php // close connection mysql_close(); ?> A sketch of one way to avoid the notices follows this entry.

    Read the article
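
    The "Undefined index: Name" notices usually just mean insert.php ran without those POST fields, most often because the script was opened directly (a GET request) rather than reached through the form. A hedged sketch of a guard for insert.php that avoids the notices either way:

      // Sketch: only read the form fields when they were actually posted.
      if ($_SERVER['REQUEST_METHOD'] === 'POST') {
          $name  = isset($_POST['Name'])  ? $_POST['Name']  : '';
          $email = isset($_POST['Email']) ? $_POST['Email'] : '';
          // ... run the INSERT here, ideally via mysqli with a prepared statement,
          //     since the old mysql_* calls offer no binding and these are user inputs
      } else {
          // insert.php was requested directly, so $_POST is empty -- hence the notices.
      }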

  • Does Visual Studio 2010 Beta 2 Include a TFS server?

    - by Russell
    I recently downloaded Visual Studio 2010 Beta 2 and was told it included a TFS server. However, I am unsure whether it has been (or can be) installed, or how to start it up if it has. Can anyone shed any light on this for me please? Thanks :) Update: Thanks for your help :) I am downloading the ISO of the separate product from MSDN instead.

    Read the article

  • ASP.net attachment then refresh page

    - by Russell
    I am working on a C# project using ASP.NET. I have a list of reports with a hyperlink for each, which calls the web server, retrieves a PDF and then returns it for the user to save or open: ASPX page: <table> <tr> <td> <a href="#" onclick="SubmitFormToOpenReport();">Open Report 1</a> </td> </tr> ... </table> ASP.NET: context.Response.Clear(); context.Response.AddHeader("content-disposition", "attachment;filename=report.pdf"); context.Response.Charset = ""; context.Response.ContentType = "application/pdf"; context.Response.BinaryWrite(myReport); context.Response.Flush(); This works as expected; however, I would also like it to refresh the page with an updated list. I am having trouble because the single request/response is returning the report. Is there a way to refresh the page as well? While there may be one correct answer, feel free to include alternative solutions/ideas for doing this. A common workaround is sketched after this entry.

    Read the article
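
    A single response can carry either the attachment or a new page, not both, so the common workaround is to trigger the download out-of-band and refresh the list separately. A sketch using a hidden iframe (GetReport.ashx and RefreshReportList are placeholder names, not from the original page):

      // Sketch: stream the PDF through a hidden iframe, then refresh the report list.
      function SubmitFormToOpenReport(reportId) {
          $('<iframe/>', { src: 'GetReport.ashx?id=' + reportId })
              .css('display', 'none')
              .appendTo('body');                 // browser shows the save/open dialog
          setTimeout(RefreshReportList, 2000);   // crude; a cookie set by the handler
                                                 // is a more reliable completion signal
      }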

  • Improving MySQL Update Query Efficiency

    - by Russell C.
    In our database tables we keep a number of counting columns to help reduce the number of simple lookup queries. For example, in our users table we have columns for the number of reviews written, photos uploaded, friends, followers, etc. To help make sure these stay in sync, we have a script that runs periodically to check and update these counting columns. The problem is that now that our database has grown significantly, the queries we have been using take forever to run because they are totally inefficient. I would appreciate someone with more MySQL knowledge than myself recommending how we can improve their efficiency: update users set photos=(select count(*) from photos where photos.status="A" AND photos.user_id=users.id) where users.status="A"; If this were a select statement I would just use a join, but I'm not sure if that is possible with update. Thanks in advance for your help! A join-based rewrite is sketched after this entry.

    Read the article
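
    MySQL does allow joins in UPDATE statements. A sketch of the usual rewrite, which counts active photos once per user in a derived table instead of running the correlated subquery for every row (whether it is faster here depends on the indexes, so treat it as something to benchmark rather than a guaranteed win):

      -- Sketch: aggregate once, then join the counts into the UPDATE.
      UPDATE users u
      LEFT JOIN (
          SELECT user_id, COUNT(*) AS photo_count
          FROM photos
          WHERE status = 'A'
          GROUP BY user_id
      ) p ON p.user_id = u.id
      SET u.photos = COALESCE(p.photo_count, 0)
      WHERE u.status = 'A';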

  • Does the object remain fixed when scrolling the background in cocos2d?

    - by russell
    I have one question: when infinite background scrolling is done, does the object remain fixed (like the doodle in Doodle Jump or Papi in Papi Jump), or do these objects really move? Does only the background move, or do both the background and the object move? I have been searching for a solution for 4 or 5 days but can't find one, so please can someone help me? And if the object does not move, how do you create the illusion of the object moving?

    Read the article

  • Are there any free Xml Diff/Merge tools available?

    - by Russell
    I have several config files in my .NET applications in which I would like to merge the application settings elements, etc. I was about to begin doing it manually, as I usually do, but thought there must be an XML diff GUI tool available somewhere, one able to go down to the element level to compare and display the differences. However, Google gave no substantive free tool results and no hints of anything of value. Is such a tool available, genuinely useful, and free? Thanks in advance. :) Edit: Here is a bit of clarification of the functionality that would turn my error-prone, tedious manual job into a simple 1-minute task (with the potential to automate): In KDiff3, you can do a diff/merge of entire directories. There is a hierarchical diff which is very accurate, user-friendly and clear. I was interested in finding a similar solution, but based on an XML element hierarchy instead of a directory hierarchy. If there is no such open source software, I am considering creating one on CodePlex to provide this functionality.

    Read the article

  • Task Scheduler Cannot Apply My Changes - Adding a User with Permissions

    - by Aaron
    I can log in to the server using a domain account without administrator privileges and create a task in the Task Scheduler. I am allowed to do an initial save of the task but unable to modify it with the same user account. When changes are complete, a message box prompts for the user password (same domain user I logged in with), then fails with the following message. Task Scheduler cannot apply your changes. The user account is unknown, the password is incorrect, or the account does not have permission to modify the task. When I check Log on as Batch Job Properties (found this from the Help documentation): This policy is accessible by opening the Control Panel, Administrative Tools, and then Local Security Policy. In the Local Security Policy window, click Local Policy, User Rights Assignment, and then Logon as batch job. Everything is grayed out, so I can't add a user. How can I add a user?

    Read the article

  • What is a Valid Trust Anchor in Windows 7 relating to Wifi?

    - by Aaron
    The error below just started happening at work with a personal laptop running Windows 7 Ultimate. I'm unable to use installed, non-expired certificates to connect to a private wireless network. No recent changes were made by IT that would explain the issue. It worked fine several weeks ago and it happens on two laptops I own. The details and some screen shots are here: http://www.wiredprairie.us/blog/index.php/archives/906 The error we don't understand is this: The credentials provided by the server could not be validated. We recommend that you terminate the connection and contact your administrator with the information provided in the details. You may still connect but doing so exposes you to the security risk by a possible rogue server. The server XYZ presented a valid certificate issued by Company Name Certificate Authority but Company Name Certificate Authority is not configured as a valid trust anchor for this profile. We don't know how to resolve the issue without ignoring the error (nor what's changed that could explain this new error). Update: The new information is that we have our own Root CA, and that the certificates were not updated recently, nor have any expired.

    Read the article

  • How do you force Outlook 2007 to re-index its search on Windows XP SP 3?

    - by Aaron K
    So I have a Windows XP SP 3 machine which is running Outlook 2007. When I search in Outlook for an email that exists using a basic keyword, like say "MySQL", I get no results. However, Outlook gives me the following message: Search results may be incomplete because items are still being indexed. Click here for more details. When I click, I get the following: Outlook is currently indexing your items. Search results may be incomplete because items are still being indexed. 8783 items remaining in "Mailbox - USER" 8812 items remaining across all open mailboxes. The thing is, these are the numbers it has been reporting for several days, and Outlook is open for 8 hours a day. It does not seem like the index is working. As best I can tell, the index seemed to stop about 3 weeks ago. How can I force Outlook 2007 to re-index everything and start working properly again?

    Read the article

  • VLC (Server) re-stream Security Camera Feed

    - by Aaron
    I purchased a Swann Home Security DVR system and was hoping for some help on how to duplicate the streaming video on my server. In order to get their web view (streaming video in the browser) to work, I had to install the following plugins: HiDvrPlugin.dmg for Mac, Hidvrocx.cab for Windows. I was originally thinking it was a sign of some form of DRM? Maybe. Maybe not. HTML-wise, the following code is in the source of the Safari version of the web view: <embed pluginspage="SurveilClient.dmg" width="10px" height="10px" type="application/x-scplugin" id="MacDiv" style="height: 592px; width: 720px; left: 278px; top: 61px; "> It seems to be the main display area. Using Wireshark, I am able to see that the video stream is on port 9000. However, I have no idea what type of stream it is. I've tried opening it in VLC with no luck, using both http://dvr_ip:9000 and tcp://dvr_ip:9000. My hope was to do something like the following to redistribute the feed: vlc dvr_ip:9000 --sout h264-version-on-localhost:3000 TL;DR: Trying to re-distribute a stream from a security camera (can't tell the format) using VLC (re-distributed via H.264 / HTML5). Not sure how to accomplish this. Is it possible that the software has some type of DRM that only the plugins can decode? A correctly formed --sout chain is sketched after this entry.

    Read the article
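
    For what it's worth, the --sout value above is not valid VLC syntax. If VLC can ever open the TCP stream (which the proprietary plugin makes doubtful), a correctly formed chain looks roughly like this sketch, where the bitrates, port and path are all placeholders:

      vlc -I dummy tcp://dvr_ip:9000 \
        --sout '#transcode{vcodec=h264,vb=800,acodec=mpga,ab=128}:std{access=http,mux=ts,dst=:3000/stream}'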

  • Account Lockout with pam_tally2 in RHEL6

    - by Aaron Copley
    I am using pam_tally2 to lock out accounts after 3 failed logins per policy; however, the connecting user (via SSH) does not receive the error indicating pam_tally2's action. I expect to see on the 4th attempt: Account locked due to 3 failed logins No combination of required or requisite, or the order in the file, seems to help. This is under Red Hat 6, and I am using /etc/pam.d/password-auth. The lockout does work as expected, but the user does not receive the error described above. This causes a lot of confusion and frustration, as they have no way of knowing why authentication fails when they are sure they are using the correct password. The implementation follows NSA's Guide to the Secure Configuration of Red Hat Enterprise Linux 5 (pg. 45). It's my understanding that the only thing changed in PAM is that /etc/pam.d/sshd now includes /etc/pam.d/password-auth instead of system-auth. If locking out accounts after a number of incorrect login attempts is required by your security policy, implement use of pam_tally2.so. To enforce password lockout, add the following to /etc/pam.d/system-auth. First, add to the top of the auth lines: auth required pam_tally2.so deny=5 onerr=fail unlock_time=900 Second, add to the top of the account lines: account required pam_tally2.so EDIT: I get the error message by resetting pam_tally2 during one of the login attempts. user@localhost's password: (bad password) Permission denied, please try again. user@localhost's password: (bad password) Permission denied, please try again. (reset pam_tally2 from another shell) user@localhost's password: (good password) Account locked due to ... Account locked due to ... Last login: ... [user@localhost ~]$

    Read the article

  • Chicago SQL Saturday

    - by Johnm
    This past Saturday, April 17, 2010, I journeyed north to the great city of Chicago for some SQL Server fun, learning and fellowship. The Chicago edition of this grassroots phenomenon was the 31st scheduled SQL Saturday since the program's birth in late 2007. The Chicago SQL Saturday consisted of four tracks with eight sessions each and was a very energetic and fast-paced day for the 300 or so SQL Server enthusiasts in attendance. The speaker line-up included national notables such as Kevin Kline, Brent Ozar, and Brad McGehee. My hometown of Indianapolis was well represented in the speaker line-up with Arie Jones, Aaron King and Derek Comingore. The day began with a very humorous keynote by Kevin Kline and Brent Ozar, who emphasized the importance of community events such as SQL Saturday and the monthly user group meetings. They also brilliantly included the impact that getting involved in the SQL community through social media can have on your professional career. My approach to the day was to try to experience as much of the event as I could, so there were very few sessions that I attended for their full duration. I leaped from session to session like a bumblebee, gleaning bits of nectar from each session. Amid these leaps I took the opportunity to briefly chat with some of the in-the-queue speakers as well as other attendees who wandered the hallways. I especially enjoyed a great discussion with Devin Knight about his plans regarding the upcoming Jacksonville SQL Saturday, as well as an interesting SQL interpretation of the Iron Chef, which I think would catch on like wildfire. There were two sessions that stood out as exceptional. So much so that I could not pull myself away: Kevin Kline presented on "SQL Server Internals and Architecture". This session could have been classified as one intended for the beginner. Kevin even personally warned me as much as I entered the room. I am a believer in revisiting the basics regardless of your level of mastery, so I entered this session in that spirit. It was a very clear and precise presentation. Masterfully illustrated and demonstrated. Brad McGehee presented on "How and When to Use Indexed Views". This was a topic that I had recently been exploring and was considering for use in an integration project. Brad effectively communicated the complexity of this feature and what is involved in gaining its full benefit. It was clear at the conclusion of this session that it was not the right feature for my specific needs. Overall, the event was a great success. The use of volunteers, from an attendee's perspective, was masterful. The only recommendation that I would have for the next Chicago SQL Saturday would be to include more time in between sessions to permit some level of networking among the attendees, one-on-one questions for speakers and visits to the sponsor booths. Congratulations to Wendy Pastrick, Ted Krueger, and Aaron Lowe for their efforts and a very successful SQL Saturday!

    Read the article

< Previous Page | 7 8 9 10 11 12 13 14 15 16 17 18  | Next Page >