Search Results

Search found 14874 results on 595 pages for 'mysql connector'.

Page 574 of 595.

  • CodeIgniter and SimpleTest -- How to make my first test?

    - by Smandoli
    I'm used to web development using LAMP, PHP5, MySQL plus NetBeans with Xdebug. Now I want to improve my development by learning how to use (A) proper testing and (B) a framework. So I have set up CodeIgniter, SimpleTest and the easy Xdebug add-in for Firefox. This is great fun because maroonbytes provided me with clear instructions and a configured setup ready for download. I am standing on the shoulders of giants, and very grateful. I've used SimpleTest a bit in the past. Here is the kind of thing I wrote:

        <?php
        require_once('../simpletest/unit_tester.php');
        require_once('../simpletest/reporter.php');

        class TestOfMysqlTransaction extends UnitTestCase {
            function testDB_ViewTable() {
                $this->assertEqual(1, 1); // a pseudo-test
            }
        }

        $test = new TestOfMysqlTransaction();
        $test->run(new HtmlReporter());
        ?>

    So I hope I know what a test looks like. What I can't figure out is where and how to put a test in my new setup. I don't see any sample tests in the maroonbytes package, and Google has so far led me to posts that assume unit testing is already functionally available. What do I do?
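
    One way the test could be wired into CodeIgniter is a dedicated controller that simply includes SimpleTest and runs the test case when its URL is visited. This is only a sketch, not part of the original question: the file locations, the tests/ directory and the base-class name (CI_Controller in CodeIgniter 2, Controller in 1.x) are assumptions.

        <?php
        // Hypothetical controller: application/controllers/tests.php
        // Browsing to /index.php/tests runs the test case through SimpleTest.
        require_once APPPATH . '../simpletest/unit_tester.php'; // adjust to where SimpleTest lives
        require_once APPPATH . '../simpletest/reporter.php';

        class Tests extends CI_Controller
        {
            public function index()
            {
                // Keep test cases in their own directory, e.g. application/tests/ (assumed layout).
                require_once APPPATH . 'tests/test_of_mysql_transaction.php';

                $test = new TestOfMysqlTransaction();
                $test->run(new HtmlReporter());
            }
        }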

    Read the article

  • convert date error when retrieving data in twig page through doctrine

    - by user201892
    I get this error when I retrieve all my records on a Twig page from the database through Doctrine. Here is my Twig code:

        {% extends "gestionConferenceApplicationBundle::layout.html.twig" %}
        {% block title "Hello " ~ name %}
        {% block content %}
        je suis un debutant
        <table border=2 >
        <th>Numéro</th>
        <th>Titre</th>
        <th>Ville</th>
        <th>Lieu</th>
        <th>Date de début</th>
        <th>Date de fin</th>
        {% for item in conferences %}
        <tr>
        <td>{{ item.id }}</td>
        <td>{{ item.titre }}</td>
        <td>{{ item.ville }}</td>
        <td>{{ item.lieu }}</td>*
        <td>{{ item.dateDebut }}</td>
        <td>{{ item.dateFin }}</td>
        </tr>
        {% endfor %}
        </table>
        {% endblock %}

    The error concerns the date. Here it is:

        An exception has been thrown during the rendering of a template ("Catchable Fatal Error: Object of class DateTime could not be converted to string in C:\wamp\www\Symfony\app\cache\dev\twig\3c\dd\40a703549de9b8769fa40b82230e.php line 72") in gestionConferenceApplicationBundle:acceuil:acceuil.html.twig at line 19.

    Do you have any idea how to convert this date, or anything else I should change? In MySQL I have a datetime field, and in Doctrine I have:

        /**
         * @var \DateTime $dateDebut
         *
         * @ORM\Column(name="date_debut", type="datetime", nullable=false)
         */
        private $dateDebut;
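
    For what it's worth, a DateTime object has to be turned into a string before it can be printed; Twig's built-in date filter does this in the template (for example {{ item.dateDebut|date('d/m/Y H:i') }}), and the same idea in plain PHP looks like the sketch below. The sample value is made up.

        <?php
        // Minimal illustration: a \DateTime cannot be echoed directly,
        // it has to be formatted into a string first.
        $dateDebut = new DateTime('2010-05-12 09:30:00');

        // echo $dateDebut;  // this would raise the "could not be converted to string" error
        echo $dateDebut->format('d/m/Y H:i'); // prints 12/05/2010 09:30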

    Read the article

  • Software Company Library

    - by dbemerlin
    Hi. A few days ago I had the idea to create a company library, since my company offers no training and many developers still develop the way they did when they learned it five years ago. My hope is that they can borrow books, read them and hopefully learn something from them (for example: object-oriented programming or unit testing, which no one here knows how to use). After asking around, most agreed that it was a good idea, so I brought in my books and made a simple printed sheet with "Book A belongs to B" and "Developer A took the book on dd.mm.yyyy" to get it started.

    Now I want some ideas for books that I could add to the shelf (sadly out of my own money, since 100€/month for training is too much for this multi-million euro company). We develop mostly in PHP & MySQL, so books specific to those topics would be preferred, but I think that if people learn other languages they might get ideas on how to develop better in their current language, so other books are OK too. Which books would you recommend?

    PS: Personally I'd like to add some project management books too, as it's a topic I'm interested in, even though I'm just a junior developer. (We've got Peopleware already; great book, btw.)

    Read the article

  • Php JSON Response Array

    - by Nick Kl
    I have this PHP code. As you can see, I query a MySQL database through a showallevents function and assign the result to the $event variable. I then try to return all rows of the data that I fetch with mysql_fetch_assoc. I get no response, even when I encode the $response variable; it returns null for all fields. Can anyone tell me what I am doing wrong? I had valid code before, but it was returning only one row of data, so I tried to build an associative array and seem to be failing.

        if ($tag == 'showallevents') {
            // Request type is show all events
            // show all events
            $event = $db->showallevents();

            if ($event != false) {
                $data = array();
                while ($row = mysql_fetch_assoc($event)) {
                    $data[] = array(
                        $response["success"] = 1,
                        $response["uid"] = $event["uid"],
                        $response["event"]["date"] = $event["date"],
                        $response["event"]["hours"] = $event["hours"],
                        $response["event"]["store_name"] = $event["store_name"],
                        $response["event"]["event_information"] = $event["event_information"],
                        $response["event"]["event_type"] = $event["event_type"],
                        $response["event"]["Phone"] = $event["Phone"],
                        $response["event"]["address"] = $event["address"],
                        $response["event"]["created_at"] = $event["created_at"],
                        $response["event"]["updated_at"] = $event["updated_at"]);
                }
                echo json_encode($data);
            } else {
                // event not found
                // echo json with error = 1
                $response["error"] = 1;
                $response["error_msg"] = "Events not found";
                echo json_encode($response);
            }
        } else {
            echo "Access Denied";
        }
        }
        ?>
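
    A hedged sketch of how the loop could be restructured so that each fetched row becomes one element of the JSON response: the values are read from $row (the fetched row) rather than from $event (the result resource), and the assignment-inside-array-literal pattern is dropped. The column names are taken from the question and assumed to exist.

        <?php
        // Sketch only: one associative array per row, collected under a single response.
        $response = array("success" => 1, "events" => array());

        while ($row = mysql_fetch_assoc($event)) {
            $response["events"][] = array(
                "uid"               => $row["uid"],
                "date"              => $row["date"],
                "hours"             => $row["hours"],
                "store_name"        => $row["store_name"],
                "event_information" => $row["event_information"],
                "event_type"        => $row["event_type"],
                "Phone"             => $row["Phone"],
                "address"           => $row["address"],
                "created_at"        => $row["created_at"],
                "updated_at"        => $row["updated_at"],
            );
        }

        echo json_encode($response);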

    Read the article

  • What technologies/tools do people use to implement live websites ?

    - by MadSeb
    Hi, I have the following situation:

    - I have a server A hooked up to a piece of hardware that sends out values and information every second. Programs on the server machine can read these values. Server A is in a very remote location, so the Internet connection is very slow and unreliable, but the connection does exist. Let's say it's a weather station in the Arctic.
    - Users at the home location want to monitor the weather values somehow. The users could use a remote desktop connection to server A, but that would be far too slow.

    My idea is to have a website on a web server (let's call the web server B; B is at the home location), make server A connect to server B and somehow send it values, and have the web application read those values and display them. But how do I build such a system? I know I could use MySQL: have server A connect to a SQL server on server B and send INSERT queries, and have the web application running on server B constantly read from the SQL server. But I think that would be way too slow, and I suspect there has to be a better solution. Any ideas?

    BTW, the users should be able to send information to the weather station from the website as well (so an ADMIN user should be allowed to shut down the weather station from the website, or whatever).

    Best regards, MadSeb
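
    One concrete variant of the "server A pushes to server B" idea is a small HTTP endpoint on B that accepts one reading at a time and stores it locally, so the slow link only ever carries tiny requests. This is purely an illustrative sketch; the endpoint name, table and field names are all assumptions.

        <?php
        // receive.php on server B (hypothetical). Server A would POST one reading
        // at a time, e.g. with curl or file_get_contents over the slow link.
        $pdo = new PDO('mysql:host=localhost;dbname=weather', 'user', 'password'); // placeholder credentials

        $stmt = $pdo->prepare(
            'INSERT INTO readings (recorded_at, temperature, wind_speed) VALUES (?, ?, ?)'
        );
        $stmt->execute(array(
            $_POST['recorded_at'],   // timestamp sent by the station
            $_POST['temperature'],
            $_POST['wind_speed'],
        ));

        echo 'ok'; // server A can check for this to know the reading arrived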

    Read the article

  • Accents in uploaded file being replaced with '?'

    - by Katfish
    I am building a data import tool for the admin section of a website I am working on. The data is in both French and English, and contains many accented characters. Whenever I attempt to upload a file, parse the data, and store it in my MySQL database, the accents are replaced with '?'.

    I have text files containing data (charset is iso-8859-1) which I upload to my server using CodeIgniter's file upload library. I then read the file in PHP. My code is similar to this:

        $this->upload->do_upload();
        $data = array('upload_data' => $this->upload->data());

        $fileHandle = fopen($data['upload_data']['full_path'], "r");
        while (($line = fgets($fileHandle)) !== false) {
            echo $line;
        }

    This produces lines with accents replaced with '?'. Everything else is correct. If I download my uploaded file from my server over FTP, the charset is still iso-8859-1, but a diff reveals that the file has changed. However, if I open the file in TextEdit, it displays properly. I attempted to use PHP's stream_encoding method to explicitly set my file stream to iso-8859-1, but my build of PHP does not have the method. After running out of ideas, I tried wrapping my strings in both utf8_encode and utf8_decode. Neither worked. If anyone has any suggestions about things I could try, I would be extremely grateful.
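
    A common culprit in this situation is an encoding mismatch between the file, the PHP string handling and the MySQL connection. A hedged sketch of the usual checks, assuming the data should end up stored as UTF-8 (the functions are standard PHP/MySQL, but whether they apply to this particular setup is an assumption):

        <?php
        // Sketch: normalise ISO-8859-1 input to UTF-8 before it touches the database.
        $line = iconv('ISO-8859-1', 'UTF-8//TRANSLIT', $line); // or mb_convert_encoding($line, 'UTF-8', 'ISO-8859-1')

        // Make sure the MySQL connection itself talks UTF-8; otherwise accented
        // characters can be mangled on the way in even when the PHP string is fine.
        mysql_query("SET NAMES 'utf8'");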

    Read the article

  • The Current State Of Serving a PHP 5.x App on the Apache, LightTPD & Nginx Web Servers?

    - by Gregory Kornblum
    Being stuck in an MS-stack architecture/development position for the last year and a half has kept me from staying on top of the recent evolution of open source stack based web servers as much as I would have liked. However, I am now building an open source stack based application/system architecture, and sadly I do not have the time to give each of the above mentioned web servers a thorough test of my own before deciding. So I figured I'd get input from the best development community site and, more specifically, the people who make it so.

    This is a site that is a resource for information regarding a specific domain and target audience, with features to help users not only find the information but also interact with one another in various ways for various reasons. I chose the open source stack for the wealth of resources it has, along with much better offerings than the MS stack (i.e. WordPress vs BlogEngine.NET). I feel Java sits more in the middle of these stacks in this regard, although I am not ruling out the possibility of using it in certain areas unrelated to the actual web app itself, such as background processes.

    I have already settled on PHP (using the CodeIgniter framework & APC), MySQL (InnoDB) and Memcached on CentOS. I am definitely serving static content on Nginx. However, for the three servers mentioned there is no consensus on which is best for dynamic content in regards to performance. It seems LightTPD still has the leak issue (which rules it out if it does), Nginx seems not yet mature enough for this aspect, and of course Apache tries to be everything for everybody. I am still going to compile whichever one is chosen with as many performance tweaks as possible, such as static linking and the like. I believe I can get Apache to match the other two in serving dynamic content through this process, by not having it serve anything static. However, during my research it seems the others are still worth considering.

    So, with all things considered, I would love to hear what everyone here has to say on the matter. Thanks!

    Read the article

  • How can I send GET data to multiple URLs at the same time using cURL?

    - by Rob
    My apologies; I've actually asked this question multiple times, but never quite understood the answers. Here is my current code:

        while ($resultSet = mysql_fetch_array($SQL)) {
            $ch = curl_init($resultSet['url'] . $fullcurl); // load the URL and send GET data
            curl_setopt($ch, CURLOPT_TIMEOUT, 2);           // only load it for two seconds (long enough to send the data)
            curl_exec($ch);                                 // execute the cURL request
            curl_close($ch);                                // close it off
        } // end while loop

    What I'm doing here is taking URLs from a MySQL database ($resultSet['url']), appending some extra GET variables to them ($fullcurl), and simply requesting the pages. This starts a script running on those pages, and that's all this script needs to do: start those scripts. It doesn't need to return any output, just load the page long enough for the script to start. However, it currently loads each URL (currently 11) one at a time. I need to load all of them simultaneously. I understand I need to use curl_multi_*, but I haven't the slightest idea how the cURL functions work, so I don't know how to change my code to use curl_multi_* in a while loop.

    So my questions are:

    - How can I change this code to load all of the URLs simultaneously? Please explain it rather than just giving me code; I want to know what each individual function does exactly.
    - Will curl_multi_exec even work in a while loop, since the while loop just sends each row one at a time?

    And of course, any references, guides or tutorials about the cURL functions would be nice as well. Preferably not so much from php.net: while it does a good job of giving me the syntax, it's just a little dry and not so good with the explanations.
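
    Since the question asks what each curl_multi_* call does, here is a hedged sketch of the usual pattern (not the asker's finished code): collect the URLs first, register one cURL handle per URL on a multi handle, then drive them all in parallel until none is still running.

        <?php
        // Sketch: the URLs come from the same query as in the question.
        $urls = array();
        while ($resultSet = mysql_fetch_array($SQL)) {
            $urls[] = $resultSet['url'] . $fullcurl;
        }

        $mh = curl_multi_init();            // the multi handle coordinates the parallel requests
        $handles = array();

        foreach ($urls as $url) {
            $ch = curl_init($url);                       // one easy handle per URL
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // keep responses out of the output
            curl_setopt($ch, CURLOPT_TIMEOUT, 2);
            curl_multi_add_handle($mh, $ch);             // register it with the multi handle
            $handles[] = $ch;
        }

        // Run all registered handles until none of them is still active.
        $running = null;
        do {
            curl_multi_exec($mh, $running);  // performs work on all handles, updates $running
            curl_multi_select($mh);          // wait for network activity instead of spinning
        } while ($running > 0);

        foreach ($handles as $ch) {
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);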

    Read the article

  • SQL Grouping with multiple joins combining results incorrectly

    - by Matt
    Hi, I'm having trouble with my query combining records when it shouldn't. I have two tables, Authors and Publications; they are related by PublicationID in a many-to-many relationship, as each author can have many publications and each publication has many authors. I want my query to return every publication for a set of authors and include the IDs of all the other authors that have contributed to the publication, grouped into one field. (I am working with MySQL.)

    I have tried to picture it below:

        Table: authors                 Table: publications
        AuthorID | PublicationID       PublicationID | PublicationName
        1        | 123                 123           | A
        1        | 456                 456           | B
        2        | 123                 789           | C
        2        | 789
        3        | 123
        3        | 456

    I want my result set to be the following:

        AuthorID | PublicationID | PublicationName | AllAuthors
        1        | 123           | A               | 1,2,3
        1        | 456           | B               | 1,3
        2        | 123           | A               | 1,2,3
        2        | 789           | C               | 2
        3        | 123           | A               | 1,2,3
        3        | 456           | B               | 1,3

    This is my query:

        Select Author1.AuthorID, Publications.PublicationID, Publications.PubName,
        GROUP_CONCAT(TRIM(Author2.AuthorID) ORDER BY Author2.AuthorID ASC) AS 'AuthorsAll'
        FROM Authors AS Author1
        LEFT JOIN Authors AS Author2 ON Author1.PublicationID = Author2.PublicationID
        INNER JOIN Publications ON Author1.PublicationID = Publications.PublicationID
        WHERE Author1.AuthorID = "1" OR Author1.AuthorID = "2" OR Author1.AuthorID = "3"
        GROUP BY Author2.PublicationID

    But it returns the following instead:

        AuthorID | PublicationID | PublicationName | AllAuthors
        1        | 123           | A               | 1,1,1,2,2,2,3,3,3
        1        | 456           | B               | 1,1,3,3
        2        | 789           | C               | 2

    It does deliver the desired output when there is only one AuthorID in the WHERE clause. I have not been able to figure it out; does anyone know where I'm going wrong?
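
    A hedged guess at the fix, not verified against the asker's schema: because the query groups only by publication, rows for the same publication but different Author1 values collapse into one group, and every Author2 then appears once per selected author. Widening the GROUP BY to one group per (author, publication) pair, with DISTINCT inside GROUP_CONCAT as a safety net, is the usual remedy. Sketch, wrapped in PDO purely for illustration:

        <?php
        // Sketch only: same query with the grouping widened.
        $sql = "
            SELECT Author1.AuthorID,
                   Publications.PublicationID,
                   Publications.PubName,
                   GROUP_CONCAT(DISTINCT Author2.AuthorID ORDER BY Author2.AuthorID ASC) AS AuthorsAll
            FROM Authors AS Author1
            LEFT JOIN Authors AS Author2
                   ON Author1.PublicationID = Author2.PublicationID
            INNER JOIN Publications
                   ON Author1.PublicationID = Publications.PublicationID
            WHERE Author1.AuthorID IN (1, 2, 3)
            GROUP BY Author1.AuthorID, Publications.PublicationID
        ";

        foreach ($pdo->query($sql) as $row) {   // $pdo assumed to be an open connection
            printf("%s | %s | %s | %s\n",
                   $row['AuthorID'], $row['PublicationID'], $row['PubName'], $row['AuthorsAll']);
        }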

    Read the article

  • entity framework insert bug

    - by tmfkmoney
    I found a previous question which seemed related, but there's no resolution and it's five months old, so I've opened my own version: http://stackoverflow.com/questions/1545583/entity-framework-inserting-new-entity-via-objectcontext-does-not-use-existing-e

    When I insert records into my database with the following code, it works fine for a while and then eventually starts inserting null values in the referenced field. This typically happens after I do an update on my model from the database, although not always. I'm using a MySQL database for this. I have debugged the code and the values are being set properly before the save; they're just not getting inserted properly. I can always fix the issue by re-creating the model without touching any of my code, but I have to recreate the entire model; I can't just dump the relevant tables and re-add them. This makes me think it doesn't have anything to do with my code but rather something with the Entity Framework. Does anyone else have this problem and/or solved it?

        using (var db = new MyModel())
        {
            var stocks = from record in query
                         let ticker = record.Ticker
                         select new
                         {
                             company = db.Companies.FirstOrDefault(c => c.ticker == ticker),
                             price = Convert.ToDecimal(record.Price),
                             date_stamp = Convert.ToDateTime(record.DateTime)
                         };

            foreach (var stock in stocks)
            {
                if (stock.company != null)
                {
                    var price = new StockPrice
                    {
                        Company = stock.company,
                        price = stock.price,
                        date_stamp = stock.date_stamp
                    };
                    db.AddToStockPrices(price);
                }
            }
            db.SaveChanges();
        }

    Read the article

  • Parse large XML file w/ script or use BioPython API ?

    - by jeremy04
    Hey guys, this is my first question on here. I'm trying to make a local copy of the UniProtKB in SQL. The UniProtKB is 2.1 GB, and it comes in XML and a special text format used by SwissProt. Here are my options:

    1) Use a SAX parser (XML). I chose Ruby and Nokogiri. I started writing the parser, but my initial reaction was: how would I map the XML schema to the SAX parser?

    2) BioPython. I already have BioSQL/Biopython installed, which literally created my SQL schema for me, and I was able to successfully insert one SwissProt/UniProt text file into the database. I'm running it right now (fingers crossed) on the entire 2.1 GB. Here is the code I'm running:

        from Bio import SeqIO
        from BioSQL import BioSeqDatabase
        from Bio import SwissProt

        server = BioSeqDatabase.open_database(driver = "MySQLdb", user = "root",
                                              passwd = "", host = "localhost", db = "bioseqdb")
        db = server["uniprot"]
        iterator = SeqIO.parse(open("/path/to/uniprot_sprot.dat", "r"), "swiss")
        db.load(iterator)
        server.commit()

    Edit: it's now crashing because the transactions are getting locked (since the tables are InnoDB):

        Error Number: 1205
        Lock wait timeout exceeded; try restarting transaction.

    I'm using MySQL version 5.1.43. Should I switch my database to PostgreSQL?

    Read the article

  • .htaccess Rewrite Rule to Keep .php File Extensions

    - by user2672112
    I'm upgrading my static website, which had .php extensions on the content pages. I've created my own simple CMS, which will retrieve data from a MySQL database from now on, keeping the URL structure the same as the old one. The CMS has a get function to retrieve the URL structure from the database. Overall it started working fine with .html when I tested it. But when I change the .html extension to .php in my .htaccess code, the content pages start showing "Internal Server Error. The server encountered an internal error or misconfiguration and was unable to complete your request."

    Here is the .htaccess code I've used:

        RewriteBase /
        Options +FollowSymLinks
        RewriteEngine On
        RewriteRule ^([^?]*).php$ content.php?pid=$1

    Perhaps there is a conflict. Here is the code with the .html extension that actually works fine:

        RewriteBase /
        Options +FollowSymLinks
        RewriteEngine On
        RewriteRule ^([^?]*).html$ content.php?pid=$1

    So basically, content pages with .html are working and .php are not. But I need my content pages to end in .php. Please help. Thanks in advance... :)

    Read the article

  • Javascript libraries + JQuery plugins contradict? How to debug?

    - by Metafaniel
    This is somewhat of a newbie question... I make an effort to learn every day, so please understand ;) I'm not the very best expert, but I can do a decent job building good-looking and functional websites or web applications. My main tools are PHP5, HTML5, CSS 2 and 3, a database (SQLite, MySQL) and JavaScript with jQuery. I'm not an expert at all in JavaScript. I often find interesting jQuery plugins or tutorials and try to mix them to get the functionality I need.

    This time I'm mixing maybe too many plugins and js files from different sources. In fact, my app does what I want except for certain behaviors. There are no errors, everything looks fine, but the misbehavior persists. So maybe I need to specify a class I don't know about, or one plugin contradicts another and I just can't understand why, for example, a <button type="button">DON'T submit</button> just submits the form...

    Anyway, my point is: do you know a way to debug these situations? Is there a generic tool, suggestion, workflow or something to help me understand conflicts or omissions between libraries or plugins (JavaScript libraries, my own scripts and jQuery plugins)? I hope there is a way! THANKS A LOT FOR YOUR HELP AND COMPREHENSION! =)

    Read the article

  • Incorrect component when querying immediately after insert using NHibernate

    - by Am
    I have the following mapping for my table in MySQL:

        <class name="Tag, namespace" table="tags">
          <id name="id" type="Int32" unsaved-value="0">
            <generator class="native"></generator>
          </id>
          <property name="name" type="String" not-null="true"></property>
          <component name="record_dates" class="DateMetaData, namespace">
            <property name="created_at" type="DateTime" not-null="true"></property>
            <property name="updated_at" type="DateTime" not-null="true"></property>
          </component>
        </class>

    As you can see, the record_dates property is defined as a component field of type DateMetaData. Both the created_at and updated_at fields in the 'tags' table are updated via triggers. Thus I can insert a new record like so:

        var newTag = new Tag() { name = "some string here" };
        Int32 id = (Int32)Session.Save(tag);
        Session.Flush();

        ITag t = Session.Get<Tag>(id);
        ViewData["xxx"] = t.name;                    // -----> not null
        ViewData["xxx"] = t.record_dates.created_at; // -----> is null

    However, when querying the same record back immediately after it was inserted, the record_dates field ends up null, even though in the table those fields have got values. Can anyone please point out why Session.Get ignores getting everything back from the table? Is it because it caches the newly created record, for which record_dates is null? If so, how can it be told to ignore the cached version?

    Read the article

  • jquery with php loading file

    - by Marcus Solv
    I'm trying to use jQuery with a simple piece of PHP code:

        $('#some').click(function() {
            <?php require_once('some1.php?name="some' + index + '"'); ?>
        });

    It shows no error, so I don't know what is wrong. In some1.php I have:

        <?php
        // Start session
        session_start();

        // Include database connection details
        require_once('../sql/config.php');

        // Connect to mysql server
        $link = mysql_connect(DB_HOST, DB_USER, DB_PASSWORD);
        if (!$link) {
            die('Failed to connect to server: ' . mysql_error());
        }

        // Select database
        $db = mysql_select_db(DB_DATABASE);
        if (!$db) {
            die("Unable to select database");
        }

        // Function to sanitize values received from the form. Prevents SQL injection
        function clean($str) {
            $str = @trim($str);
            if (get_magic_quotes_gpc()) {
                $str = stripslashes($str);
            }
            return mysql_real_escape_string($str);
        }

        // Sanitize the POST values
        $name = clean($_GET['name']);
        ?>

    It's not complete, because I still want to add an SQL INSERT command. What I want is that when I click on #some, that file is executed (creating an entry in a table that isn't defined yet).

    Read the article

  • What's the best Linux backup solution?

    - by Jon Bright
    We have four Linux boxes (all running Debian or Ubuntu) on our office network. None of these boxes is especially critical and they're all using RAID. To date, I've therefore been backing up the boxes by having a cron job upload tarballs containing the contents of /etc, MySQL dumps and other such changing, non-packaged data to a box at our geographically separate hosting centre.

    I've realised, however, that:

    - the tarballs are sufficient to rebuild from, but it's certainly not a painless process to do so (I recently tried this out as part of a hardware upgrade of one of the boxes); long-term, the process isn't sustainable
    - each of the boxes is currently producing a tarball of a couple of hundred MB each day, 99% of which is the same as the previous day
    - partly due to the size issue, the backup process requires more manual intervention than I want (to find whatever 5GB file is inflating the size of the tarball and kill it)
    - again due to the size issue, I'm leaving stuff out which it would be nice to include: the contents of users' home directories, for example. There's almost nothing of value there that isn't in source control (and these aren't our main dev boxes), but it would be nice to keep them anyway
    - there must be a better way

    So, my question is: how should I be doing this properly? The requirements are:

    - needs to be an offsite backup (one of the main things I'm doing here is protecting against fire/whatever)
    - should require as little manual intervention as possible (I'm lazy, and box-herding isn't my main job)
    - should continue to scale with a couple more boxes, slightly more data, etc.
    - preferably free/open source (cost isn't the issue, but especially for backups, openness seems like a good thing)
    - an option to produce some kind of DVD/Blu-Ray/whatever backup from time to time wouldn't be bad

    My first thought was that this kind of incremental backup is what tar was created for: create a tar file once each month, add incrementally to it, and rsync the results to the remote box. But others probably have better suggestions.

    Read the article

  • Do I need to write a trigger for such a simple constraint?

    - by Paul Hanbury
    I really had a hard time knowing what words to put into the title of my question, as I am not especially sure if there is a database pattern related to my problem. I will try to simplify matters as much as possible to get directly to the heart of the issue.

    Suppose I have some tables. The first one is a list of widget types:

        create table widget_types (
          widget_type_id number(7,0) primary key,
          description varchar2(50)
        );

    The next one contains icons:

        create table icons (
          icon_id number(7,0) primary key,
          picture blob
        );

    Even though the users get to select their preferred widget, there is a predefined subset of widgets that they can choose from for each widget type:

        create table icon_associations (
          widget_type_id number(7,0) references widget_types,
          icon_id number(7,0) references icons,
          primary key (widget_type_id, icon_id)
        );

        create table icon_prefs (
          user_id number(7,0) references users,
          widget_type_id number(7,0),
          icon_id number(7,0),
          primary key (user_id, widget_type_id),
          foreign key (widget_type_id, icon_id) references icon_associations
        );

    Pretty simple so far. Let us now assume that if we are displaying an icon to a user who has not set up his preferences, we choose one of the appropriate images associated with the current widget. I'd like to specify the preferred icon to display in such a case, and here's where I run into my problem:

        alter table icon_associations add (
          is_preferred char(1) check( is_preferred in ('y','n') )
        );

    I do not see how I can enforce that for each widget_type there is one, and only one, row having is_preferred set to 'y'. I know that in MySQL, I am able to write a subquery in my check constraint to quickly resolve this issue. This is not possible with Oracle. Is my mistake that this column has no business being in the icon_associations table? If not, where should it go? Is this a case where, in Oracle, the constraint can only be handled with a trigger? I ask only because I'd like to go the constraint route if at all possible.

    Thanks so much for your help, Paul

    Read the article

  • Jquery Apache - IE problem

    - by Soldierflup
    I have a button tag on my page with a value:

        <button class='btn' value='value'>show value</button>

    And I have this jQuery code:

        $('.btn').click(function() {
            var w = 'value = ' + $(this).val() + ' / text = ' + $(this).html();
            alert(w);
        });

    In FF there is no problem; the result is OK (display: value = value / text = show value). The problem comes with IE8, which displays different results on my testing server and the production server. The testing server is my local machine with a standard XAMPP installation. The production server is a Linux server with Apache, PHP and MySQL.

    The result from the testing server is OK (same as FF); the result from the production server is not (displaying: value = show value / text = show value). Does anyone have an idea whether it is Apache that causes the error? I know there are some issues with the use of val() because IE considers it an attribute and not a value. The problem is that changing the jQuery from val() to attr('value') is quite a lot of work (this markup is already on a lot of pages), and I think it could be much easier to change something on the webserver.

    Read the article

  • Performing a SVD on tweets. Memory problem

    - by plotti
    I have generated a huge CSV file as the output from my POS tagging and stemming. It looks like this:

                   word1, word2, word3, ..., word14400
        person1    1      2      0           1
        person2    0      0      1           0
        ...
        person650

    It contains the word counts for each person. In this way I get characteristic vectors for each person. I want to run an SVD on this beast, but it seems the matrix is too big to be held in memory to perform the operation. My question is: should I reduce the column count by removing words which have a column sum of, for example, 1, which means that they have been used only once? Do I bias the data too much with this attempt?

    I tried the RapidMiner approach of loading the CSV into the DB and then reading it in sequentially in batches for processing, as RapidMiner proposes. But MySQL can't store that many columns in a table. If I transpose the data and then re-transpose it on import, it also takes ages...

    So in general I am asking for advice on how to perform an SVD on such a corpus.

    Read the article

  • PHP GET issue - it aint workin!

    - by benhowdle89
    I have a simple PHP form which displays inputs with values from a MySQL DB and sends the form results to another page, which updates a DB table based on the GET results:

        echo "<form method='get' action='updateprojects.php'>";
        echo "<table>";
        echo "<tr>";
        echo " <th>Project No</th>
               <th>Customer Name</th>
               <th>Description</th>
             </tr>";

        while ($row = mysql_fetch_array($result)) {
            echo "<tr>";
            echo "<td><input value=" . $row['project_no'] . "></input></td>";
            echo "<td><input value='" . $row['cust_name'] . "'></input></td>";
            echo "<td><input value='" . $row['description'] . "'></input></td>";
            echo "</tr>";
        }
        echo "</table>";
        echo "<input type='submit' value='Update' />";
        echo "</form>";

    On updateprojects.php I have set up the GET variables and even echoed them to check, but nothing comes through! I cannot see why. This is the start of updateprojects.php:

        echo $_GET['project_no'] . $_GET['cust_name'] . $_GET['description'];
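
    A hedged observation, not from the original thread: $_GET is keyed by each input's name attribute, and the inputs above only set value, so the submitted form has nothing to send. A sketch of the loop with names added; the array-style names are one possible convention for handling several rows, not necessarily what the original code intended.

        <?php
        // Sketch: each input gets a name attribute so PHP can find it in $_GET.
        while ($row = mysql_fetch_array($result)) {
            echo "<tr>";
            echo "<td><input name='project_no[]' value='" . $row['project_no'] . "'></td>";
            echo "<td><input name='cust_name[]' value='" . $row['cust_name'] . "'></td>";
            echo "<td><input name='description[]' value='" . $row['description'] . "'></td>";
            echo "</tr>";
        }
        // updateprojects.php would then read $_GET['project_no'][0], $_GET['cust_name'][0], etc.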

    Read the article

  • Spring Security ACL: NotFoundException from JDBCMutableAclService.createAcl

    - by user340202
    Hello, I've been working on this task for too long to abandon the idea of using Spring Security to achieve it, but I wish that the community will provide with some support that will help reduce the regret that I have for choosing Spring Security. Enough ranting and now let's get to the point. I'm trying to create an ACL by using JDBCMutableAclService.createAcl as follows: [code] public void addPermission(IWFArtifact securedObject, Sid recipient, Permission permission, Class clazz) { ObjectIdentity oid = new ObjectIdentityImpl(clazz.getCanonicalName(), securedObject.getId()); this.addPermission(oid, recipient, permission); } @Override @Transactional(propagation = Propagation.REQUIRED, isolation = Isolation.READ_UNCOMMITTED, readOnly = false) public void addPermission(ObjectIdentity oid, Sid recipient, Permission permission) { SpringSecurityUtils.assureThreadLocalAuthSet(); MutableAcl acl; try { acl = this.mutableAclService.createAcl(oid); } catch (AlreadyExistsException e) { acl = (MutableAcl) this.mutableAclService.readAclById(oid); } // try { // acl = (MutableAcl) this.mutableAclService.readAclById(oid); // } catch (NotFoundException nfe) { // acl = this.mutableAclService.createAcl(oid); // } acl.insertAce(acl.getEntries().length, permission, recipient, true); this.mutableAclService.updateAcl(acl); } [/code] The call throws a NotFoundException from the line: [code] // Retrieve the ACL via superclass (ensures cache registration, proper retrieval etc) Acl acl = readAclById(objectIdentity); [/code] I believe this is caused by something related to Transactional, and that's why I have tested with many TransactionDefinition attributes. I have also doubted the annotation and tried with declarative transaction definition, but still with no luck. One important point is that I have used the statement used to insert the oid in the database earlier in the method directly on the database and it worked, and also threw a unique constraint exception at me when it tried to insert it in the method. I'm using Spring Security 2.0.8 and IceFaces 1.8 (which doesn't support spring 3.0 but definetely supprorts 2.0.x, specially when I keep caling SpringSecurityUtils.assureThreadLocalAuthSet()). My AppServer is Tomcat 6.0, and my DB Server is MySQL 6.0 I wish to get back a reply soon because I need to get this task off my way

    Read the article

  • Any suggestions to improve my PDO connection class?

    - by Scarface
    Hey guys, I am pretty new to PDO, so I basically just put together a simple connection class using information from the introductory book I was reading. Is this connection efficient? If anyone has any informative suggestions, I would really appreciate it.

        class PDOConnectionFactory {
            public $con = null;
            // switch database?
            public $dbType = "mysql";

            // connection parameters
            public $host = "localhost";
            public $user = "user";
            public $senha = "password";
            public $db = "database";

            public $persistent = false;

            // new PDOConnectionFactory( true )  <--- persistent connection
            // new PDOConnectionFactory()        <--- no persistent connection
            public function PDOConnectionFactory( $persistent = false ) {
                // it verifies the persistence of the connection
                if( $persistent != false ) {
                    $this->persistent = true;
                }
            }

            public function getConnection() {
                try {
                    $this->con = new PDO( $this->dbType . ":host=" . $this->host . ";dbname=" . $this->db,
                                          $this->user, $this->senha,
                                          array( PDO::ATTR_PERSISTENT => $this->persistent ) );
                    // carried through successfully, it returns connected
                    return $this->con;
                // in case that an error occurs, it returns the error
                } catch ( PDOException $ex ) {
                    echo "We are currently experiencing technical difficulties. We have a bunch of monkies working really hard to fix the problem. Check back soon: " . $ex->getMessage();
                }
            }

            // close connection
            public function Close() {
                if( $this->con != null )
                    $this->con = null;
            }
        }
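
    A hedged sketch of a few refinements that are often suggested for a class like this: throw SQL errors as exceptions, declare a connection charset, reuse an existing connection, and keep raw exception detail out of user-facing output. Whether they fit this particular application is a judgment call, and the DSN and credentials below are placeholders.

        <?php
        // Sketch only: same idea with exceptions enabled and error detail logged rather than echoed.
        class PDOConnectionFactory
        {
            private $con = null;

            public function getConnection($persistent = false)
            {
                if ($this->con !== null) {
                    return $this->con;               // reuse the connection once it exists
                }

                try {
                    $this->con = new PDO(
                        'mysql:host=localhost;dbname=database;charset=utf8', // placeholder DSN
                        'user',
                        'password',
                        array(
                            PDO::ATTR_PERSISTENT => $persistent,
                            PDO::ATTR_ERRMODE    => PDO::ERRMODE_EXCEPTION,  // surface SQL errors as exceptions
                        )
                    );
                    return $this->con;
                } catch (PDOException $ex) {
                    error_log($ex->getMessage());    // log the detail, show the user a generic message
                    echo 'We are currently experiencing technical difficulties. Check back soon.';
                }
            }

            public function close()
            {
                $this->con = null;
            }
        }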

    Read the article

  • unknown exception error in php

    - by fayer
    I want to catch all exceptions thrown in a script and then check whether they have error code 23000. If they don't, I want to rethrow the exception. Here is my code:

        function myException($exception)
        {
            /*** If it is a Doctrine Connection Mysql Duplication Exception ***/
            if (get_class($exception) === 'Doctrine_Connection_Mysql_Exception'
                    && $exception->getCode() === 23000) {
                echo "Duplicate entry";
            } else {
                throw $exception;
            }
        }

        set_exception_handler('myException');

        $contact = new Contact();
        $contact->email = 'peter';
        $contact->save();

    But I get this error message and I don't know what it means:

        Fatal error: Exception thrown without a stack frame in Unknown on line 0

    I want to be able to rethrow the original exception if it does not have error code 23000. Even when I deleted the error-code check I still got the same message:

        function myException($exception)
        {
            throw $exception;
        }

        set_exception_handler('myException');

        $contact = new Contact();
        $contact->email = 'peter';
        $contact->save();

    How could I solve this? Thanks.
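
    A hedged sketch of one likely explanation and workaround: a callback registered with set_exception_handler() runs after normal execution has ended, so throwing from inside it produces the "Exception thrown without a stack frame" fatal error. Catching around the call itself keeps a live stack frame to rethrow from. The loose comparison with 23000 is a precaution in case getCode() returns the SQLSTATE as a string; that detail is an assumption about the Doctrine exception, not something confirmed in the question.

        <?php
        try {
            $contact = new Contact();
            $contact->email = 'peter';
            $contact->save();
        } catch (Exception $exception) {
            // Same test as in the question, applied where rethrowing is still possible.
            if (get_class($exception) === 'Doctrine_Connection_Mysql_Exception'
                    && $exception->getCode() == 23000) {
                echo "Duplicate entry";
            } else {
                throw $exception;   // anything else bubbles up as before
            }
        }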

    Read the article

  • XMLHttpRequest POST Data Size

    - by usurper
    Hi, is there a size limit on an XHR POST request? I am using the POST method for saving text data into MySQL via a PHP script, and the data is cut off. Firebug gives me the following message:

        ... Firebug request size limit has been reached by Firebug. ...

    This is my code for sending the data:

        function makeXHR(recordData)
        {
            xmlhttp = createXHR();
            var body = "q=" + encodeURIComponent(recordData);
            xmlhttp.open("POST", "insertRowData.php", true);
            xmlhttp.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
            xmlhttp.setRequestHeader("Content-length", body.length);
            xmlhttp.setRequestHeader("Connection", "close");
            xmlhttp.onreadystatechange = function()
            {
                if (xmlhttp.readyState == 4 && xmlhttp.status == 200)
                {
                    //alert(xmlhttp.responseText);
                    alert("Records were saved successfully!");
                }
            }
            xmlhttp.send(body);
        }

    The only solution I can think of is splitting the data and making a queue of XHR requests, but I don't like it. Is there another way?
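
    A hedged diagnostic idea, based on the assumption that the Firebug notice only refers to what Firebug displays and that the truncation actually happens on the PHP side (where post_max_size caps incoming POST bodies). A sketch of what insertRowData.php could log to compare what was sent with what arrived; the file and field names are taken from the question.

        <?php
        // insertRowData.php -- diagnostic sketch only.
        error_log('Content-Length header: ' . (isset($_SERVER['CONTENT_LENGTH']) ? $_SERVER['CONTENT_LENGTH'] : 'n/a'));
        error_log('Received q length:     ' . (isset($_POST['q']) ? strlen($_POST['q']) : 0));
        error_log('post_max_size:         ' . ini_get('post_max_size'));

        // If post_max_size is the limiting factor, the received length will fall short
        // of the Content-Length header and the limit has to be raised in php.ini
        // (or via php_value in .htaccess where the host allows it).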

    Read the article

  • SQL query to calculate running group counts on time-phased data

    - by spong
    I have some data, like this:

        BUG   DATE                    STATUS
        ----  ----------------------  --------
        9012  18/03/2008 9:08:44 AM   OPEN
        9012  18/03/2008 9:10:03 AM   OPEN
        9012  28/03/2008 4:55:03 PM   RESOLVED
        9012  28/03/2008 5:25:00 PM   CLOSED
        9013  18/03/2008 9:12:59 AM   OPEN
        9013  18/03/2008 9:15:06 AM   RESOLVED
        9013  18/03/2008 9:16:44 AM   CLOSED
        9014  18/03/2008 9:17:54 AM   OPEN
        9014  18/03/2008 9:18:31 AM   RESOLVED
        9014  18/03/2008 9:19:30 AM   CLOSED
        9015  18/03/2008 9:22:40 AM   OPEN
        9015  18/03/2008 9:23:03 AM   RESOLVED
        9015  19/03/2008 12:27:08 PM  CLOSED
        9016  18/03/2008 9:24:20 AM   OPEN
        9016  18/03/2008 9:24:35 AM   RESOLVED
        9016  19/03/2008 12:28:14 PM  CLOSED
        9017  18/03/2008 9:25:47 AM   OPEN
        9017  18/03/2008 9:26:02 AM   RESOLVED
        9017  19/03/2008 12:30:30 PM  CLOSED

    Which I would like to transform into something like this:

        DATE                    OPEN  RESOLVED  CLOSED
        ----------------------  ----  --------  ------
        18/03/2008 9:08:44 AM   1     0         0
        18/03/2008 9:12:59 AM   2     0         0
        18/03/2008 9:15:06 AM   1     1         0
        18/03/2008 9:16:44 AM   1     0         1
        18/03/2008 9:17:54 AM   2     0         1
        18/03/2008 9:18:31 AM   1     1         0
        18/03/2008 9:19:30 AM   1     0         2
        18/03/2008 9:22:40 AM   2     0         2
        18/03/2008 9:23:03 AM   1     1         2
        18/03/2008 9:24:20 AM   2     1         2
        18/03/2008 9:24:35 AM   1     2         2
        18/03/2008 9:25:47 AM   2     2         2
        18/03/2008 9:26:02 AM   1     3         2
        19/03/2008 12:27:08 PM  1     2         3
        19/03/2008 12:28:14 PM  1     1         4
        19/03/2008 12:30:30 PM  1     0         5
        28/03/2008 4:55:03 PM   0     1         5
        28/03/2008 5:25:00 PM   0     0         6

    i.e. keeping running counts of bugs with each status. This is easy enough to code up using cursors, but I'm wondering if any of you SQL gurus out there can help with a query to achieve this? Ideally for MySQL, but I'm curious to see anything that will work.

    Read the article
