Search Results

Search found 24675 results on 987 pages for 'table'.

  • oracle sql developer is truncating my results

    - by nont
    I'm calling a stored function like this: select XML_INVOICE.GENERATE_XML_DOC('84200006823') from dual; The query results then show up in a table underneath, which I can right-click and select "Export Data" - XML <?xml version='1.0' encoding='UTF8' ?> <RESULTS> <ROW> <COLUMN NAME="XML_INVOICE.GENERATE_XML_DOC('84200006823')" <![CDATA[<xml>yada yada</xml><morexml>...]]></COLUMN> </ROW> </RESULTS> The problem is the "..." - SQL Developer (2.1.0.63 on Linux) is not showing all the data; it truncates the result and appends an ellipsis, which is of no use to me. How do I get it to export ALL of my data?
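
    A possible workaround, sketched below rather than taken from SQL Developer itself (it assumes the function returns a CLOB or something else SQL*Plus can render as text): spool the value from SQL*Plus, where SET LONG controls how much of a long value is fetched before it is written to the file.

      -- Spool the full result to a file instead of relying on the grid export.
      SET LONG 2000000000
      SET LONGCHUNKSIZE 32767
      SET LINESIZE 32767
      SET PAGESIZE 0
      SET TRIMSPOOL ON
      SPOOL invoice.xml
      SELECT XML_INVOICE.GENERATE_XML_DOC('84200006823') FROM dual;
      SPOOL OFF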

    Read the article

  • SQL Server 2000 - Filter by String Length

    - by user208662
    Hello, I have a database on a SQL Server 2000 server. This database has a table called "Person" that has a field called "FullName" that is a VARCHAR(100). I am trying to write a query that will return all records that have a name. Records that do not have a name have a FullName value of either null or an empty string. How do I get all of the Person records that have a FullName? In other words, I want to ignore the records that do not have a FullName. Currently I am trying the following: SELECT * FROM Person p WHERE p.FullName IS NOT NULL AND LEN(p.FullName) > 0 Thank you
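
    The posted query already filters out both NULLs and empty strings (LEN ignores trailing blanks). A sketch of a slightly shorter equivalent is below; the only assumption is that names consisting solely of spaces should also be treated as empty.

      -- Equivalent filter: a NULL comparison is never true, and a
      -- whitespace-only name trims down to the empty string.
      SELECT *
      FROM   Person p
      WHERE  LTRIM(RTRIM(p.FullName)) <> ''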

    Read the article

  • Is Classic ADO still viable for a mixed managed/unmanaged App?

    - by Andy Dent
    We have a complex architecture with much logic in unmanaged code that needs database access. Currently this is done via ODBC drivers and MFC classes, and we're considering the issues of migrating our abstraction layer to use ADO or ADO.NET. In the latter case we'd have to push database logic back up into the .NET layer. I'm trying to decide if the pain of invoking the database via .NET callbacks is offset by the improvements in ADO.NET. The Wikipedia comparison was interesting, although I'm not sure I believe all the points in the comparison table (e.g. does ADO.NET always use XML to pass data?). A 2005 comparison shows ADO.NET performing dramatically faster. Microsoft's guide to ADO.NET for ADO programmers suggests we would gain much from going to ADO.NET, especially the way that data is available in native (.NET) types rather than solely through OLE Automation's Variant.

    Read the article

  • getTitle() on Doctrine i18n with non-default language

    - by fesja
    Hi, I'm having a problem getting the title of an object from my i18n object in Doctrine 1.1.6 / Symfony 1.2. I have the following Doctrine table method:

      public function getPlace($place_id, $lang='') {
          $q = Doctrine::getTable('Place')->createQuery('p');
          if ($lang != '')
              $q = $q->leftJoin('p.Translation ptr')
                     ->addWhere('ptr.lang = ?', $lang);
          return $q->addWhere('p.id = ?', $place_id)
                   ->fetchOne();
      }

    Then in the view file, if I do $place->getTitle(), it prints the title correctly in the language I wanted. However, if I do $place->getTitle() in an action it returns nothing; I have to use $place->Translation['es']->title to get the title in Spanish. If I work with the default language ('en'), $place->getTitle() works. Any idea how to make $place->getTitle() work in every case? Thanks!

    Read the article

  • android accelerometer accuracy is extremely poor

    - by user564594
    Wrote a simple program that prints out accelerometer output:

      mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
      mSensorManager.registerListener(this,
          mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
          SensorManager.SENSOR_DELAY_FASTEST);

    It turns out that: The accelerometer accuracy is always reported as "lowest", as determined by:

      public void onAccuracyChanged(Sensor sensor, int accuracy) {
          if (collectingData) {
              accelPrecision.setText("accelerometer accuracy: " + accuracy);
          }
      }

    The actual accelerometer readings are very inaccurate - about 2%-5% fluctuations even when the device is resting on the table. As far as I can tell it's the same problem on the Nexus S, Nexus One and G1. Any idea how it could be made more accurate / what sets a…

    Read the article

  • SQL Server performance

    - by Jose
    I know that I can't get a specific answer to my question, but I would like to know if I can find the tools to get to my answer. We have a SQL Server 2008 database that, for the last 4 days, has had moments where it becomes unresponsive for specific queries for 5-20 minutes. e.g. The following queries, run simultaneously in different query windows, have the following results: SELECT * FROM Assignment --hangs indefinitely SELECT * FROM Invoice -- works fine Many of the tables have non-clustered indexes to help speed up SELECTs. Here's what I know: 1) The same query will either hang indefinitely or run normally. 2) In Activity Monitor, in the Processes tab, there are normally around 80-100 processes running. I think that what's happening is: 1) A user updates a table 2) This causes one or more indexes to get updated 3) Another user issues a SELECT while the index is updating Is there a way I can figure out why, at a specific moment in time, SQL Server is unresponsive for a specific query?
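
    Those symptoms look more like blocking than slow execution, and SQL Server 2008 exposes that directly through its DMVs. The diagnostic below is only a sketch built on the standard views (nothing specific to this database); run it while one of the queries is hanging and it shows which session the hung request is waiting on and what it was running.

      -- Who is blocked, by whom, and what the blocked request is executing.
      SELECT r.session_id,
             r.blocking_session_id,
             r.wait_type,
             r.wait_time,
             t.text AS running_sql
      FROM   sys.dm_exec_requests r
      CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
      WHERE  r.blocking_session_id <> 0;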

    Read the article

  • How to force EJB3 to reload values from the database and not use those in the persistence context

    - by Kohan95
    Hello, I have a big problem that I hope to find help with here. I have two entities:

      @Entity
      @Inheritance(strategy=InheritanceType.JOINED)
      @DiscriminatorColumn(name="Role", discriminatorType=DiscriminatorType.STRING)
      public class Utilisateur implements Serializable {
          private static final long serialVersionUID = 1L;
          @Id
          @GeneratedValue(strategy = GenerationType.IDENTITY)
          @Column(name="id")
          private Long id;
          @Column(name="nom", nullable=false)
          private String nom;
          @Column(name="Role", nullable=false, insertable=false)
          private String Role;
          //...
      }

      @Entity
      @Table(name="ResCom")
      @DiscriminatorValue("ResCom")
      public class ResCom extends Utilisateur {
          //...
      }

    The first thing I do is:

      ResCom rsCom = new ResCom(nom, prenom, email, civilite, SysQl.crypePasse(pass));
      gr.create(rsCom);

    I check my database and see that the row was inserted with the ResCom discriminator. But when I read the user back and check the value of Role, I get null:

      Utilisateur tets = gr.findByEmail(email);
      message = tets.getEmail() + " and Role :" + tets.getRole() + "";

    Yet in my database it is ResCom! The problem only disappears when I redeploy the project. I hope you have a solution, and thank you in advance; sorry for my English.

    Read the article

  • Find maximum number of logged on users in SQL

    - by lleto
    Hi, I want to keep tabs on the number of concurrent users of my application, so I log a time_start and a time_stop. If I now want to query the database for the maximum number of logged-on users and return the start date, how would I do that? The table looks like this:

       id | time_start          | time_stop
      ----+---------------------+---------------------
        1 | 2010-03-07 05:40:59 | 2010-03-07 05:41:33
        2 | 2010-03-07 06:50:51 | 2010-03-07 10:50:51
        3 | 2010-02-21 05:20:00 | 2010-03-07 12:23:44
        4 | 2010-02-19 08:21:12 | 2010-03-07 12:37:28
        5 | 2010-02-13 05:52:13 |

    Where time_stop is empty the user is still logged on. In this case I would expect to see 2010-03-07 returned, since all users (5) were logged on at that moment. However, if I ran the query with WHERE time_start BETWEEN '2010-02-17' AND '2010-02-23' I would expect to see 2010-02-21 with a maximum of 2. Is this possible directly in SQL (using Postgres), or do I need to parse the results in PHP? Thanks, lleto
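
    One way to sketch this in plain SQL, assuming the table is called sessions: treat every time_start as a candidate moment and count how many sessions overlap it, then keep the busiest one. The date filter from the question would simply be added to both sides of the self-join.

      -- Peak concurrency: for each login moment, count the sessions that span it.
      SELECT s.time_start, COUNT(*) AS concurrent_users
      FROM   sessions s
      JOIN   sessions o
             ON  o.time_start <= s.time_start
             AND (o.time_stop IS NULL OR o.time_stop >= s.time_start)
      GROUP  BY s.time_start
      ORDER  BY concurrent_users DESC, s.time_start
      LIMIT  1;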

    Read the article

  • MySQL inner join different results

    - by Darryl at NetHosted
    I am trying to work out why the following two queries return different results: SELECT DISTINCT i.id, i.date FROM `tblinvoices` i INNER JOIN `tblinvoiceitems` it ON it.userid=i.userid INNER JOIN `tblcustomfieldsvalues` cf ON it.relid=cf.relid WHERE i.`tax` = 0 AND i.`date` BETWEEN '2012-07-01' AND '2012-09-31' and SELECT DISTINCT i.id, i.date FROM `tblinvoices` i WHERE i.`tax` = 0 AND i.`date` BETWEEN '2012-07-01' AND '2012-09-31' Obviously the difference is the inner join, but I don't understand why the query with the inner join returns fewer results than the one without it; since I didn't reference the joined tables anywhere else, I would have thought they should return the same results. The final query I am working towards is SELECT DISTINCT i.id, i.date FROM `tblinvoices` i INNER JOIN `tblinvoiceitems` it ON it.userid=i.userid INNER JOIN `tblcustomfieldsvalues` cf ON it.relid=cf.relid WHERE cf.`fieldid` =5 AND cf.`value` REGEXP '[A-Za-z]' AND i.`tax` = 0 AND i.`date` BETWEEN '2012-07-01' AND '2012-09-31' But because the inner join removes some results that should be valid, it's not working at present. Thanks.
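
    An INNER JOIN only keeps invoices that have at least one matching row in both joined tables, so invoices whose userid has no invoice items, or whose items have no custom-field values, drop out of the result. A diagnostic sketch using only the tables from the question is below; it lists exactly the invoices the joined version loses (the upper bound is written as 2012-09-30 here, since September has 30 days).

      -- Invoices returned by the plain query but dropped by the INNER JOINs.
      SELECT i.id, i.date
      FROM `tblinvoices` i
      WHERE i.`tax` = 0
        AND i.`date` BETWEEN '2012-07-01' AND '2012-09-30'
        AND NOT EXISTS (
              SELECT 1
              FROM `tblinvoiceitems` it
              JOIN `tblcustomfieldsvalues` cf ON it.relid = cf.relid
              WHERE it.userid = i.userid
            );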

    Read the article

  • Joomla! 2.5 use login data from another database

    - by user1756746
    I have a hard question. I'd like the Joomla login not to use its own database tables for users and passwords, but instead use my own users database, with my table fields, my passwords, etc. I don't know where to start; I thought I could edit the database request for the login so it points to my DB, or create a little script to automatically add the users to the Joomla database. I tried looking at components/com_users/views/login/tmpl/default_login.php but it seems that there is nothing there. Can someone help me figure out what to change? Maybe the simplest thing is to import my users database into the Joomla user database; is there any plugin or something else that you know of? P.S. I use the Clarion theme built on the Gantry framework, Joomla! 2.5.6 Stable, PHP 5.2.17.

    Read the article

  • What wiki tools exist to generate shippable user doc from a wiki?

    - by tletnes
    I am looking into using a wiki (I'd prefer MediaWiki, but it's not a requirement) as the repository for developer-generated documentation (user guides, release notes, application notes, errata, etc.). From a collaborative/easy-to-update point of view a wiki seems like a good match; however, since this documentation will ultimately ship to customers, we want to be able to export the documents in their final state (e.g. during the release cycle) to static versions that no longer include histories. Ideally the export would leave the document in a form where errors could be easily fixed by a non-programmer. It would be good if niceties like section ordering and a table of contents were available, or easy to add after the fact. Are any tools with features like these available?

    Read the article

  • How to find foreign-key dependencies pointing to one record in Oracle?

    - by daveslab
    Hi folks, I have a very large Oracle database, with many, many tables and millions of rows. I need to delete one of those rows, but want to make sure that deleting it will not break any other dependent rows that point to it as a foreign key record. Is there a way to get a list of all the other records, or at least the table schemas, that point to this row? I know that I could just try to delete it myself and catch the exception, but I won't be running the script myself and need it to run cleanly the first time through. I have the tools SQL Developer from Oracle and PL/SQL Developer from AllRoundAutomations at my disposal. Thanks in advance!
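
    Oracle's data dictionary can at least list the child tables; the sketch below uses the standard all_constraints view, with MY_TABLE as a placeholder for the parent table's name. Finding the actual dependent rows then means querying each listed child table for the row's key value.

      -- Foreign-key constraints (and their tables) that reference
      -- MY_TABLE's primary or unique keys.
      SELECT c.owner, c.table_name, c.constraint_name
      FROM   all_constraints c
      WHERE  c.constraint_type = 'R'
      AND    (c.r_owner, c.r_constraint_name) IN (
               SELECT p.owner, p.constraint_name
               FROM   all_constraints p
               WHERE  p.table_name = 'MY_TABLE'
               AND    p.constraint_type IN ('P', 'U')
             );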

    Read the article

  • Export MySQL Data as Insert Statements

    - by gav
    Hi all, I'm working in Ubuntu with MySQL, and I also have Query Browser and Administrator installed; I'm not afraid of the command line either if it helps. I simply want to be able to run a query and see a result set, but then convert that result set into a series of commands that could be used to create the same rows in a table of an identical schema. I hope the question makes sense; it's quite a simple problem and one that must have been solved, but I can't for the life of me work out where this kind of conversion is made available. Thanks in advance, Gav
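
    mysqldump with its --where option is the usual tool for dumping a filtered set of rows as INSERT statements. If it has to stay inside a plain query, one pure-SQL sketch is below; the table, columns and filter (people, id, name, the LIKE) are placeholders for whatever the real query selects.

      -- Build the INSERT statements as strings; QUOTE() escapes the values.
      SELECT CONCAT('INSERT INTO people (id, name) VALUES (',
                    QUOTE(id), ', ', QUOTE(name), ');') AS insert_stmt
      FROM people
      WHERE name LIKE 'G%';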

    Read the article

  • MySQLi insert into prepare error

    - by JPM
    Hi, I inserted a lot of stuff into a MySQL database, but now I get an error on the prepare statement: all I see is "Database prepare error". What am I doing wrong? This is my code:

      $sql = "INSERT INTO Contact (IP,To,Name,Email,Subject,Text) VALUES ( ?, ?, ?, ?, ?, ? ) ";
      if (!$stmt = $db->prepare($sql)) {
          echo 'Database prepare error';
          exit;
      }
      $stmt->bind_param('ssssss', $ip_contact, $to_contact, $name_contact,
                        $email_contact, $subject_contact, $text_contact);
      if (!$stmt->execute()) {
          echo 'Database execute error';
          exit;
      }
      $stmt->close();

    My SQL table looks like this:

      Contact:
      - ID      int(11) auto_increment primary key
      - IP      varchar(15)
      - To      varchar(5)
      - Name    varchar(20)
      - Email   varchar(20)
      - Subject varchar(20)
      - Text    varchar(600)
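
    A likely culprit, though it is only an assumption without the text of $db->error: To is a reserved word in MySQL, so the unquoted column list makes the INSERT fail to parse, which surfaces as a failed prepare(). Backtick-quoting the identifiers, as sketched below, sidesteps that; echoing $db->error after a failed prepare() would confirm the real cause.

      -- Same statement with every identifier quoted, so `To` no longer
      -- collides with the reserved word TO.
      INSERT INTO `Contact` (`IP`, `To`, `Name`, `Email`, `Subject`, `Text`)
      VALUES (?, ?, ?, ?, ?, ?);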

    Read the article

  • Extracting Data Daily from MySQL to a Local MySQL DB

    - by Sunny Juneja
    I'm doing some experiments locally that require some data from a production MySQL DB that I only have read access to. The schemas are nearly identical, with the exception of the omission of one column. My goal is to write a script that I can run every day that extracts the previous day's data and imports it into my local table. The part that I'm most confused about is how to download the data. I've seen names like mysqldump tossed around, but that seems like a way to replicate the entire database. I would love to avoid using PHP, seeing as I have no experience with it. I've been creating CSVs, but I'm worried about data integrity (what if there is a comma or a \n in a field?) as well as the size of the CSV (there are several hundred thousand rows per day).
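
    mysqldump does not have to replicate the whole database; its --where option restricts the dump to matching rows and emits ready-made INSERT statements, which avoids the CSV escaping worries entirely. The condition itself could look like the sketch below; the table name (events) and timestamp column (created_at) are assumptions about the production schema.

      -- Yesterday's rows only: at or after yesterday's midnight, strictly before today's.
      SELECT *
      FROM events
      WHERE created_at >= CURDATE() - INTERVAL 1 DAY
        AND created_at <  CURDATE();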

    Read the article

  • IN query in Entity Framework

    - by Syed Salman Raza Zaidi
    I am working with Entity Framework and have created a method that returns a List of one of my tables. I am retrieving data based on GroupId (which is a foreign key, so I can have multiple records). I have saved these GroupIds in an array, and I want to run an IN-style query in Entity Framework so that I can get the records in a single List. How can I apply the IN condition? My code is below:

      public List<tblResource> GetResources(long[] grpid)
      {
          try
          {
              // This does not work, because grpid is an array of group ids.
              return dataContext.tblResource.Where(c => c.GroupId == grpid && c.IsActive == true).ToList();
          }
          catch (Exception ex)
          {
              return ex;
          }
      }

    Read the article

  • Dynamic upsert in postgresql

    - by Daniel
    I have this upsert function that allows me to modify the fill_rate column of a row.

      CREATE FUNCTION upsert_fillrate_alarming(integer, boolean) RETURNS VOID AS '
      DECLARE
          num ALIAS FOR $1;
          dat ALIAS FOR $2;
      BEGIN
          LOOP
              -- First try to update.
              UPDATE alarming SET fill_rate = dat WHERE equipid = num;
              IF FOUND THEN
                  RETURN;
              END IF;
              -- Since it is not there, we try to insert the key.
              -- Notice that a concurrent key insertion would raise an error.
              BEGIN
                  INSERT INTO alarming (equipid, fill_rate) VALUES (num, dat);
                  RETURN;
              EXCEPTION WHEN unique_violation THEN
                  -- Loop and try the update again.
              END;
          END LOOP;
      END;
      ' LANGUAGE 'plpgsql';

    Is it possible to modify this function to take a column argument as well? Extra bonus points if there is a way to modify the function to take a column and a table.
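
    A sketch of the dynamic variant is below. It assumes the column still lives in the alarming table and holds a boolean; quote_ident() quotes the column name safely, the values travel through EXECUTE ... USING (PostgreSQL 8.4+), and GET DIAGNOSTICS replaces FOUND, which plain EXECUTE does not set. Taking the table name as a further argument would work the same way, with a second quote_ident().

      CREATE OR REPLACE FUNCTION upsert_column_alarming(num integer, dat boolean, col text)
      RETURNS void AS $$
      DECLARE
          rows_updated integer;
      BEGIN
          LOOP
              -- First try to update the dynamically named column.
              EXECUTE 'UPDATE alarming SET ' || quote_ident(col) || ' = $1 WHERE equipid = $2'
                  USING dat, num;
              GET DIAGNOSTICS rows_updated = ROW_COUNT;
              IF rows_updated > 0 THEN
                  RETURN;
              END IF;
              BEGIN
                  EXECUTE 'INSERT INTO alarming (equipid, ' || quote_ident(col) || ') VALUES ($1, $2)'
                      USING num, dat;
                  RETURN;
              EXCEPTION WHEN unique_violation THEN
                  -- Another session inserted the key first; loop and retry the update.
              END;
          END LOOP;
      END;
      $$ LANGUAGE plpgsql;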

    Read the article

  • What is "helpx_last_check" from wordpress database

    - by bvandrunen
    I am a developer who works full time with WordPress, and I came across something in my database which I have never seen before. I tried the normal search engine approach and found nothing, so I'm wondering if the wonderful people of Stack Overflow have seen this before; I am pretty sure it isn't harmful. This is the entry in the MySQL database (the table is "_options"):

      option_id:    1165
      blog_id:      0
      option_name:  helpx_last_check
      option_value: 1276628545
      autoload:     yes

    I am specifically wondering what "helpx_last_check" is. Thanks

    Read the article

  • Should I continue using R v2.8.1 ?

    - by Mehper C. Palavuzlar
    I've been using R v2.8.1 for a long time. Normally I would upgrade to the latest version, but something keeps me away from the builds later than 2.8.1: I use read.table(file=file.choose(), header=TRUE) frequently in my libraries. After upgrading to 2.9.0, R stopped remembering the last directory used when selecting a file. I downgraded to 2.8.1 and now R remembers the last directory again. I don't know why they changed this behaviour, but it is absolutely crucial for me; it wastes my time in v2.9.0 every time I have to hunt down a specific directory that R cannot remember. Now R 2.10.1 is released, and I don't know if they have corrected this issue. Should I upgrade, or is it enough to continue using v2.8.1? Will I miss something if I stick with 2.8.1?

    Read the article

  • Oracle Triggers Query..

    - by AGeek
    Let's consider a table STUD with a row-level trigger implemented on the INSERT statement. My scenario goes like this: whenever a row is inserted, the trigger is fired, and it should access a script file placed on the hard disk and ultimately print the result. So, is this possible? And if yes, it should work dynamically, i.e. if we change the content of the script file, Oracle should reflect those changes as well. I have tried doing this for Java using external procedures, but I was not satisfied with the result. Kindly give your point of view on this kind of scenario and the ways it could be implemented.

    Read the article

  • MySQL command-line tool: How to find out number of rows affected by a DELETE?

    - by ambivalence
    I'm trying to run a script that deletes a bunch of rows in a MySQL (innodb) table in batches, by executing the following in a loop: mysql --user=MyUser --password=MyPassword MyDatabase < SQL_FILE where SQL_FILE contains a DELETE FROM ... LIMIT X command. I need to keep running this loop until there's no more matching rows. But unlike running in the mysql shell, the above command does not return the number of rows affected. I've tried -v and -t but neither works. How can I find out how many rows the batch script affected? Thanks!
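
    One way to sketch this without parsing the client's verbose output: MySQL's ROW_COUNT() function reports how many rows the previous statement changed on the same connection, so SQL_FILE can simply SELECT it after the DELETE and the loop can read the number from stdout (running the client with --skip-column-names prints just the value). The table, column and batch size below are placeholders.

      -- Contents of SQL_FILE: delete one batch, then report how many rows went.
      DELETE FROM my_table WHERE created < '2010-01-01' LIMIT 10000;
      SELECT ROW_COUNT();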

    Read the article

  • How do I take advantage of the variable column key/value structure of Cassandra

    - by icejade
    If I have to predefine all the columns, how do I take advantage of the variable column key/value structure of Cassandra? If I alter the table to add a column, it will insert null for all the rows which don't have that column, which is the same as a relational DB. For example, for a contact column family I have name, phone and email, and 100 contacts have all 3 fields. Then contact number 101 has a Skype ID which I want to add. If I use an insert statement, it won't let me add skypeid since it's not defined in the CF, so I have to run an alter statement to change the CF, and then all of the first 100 contacts will have a null field for it.

    Read the article

  • Using fixtures with factory_girl

    - by deb
    When building the following factory: Factory.define :user do |f| f.sequence(:name) { |n| "foo#{n}" } f.resume_type_id { ResumeType.first.id } end ResumeType.first returns nil and I get an error. ResumeType records are loaded via fixtures. I checked using the console and the entries are there, the table is not empty. I've found a similar example in the factory_girl mailing list, and it's supposed to work. What am I missing? Do I have to somehow tell factory_girl to set up the fixtures before running the tests?

    Read the article

  • Best place to store large amounts of session data

    - by audiopleb
    I'm building an application that needs to store and re-use large amounts of data per session. For example, the user selects a large list of list items (say 2000 or significantly more) which have a numeric value as their key, then saves that selection, goes off to another page, does something else, and then comes back to the original page and needs to load their selections into that page. What is the quickest and most efficient way of storing and reusing that data? In a text file saved with the session id? In a temp DB table? In the session data itself (DB sessions, so size isn't a limit), using a serialised string or gzcompress/gzencode? Any advice or insight would be great! Thank you!

    Read the article

  • CakePHP hasAndBelongsToMany (HABTM) Delete Joining Record

    - by Jason McCreary
    I have a HABTM relationship between Users and Locations, and both models have the appropriate $hasAndBelongsToMany variable set. When I am managing a User's Locations, I want to delete the association between the User and the Location, but not the Location itself; clearly that Location could belong to other users. I would expect the following code to delete just the join table record, given the HABTM associations, but it deleted both records: $this->Weather->deleteAll(array('Weather.id' => $this->data['weather_ids']), false); However, I am new to CakePHP, so I am sure I am missing something. I have tried setting cascade to false and changing the Model order with User, User-Weather, Weather-User. No luck. Thanks in advance for any help.

    Read the article
