Search Results

Search found 6399 results on 256 pages for 'record'.

Page 187 of 256

  • Caching result of SELECT statement for reuse in multiple queries

    - by Andrew
    I have a reasonably complex query to extract the Id field of the results I am interested in based on parameters entered by the user. After extracting the relevant Ids I am using the resulting set of Ids several times, in separate queries, to extract the actual output record sets I want (by joining to other tables, using aggregate functions, etc). I would like to avoid running the initial query separately for every set of results I want to return. I imagine my situation is a common pattern so I am interested in what the best approach is. The database is in MS SQL Server and I am using .NET 3.5.
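
    One pattern that fits this (a sketch only, with assumed table and column names, not the article's answer): run the complex filter once, materialize the matching Ids into a temp table or table variable, and join every output query to it.

        -- Assumed names (Orders, OrderLines, @UserParam) are illustrative only.
        CREATE TABLE #MatchingIds (Id INT PRIMARY KEY);

        INSERT INTO #MatchingIds (Id)
        SELECT o.Id
        FROM   Orders o
        WHERE  o.Status = @UserParam;      -- the "reasonably complex" filter goes here, run once

        -- Each output query reuses the cached Id set instead of repeating the filter.
        SELECT l.OrderId, COUNT(*) AS LineCount
        FROM   #MatchingIds m
        JOIN   OrderLines l ON l.OrderId = m.Id
        GROUP  BY l.OrderId;

        DROP TABLE #MatchingIds;

    From .NET 3.5 the same idea works by wrapping the whole thing in one stored procedure, or by fetching the Ids once and passing them back in for the follow-up queries.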

    Read the article

  • rails - form to display non-input type fields in nested form

    - by nktokyo
    Hi, I guess this is a newbie question, but what is the syntax in a form to show the contents of fields not as a text box/area, but rather the way a label would appear?

        <% form_for @user do |f| %>
          <% f.fields_for :user_ingreds do |builder| %>
            <p>
              <%= builder.??? %>
            </p>
          <% end %>
        <% end %>

    user has_many :user_ingreds and accepts_nested_attributes_for :user_ingreds. Basically I want to make a list of user_ingreds where the user can't edit the data but can remove the record from the list via a button. However, the fields_for builder doesn't recognize a direct call to the fields in the user_ingreds model (i.e., builder.user_id throws an error).
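
    A possible sketch (the 'name' column and allow_destroy option are assumptions): inside fields_for, builder.object exposes the underlying user_ingred record, so its attributes can be printed as plain text next to a hidden id and a _destroy checkbox for removal.

        <% form_for @user do |f| %>
          <% f.fields_for :user_ingreds do |builder| %>
            <p>
              <%= h builder.object.name %>        <!-- read-only text instead of an input -->
              <%= builder.hidden_field :id %>
              <%= builder.check_box :_destroy %> Remove
            </p>
          <% end %>
        <% end %>

    The _destroy checkbox only deletes the record if the model uses accepts_nested_attributes_for :user_ingreds, :allow_destroy => true.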

    Read the article

  • What tools do people use to make programming tutorial videos?

    - by Pure.Krome
    Hi folks, I'm wanting to make some run-of-the-mill tutorial videos about some programming concepts and stuff I've been doing. Nothing special ... lots of people have been doing it. What tools are people using to record and edit these videos? What resolutions / fonts / sizes do people generally use/set? The only tool I've had experience with is Camtasia, and I didn't mind it. But I've seen videos (or live demos) where people zoom in to code sections... how do they do that? For final editing, do most people just do a simple PowerPoint presentation with some video snippets mashed in? Cheers!

    Read the article

  • EXC_BAD_ACCESS from AudioBuffer

    - by jfalexvijay
    I am trying to record audio using an AudioUnit in an iPhone app.

    Changes (start): I have added the following code:

        bufferList = (AudioBufferList *)malloc(sizeof(AudioBuffer));
        bufferList->mNumberBuffers = 1;
        bufferList->mBuffers[0].mNumberChannels = 2;
        bufferList->mBuffers[0].mDataByteSize = 1024;
        bufferList->mBuffers[0].mData = calloc(256, sizeof(uint32_t));

    Changes (end).

        static OSStatus recordingCallback(void *inRefCon,
                                          AudioUnitRenderActionFlags *ioActionFlags,
                                          const AudioTimeStamp *inTimeStamp,
                                          UInt32 inBusNumber,
                                          UInt32 inNumberFrames,
                                          AudioBufferList *ioData)
        {
            OSStatus status;
            status = AudioUnitRender(appdelegate->audioUnit, ioActionFlags, inTimeStamp,
                                     inBusNumber, inNumberFrames, appdelegate->bufferList);
            if (status != 0)
                NSLog(@"AudioUnitRender status is %d", status);
            SInt16 *samples = (SInt16 *)(ioData->mBuffers[0].mData);
            .....
        }

    Fixed: (I was getting OSStatus error code -50) because I didn't initialize the bufferList. Now I am getting EXC_BAD_ACCESS from the AudioBuffer (ioData->mBuffers[0].mData). I am not sure about this error. Please help me to resolve it.

    Read the article

  • Which database should I use for best performance

    - by _simon_
    Hello, I am working in Visual Studio 2005, .NET 2.0. I need to write an application which listens on a COM port and saves the incoming data to a database. Main feature: save the incoming data (a series of 13-digit numbers); if a number already exists, mark it as a double. For example, the database could contain these records:

        0000000000001 OK
        0000000000002 OK
        0000000000002 Double
        0000000000003 OK
        0000000000004 OK

    I could use an SQL database, but I don't know if it is fast enough... The database should be able to store up to 10,000,000 records and write up to 100 records per minute (so it needs to check 100 times per minute whether a record already exists). Which database should I use? Maybe the whole database would need to be in RAM. Where could I learn more about this? Thanks
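
    For scale, 100 writes per minute against an indexed 13-character column is light work for an ordinary disk-based engine such as SQL Server Express, so an in-memory database should not be necessary. A sketch with assumed table and column names:

        CREATE TABLE ScannedCodes (
            Id     INT IDENTITY(1,1) PRIMARY KEY,
            Code   CHAR(13)    NOT NULL,
            Status VARCHAR(10) NOT NULL
        );
        CREATE INDEX IX_ScannedCodes_Code ON ScannedCodes (Code);

        -- Per incoming number: one indexed lookup decides OK vs Double.
        INSERT INTO ScannedCodes (Code, Status)
        SELECT @Code,
               CASE WHEN EXISTS (SELECT 1 FROM ScannedCodes WHERE Code = @Code)
                    THEN 'Double'
                    ELSE 'OK'
               END;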

    Read the article

  • Getting a certain array from a multidimensional array

    - by Leron
    I have a multidimensional array which is the result of a query returning the info from a table named 'users'. In another part of my code I need to get the records of only one particular user, and I want to take them from the array I mentioned above. It's of the form:

        array(24) {
            [0] => array(9) { ["id"] => string(1) "1" ... }
            [1] => array(9) { ["id"] => string(1) "2" ... }
            [2] => array(9) { ["id"] => string(1) "5" ... }
            ...
        }

    I'll use foreach, comparing by ["id"], to find the record I need, but when I get a match I'm not sure how to extract only that array from the parent one. Thanks, Leron
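
    A minimal sketch of the extraction step (the function name is made up): return the matching sub-array from inside the loop, which detaches it from the parent array and stops the iteration at the same time.

        <?php
        function find_user_row(array $rows, $wanted_id) {
            foreach ($rows as $row) {
                if ($row['id'] == $wanted_id) {
                    return $row;        // only the matching record, independent of the parent array
                }
            }
            return null;                // no record with that id
        }

        $user = find_user_row($users, '5');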

    Read the article

  • How to enforce DB field size limit to related Java String?

    - by Sri Sankaran
    What is the correct way to defend against trying to store more text than can be accommodated in a VARCHAR2 database field? Say the PARTNER_ROLE field of the REGISTRATIONS table is declared as VARCHAR2(80). This field is mapped in the Registration class in Java as:

        public class Registration {
            @Column(name = "PARTNER_ROLE", length = 80)
            private String partnerRole;
        }

    However, the setPartnerRole() method allows the user to stuff in a string of any length. The problem is encountered only when one subsequently tries to insert or update the REGISTRATIONS record: Oracle complains. What is the correct way to handle this situation?
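
    A defensive sketch (the shared constant is an assumption): enforce the limit in the setter so the object can never hold a value the column cannot store, instead of waiting for Oracle to reject the insert or update. Bean Validation's @Size plus a validation pass before persisting is another common route.

        import javax.persistence.Column;

        public class Registration {

            private static final int PARTNER_ROLE_MAX = 80;   // keep in sync with VARCHAR2(80)

            @Column(name = "PARTNER_ROLE", length = PARTNER_ROLE_MAX)
            private String partnerRole;

            public void setPartnerRole(String partnerRole) {
                if (partnerRole != null && partnerRole.length() > PARTNER_ROLE_MAX) {
                    throw new IllegalArgumentException(
                        "partnerRole longer than " + PARTNER_ROLE_MAX + " characters");
                }
                this.partnerRole = partnerRole;
            }
        }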

    Read the article

  • How to use an Oracle package to get rid of a global temp table

    - by john
    I have a sample query like the one below:

        INSERT INTO my_gtt_1 (fname, lname)
        (select fname, lname from users)

    In my effort to get rid of temporary tables I created a package:

        create or replace package fname_lname as
            type fname_lname_rec_type is record (
                fname varchar2(10),
                lname varchar2(10)
            );
            fname_lname_rec fname_lname_rec_type;
            type fname_lname_tbl_type is table of fname_lname_rec_type;
            function fname_lname_func (
                v_fname in varchar2,
                v_lname in varchar2
            ) return fname_lname_tbl_type pipelined;
        end fname_lname;

    Being new to Oracle, creating this package took a long time, but now I cannot figure out how to get rid of my_gtt_1. How can I say...

        INSERT INTO <newly created package> (select fname, lname from users)
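
    You cannot INSERT INTO a package; the pipelined function replaces the temp table as a row source, so callers query it with TABLE() instead. A sketch of a matching package body and call, reusing the question's names (the unused parameters are kept only to match the spec above):

        create or replace package body fname_lname as
            function fname_lname_func (
                v_fname in varchar2,
                v_lname in varchar2
            ) return fname_lname_tbl_type pipelined is
                rec fname_lname_rec_type;
            begin
                for r in (select fname, lname from users) loop
                    rec.fname := r.fname;
                    rec.lname := r.lname;
                    pipe row (rec);
                end loop;
                return;
            end;
        end fname_lname;
        /

        -- Wherever my_gtt_1 used to be read:
        select fname, lname
        from   table(fname_lname.fname_lname_func(null, null));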

    Read the article

  • Setting parameters after obtaining their values in stored procedures

    - by user1260028
    Right now I have an upload field which uploads files to the server. The prefix is saved so that it can later be used for retrieval. For this I need to attach the ID of the form to the prefix. I would like to be able to do this as such:

        @filePrefix = SCOPE_IDENTITY() + @filePrefix;

    However, I am not so sure this would work because the record has not been created yet. If anything, I could call an update function which obtains the ID and then injects it into the row after it has been created. To speed things up, I don't want to do this on the server but rather in the database. Regardless of what the approach is, I would still like to know if something like the above is possible (at least for future reference). So if we replace that with

        @filePrefix = 5 + @filePrefix;

    would that be possible? SQL doesn't seem to like the current syntax very much...
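
    SCOPE_IDENTITY() only has a value after the INSERT has executed in the same scope, so the usual pattern inside the stored procedure is insert first, capture the new ID, then set the prefix in a follow-up UPDATE. A sketch with assumed table and column names; note the explicit CAST, since + with a varchar parameter attempts numeric addition, which is why "5 + @filePrefix" is rejected.

        DECLARE @NewId INT;

        INSERT INTO Uploads (FileName, FilePrefix)
        VALUES (@FileName, @FilePrefix);

        SET @NewId = SCOPE_IDENTITY();

        UPDATE Uploads
        SET    FilePrefix = CAST(@NewId AS VARCHAR(20)) + FilePrefix
        WHERE  Id = @NewId;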

    Read the article

  • Replacing empty csv column values with a zero

    - by homerjay
    Hey, so I'm dealing with a csv file that has missing values. What I want my script to do is:

        #!/usr/bin/python
        import csv
        import sys

        # 1. Place each record of a file in a list.
        # 2. Iterate thru each element of the list and get its length.
        # 3. If the length is less than one, replace it with value x.

        reader = csv.reader(open(sys.argv[1], "rb"))
        for row in reader:
            for x in row[:]:
                if len(x) < 1:
                    x = 0
                print x
            print row

    Here is an example of the data I am trying it on; ideally it should work on any column length.

    Before:

        actnum,col2,col4
        xxxxx , ,
        xxxxx , 845 ,
        xxxxx , ,545

    After:

        actnum,col2,col4
        xxxxx , 0 , 0
        xxxxx , 845, 0
        xxxxx , 0 ,545

    Any guidance would be appreciated
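
    Rebinding x inside the loop never changes row, which is why nothing gets replaced. A sketch of one way to write the replacement back (Python 2, matching the csv/"rb" style above): build a fixed copy of each row and write it out.

        #!/usr/bin/python
        import csv
        import sys

        reader = csv.reader(open(sys.argv[1], "rb"))
        writer = csv.writer(sys.stdout)

        for row in reader:
            # Replace any field that is empty (or only spaces) with "0".
            fixed = [field if field.strip() else "0" for field in row]
            writer.writerow(fixed)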

    Read the article

  • T-SQL Self Join in combination with aggregate function

    - by Nick
    Hi, I have the following table:

        CREATE TABLE [dbo].[Tree](
            [AutoID] [int] IDENTITY(1,1) NOT NULL,
            [Category] [varchar](10) NULL,
            [Condition] [varchar](10) NULL,
            [Description] [varchar](50) NULL,
            CONSTRAINT [PK_Tree] PRIMARY KEY CLUSTERED
            (
                [AutoID] ASC
            ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
                    ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
        ) ON [PRIMARY]
        GO

    The data looks like this:

        INSERT INTO [Test].[dbo].[Tree] ([Category], [Condition], [Description]) VALUES ('1', 'Alpha', 'Type 1')
        INSERT INTO [Test].[dbo].[Tree] ([Category], [Condition], [Description]) VALUES ('1', 'Alpha', 'Type 1')
        INSERT INTO [Test].[dbo].[Tree] ([Category], [Condition], [Description]) VALUES ('2', 'Alpha', 'Type 2')
        INSERT INTO [Test].[dbo].[Tree] ([Category], [Condition], [Description]) VALUES ('2', 'Alpha', 'Type 2')
        GO

    Now I am trying to do the following:

        SELECT Category, COUNT(*) as CategoryCount
        FROM Tree
        WHERE Condition = 'Alpha'
        GROUP BY Category

    but I also wish to get the Description for each element. I have tried several subqueries, self joins, etc.; I always run into the problem that the subquery cannot return more than one record. The problem is caused by a poor database design which I cannot change, and I have run out of ideas for getting this done in a single query ;-(
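
    One way to keep the per-category count while still returning a Description is a windowed COUNT, which sidesteps the single-row subquery limit (whether DISTINCT is what you want depends on how Description varies within a category):

        SELECT DISTINCT
               Category,
               Description,
               COUNT(*) OVER (PARTITION BY Category) AS CategoryCount
        FROM   Tree
        WHERE  Condition = 'Alpha';

    With the sample data this returns ('1', 'Type 1', 2) and ('2', 'Type 2', 2).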

    Read the article

  • Why is my left join not returning nulls?

    - by Griz
    In sql server 2008, I have the following query:

        select c.title as categorytitle, s.title as subcategorytitle, i.title as itemtitle
        from categories c
        join subcategories s on c.categoryid = s.categoryid
        left join itemcategories ic on s.subcategoryid = ic.subcategoryid
        left join items i on ic.itemid = i.itemid
        where (ic.isactive = 1 or ic.isactive is null)
        and i.siteid = 132
        order by c.title, s.title

    I am trying to get items in their subcategories, but I still want to return a record if there are no items in the category or subcategory. Subcategories that have no items are never returned. What am I doing wrong? Thank you

    EDIT: Modified query with a second left join and where clause, but it's still not returning nulls. :/
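
    A likely culprit is the filter i.siteid = 132 in the WHERE clause: it discards exactly the rows where the items side is NULL, turning the outer join back into an inner join. A sketch of the same query with both item-side conditions moved into the joins:

        select c.title as categorytitle, s.title as subcategorytitle, i.title as itemtitle
        from categories c
        join subcategories s on c.categoryid = s.categoryid
        left join itemcategories ic on s.subcategoryid = ic.subcategoryid
                                   and ic.isactive = 1
        left join items i on ic.itemid = i.itemid
                         and i.siteid = 132
        order by c.title, s.title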

    Read the article

  • Changing the MySQL time_zone does not always take effect

    - by LearneR
    I have two tables:

        table-1
        id    date-time
        -----------------------
        1     2012-12-13 15:20:13

        table-2
        id    date-time
        -----------------------
        1     2012-12-13 15:20:13

    Now I am selecting the records with the MySQL time_zone setting.

    Case-1:

        SET time_zone = '+00:00';
        SELECT `date-time` FROM `table-1`;   -- 2012-12-13 09:50:13

    Case-2:

        SET time_zone = '+00:00';
        SELECT `date-time` FROM `table-2`;   -- 2012-12-13 15:20:13 (not converted to the specified timezone)

    In Case-1 it gives the converted date-time, but not in Case-2. What would be the issue?
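
    The usual explanation is the column type: SET time_zone only affects how TIMESTAMP values are converted on retrieval, while DATETIME values come back exactly as stored. Assuming table-2's date-time is a DATETIME (the DDL isn't shown) and the data was written at +05:30 (inferred from the 5.5-hour shift in Case-1), it can be converted explicitly:

        SET time_zone = '+00:00';

        SELECT CONVERT_TZ(`date-time`, '+05:30', '+00:00') AS `date-time-utc`
        FROM   `table-2`;

    Using numeric offsets keeps CONVERT_TZ working even when the named time zone tables are not loaded.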

    Read the article

  • Binding multiple objects in Grails

    - by WaZ
    I have three domain classes:

        Person      (Person.ID, Name, Address)
        Designation (Designation.ID, Title, Band)
        SalarySlip  (Person.ID, Designation.ID, totalIncome, Tax, etc.)

    In the update method of the person controller, when I assign a person a designation from a list of designation values, I want to insert a new record into SalarySlip. Something like:

        def update = {
            def salarySlipInstance = new SalarySlip()
            salarySlipInstance.Person.ID = params.ID        // is this correct?
            salarySlipInstance.Designation.ID = ??          // since the value is coming from a list, how can I bind this field?
        }

    Much appreciated, thanks, WB
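
    A sketch of one way to bind it (assuming SalarySlip declares person and designation associations rather than raw ID columns, and that the selected designation arrives as params.designationId): load both sides with get() and let GORM persist the foreign keys.

        def update = {
            def person = Person.get(params.id)
            def designation = Designation.get(params.designationId)   // value chosen from the list

            def salarySlip = new SalarySlip(person: person, designation: designation)
            if (!salarySlip.save()) {
                salarySlip.errors.allErrors.each { log.error it }
            }
        }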

    Read the article

  • How to improve the speed of a loop containing a sqlalchemy query statement as conditional

    - by LtPinback
    This loop checks whether a record is in the sqlite database, builds a list of dictionaries for the records that are missing, and then executes a multiple insert statement with the list. This works, but it is very slow (at least I think it is slow), as it takes 5 minutes to loop over 3500 queries. I am a complete newbie in Python, sqlite and sqlalchemy, so I wonder if there is a faster way of doing this.

        list_dict = []
        session = Session()
        for data in data_list:
            if session.query(Class_object)\
                      .filter(Class_object.column_name_01 == data[2])\
                      .filter(Class_object.column_name_00 == an_id)\
                      .count() == 0:
                list_dict.append({'column_name_00': a_id, 'column_name_01': data[2]})
        conn = engine.connect()
        conn.execute(prices.insert(), list_dict)
        conn.close()
        session.close()

    Edit: I moved session = Session() outside the loop. It did not make a difference.
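
    A sketch of one way to avoid the 3500 individual COUNT queries (names reused from the question, untested against the real schema): pull the existing pairs for this id once into a set and test membership in Python.

        list_dict = []
        session = Session()

        # One query up front: every (column_name_00, column_name_01) pair already stored for this id.
        existing = set(
            session.query(Class_object.column_name_00, Class_object.column_name_01)
                   .filter(Class_object.column_name_00 == an_id)
                   .all()
        )

        for data in data_list:
            if (an_id, data[2]) not in existing:
                list_dict.append({'column_name_00': an_id, 'column_name_01': data[2]})
                existing.add((an_id, data[2]))    # also de-duplicates within data_list

        if list_dict:
            conn = engine.connect()
            conn.execute(prices.insert(), list_dict)
            conn.close()
        session.close()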

    Read the article

  • Getting "too many rows" error inside a "for" cursor loop

    - by Will
    I have a trigger that contains two cursor loops, one nested inside the other like this:

        FOR outer_rec IN outer_cursor LOOP
            FOR inner_rec IN inner_cursor LOOP
                -- Do some calculations
            END LOOP;
        END LOOP;

    Somewhere in this it is throwing the following error:

        ORA-01422: exact fetch returns more than requested number of rows

    I've been trying to determine where it's coming from for an hour or so... but shouldn't this error be impossible here? Also, I am assuming the inner loop automatically closes and opens itself again every time the outer loop moves to the next record; I hope this is correct.
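
    ORA-01422 does not come from the cursor FOR loops themselves (they fetch row by row, and yes, the inner cursor is re-opened on every outer iteration); it comes from a SELECT ... INTO inside the calculations matching more than one row. A sketch of how to locate and guard the offending statement (all names are placeholders):

        begin
            select t.some_value
            into   v_some_value
            from   some_table t
            where  t.some_key = inner_rec.some_key;   -- ORA-01422 if this matches two or more rows
        exception
            when too_many_rows then
                -- log enough context to identify the duplicate key, then re-raise or handle it
                dbms_output.put_line('duplicate rows for key ' || inner_rec.some_key);
                raise;
        end;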

    Read the article

  • Get binary data from audio impulses

    - by Timo
    I have an IR sensor with a TRS plug, and I can record my remote's signals as audio. Now I want to control my computer with a TV remote, but I don't have any clue how to compare the audio input with pre-recorded audio. Then I realized that these audio waves contain only some kind of data (binary), so I can turn them into binary or hex, which is much easier to compare. The waves look just like this: http://i.imgur.com/lCIyl.png and this: http://i.imgur.com/goJ6d.png. These are recordings of the "OK" button; sometimes there are some impulses on the right channel too and I don't know why, maybe the connections in the sensor are damaged. OK, that doesn't matter; anyway, I need help with a Python program which reads these impulses and turns them into binary, in real time from the audio input (mic). I know it sounds like "do it for me, while I enjoy my life", but I don't have experience with sound transforming/reading... I've been looking for Python examples of recording and reading audio, but unsuccessfully.
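
    A rough sketch of the decoding half, working on a saved WAV capture rather than the live mic (mono 16-bit audio assumed; the threshold and chunk size are guesses to tune against the recordings): slice the samples into short chunks and emit 1 for a chunk containing a pulse and 0 for silence, so two button presses can be compared as strings.

        import struct
        import wave

        def pulses_to_bits(path, threshold=5000, chunk=50):
            # Read a mono 16-bit WAV file of the IR capture.
            w = wave.open(path, "rb")
            frames = w.readframes(w.getnframes())
            w.close()
            samples = struct.unpack("<%dh" % (len(frames) // 2), frames)

            # A chunk counts as '1' when any sample in it exceeds the threshold.
            bits = []
            for i in range(0, len(samples), chunk):
                loud = max(abs(s) for s in samples[i:i + chunk]) > threshold
                bits.append("1" if loud else "0")
            return "".join(bits)

        print pulses_to_bits("ok_button.wav")

    The live-mic version is the same idea with chunks read from an input stream (for example via PyAudio) instead of a file.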

    Read the article

  • How to go 'back' 2 levels?

    - by Joe McGuckin
    From the list view of my app, I can view a list of records or drill down and edit/update a record. After updating, I want to go directly back to the list view, bypassing a couple of intermediate pages - but I don't simply want to link_to(:action => list) - there's pagination involved. I want to go back to the exact 'list' page I came from. What's the best way? Pass a hidden arg somewhere with the page number? Is there an elegant way to accomplish this?
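
    One simple sketch (route and parameter names are assumptions): carry the current page number along to the edit form and hand it back on redirect, so the list reopens on the exact page you left.

        <%# list view: pass the page along with the edit link %>
        <%= link_to 'Edit', edit_record_path(record, :page => params[:page]) %>

        # controller update action, after saving:
        redirect_to records_path(:page => params[:page])

    The edit form also needs to forward params[:page] (for example as a hidden field) so it is still present when update runs. Storing it in the session works too, but the URL approach survives multiple tabs.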

    Read the article

  • ASP.NET Speed up DataView sorting/paging

    - by rlb.usa
    I have a page in ASP.NET where I'm using a Repeater to display a record listing, but it's slow as molasses, and I've been tasked with speeding it up (sorting, paging). I've got it set up as follows:

        1. When the user enters the page, grab all of the data from the database (500 records, up to 4 related records each).
        2. Store it all in Application["MyDataView"].
        3. On sort or paging, simply use the data view's internal sort/page method (no db calls) and rebind.

    I understand that databases can take time to query, but simply having the DataView call its sort method (no db calls) takes 10-ish seconds, which is alarmingly slow. Two questions: Why is it taking so long? How can I speed it up? A GridView is not possible.
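
    Sorting 500 rows in a DataView should be nearly instant, so the 10 seconds are more likely spent re-binding every row to the Repeater; binding only the current page usually fixes it. A sketch with assumed control and variable names:

        DataView view = (DataView)Application["MyDataView"];   // cached at page entry
        view.Sort = sortExpression;                            // e.g. "LastName ASC"

        PagedDataSource pds = new PagedDataSource();
        pds.DataSource = view;
        pds.AllowPaging = true;
        pds.PageSize = 25;
        pds.CurrentPageIndex = currentPage;                    // tracked in ViewState, for example

        recordRepeater.DataSource = pds;
        recordRepeater.DataBind();

    Caching per user in Session (or a keyed Cache entry) rather than Application is also worth considering if the 500 rows differ between users.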

    Read the article

  • LINQ to SQL update data

    - by pranay
    Can I update my employee record as in the function below, or do I have to query the employee collection first and then update the data?

        public int updateEmployee(App3_EMPLOYEE employee)
        {
            DBContextDataContext db = new DBContextDataContext();
            db.App3_EMPLOYEEs.Attach(employee);
            db.SubmitChanges();
            return employee.PKEY;
        }

    Or do I have to do this?

        public int updateEmployee(App3_EMPLOYEE employee)
        {
            DBContextDataContext db = new DBContextDataContext();
            App3_EMPLOYEE emp = db.App3_EMPLOYEEs.Single(e => e.PKEY == employee.PKEY);
            db.App3_EMPLOYEEs.Attach(employee, emp);
            db.SubmitChanges();
            return employee.PKEY;
        }

    But I don't want to use the second option; is there any efficient way to update the data?
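
    A sketch of a middle ground that skips the extra SELECT (it assumes the App3_EMPLOYEE mapping either has a timestamp/version column or sets UpdateCheck=Never on its columns; without one of those, attaching a modified entity fails):

        public int updateEmployee(App3_EMPLOYEE employee)
        {
            using (DBContextDataContext db = new DBContextDataContext())
            {
                // true = attach as modified, so no original copy has to be fetched
                db.App3_EMPLOYEEs.Attach(employee, true);
                db.SubmitChanges();
                return employee.PKEY;
            }
        }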

    Read the article

  • One-to-many relationship related to many tables

    - by Andrey
    I have a scenario where there are two (or more) tables that represent independent items, let's say Users and Companies. Both of these tables need addresses stored, and each one can have one or more addresses. In a normal one-to-many scenario the Addresses table would just have a UserId or a CompanyId, creating a normal one-to-many relationship. In this case I have a few approaches I can think of:

        1. The Addresses table could have both a UserId and a CompanyId, and only one would be used for each record.
        2. Two keys could be used, ObjectId and ObjectType, so ObjectId would hold a UserId or a CompanyId, and ObjectType would be User or Company.
        3. Create an Object table and add ObjectId to Users and Companies; Addresses would then have an ObjectId.

    I do not really like any of these solutions, so I am wondering what the best approach is here. On another note, I will most likely use LINQ to SQL for my data access layer.
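
    A fourth option that keeps a real foreign key on every column (table names are illustrative): one shared Addresses table plus a small junction table per owning entity.

        CREATE TABLE Addresses (
            AddressId INT IDENTITY(1,1) PRIMARY KEY,
            Street    VARCHAR(100),
            City      VARCHAR(50)
        );

        CREATE TABLE UserAddresses (
            UserId    INT NOT NULL REFERENCES Users(UserId),
            AddressId INT NOT NULL REFERENCES Addresses(AddressId),
            PRIMARY KEY (UserId, AddressId)
        );

        CREATE TABLE CompanyAddresses (
            CompanyId INT NOT NULL REFERENCES Companies(CompanyId),
            AddressId INT NOT NULL REFERENCES Addresses(AddressId),
            PRIMARY KEY (CompanyId, AddressId)
        );

    Unlike the ObjectId/ObjectType approach, the database can enforce every reference here, and LINQ to SQL maps plain junction tables without any special handling.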

    Read the article

  • get foreign key objects in a single query - Django

    - by John
    Hi, I have 2 models in my django code:

        class ModelA(models.Model):
            name = models.CharField(max_length=255)
            description = models.CharField(max_length=255)
            created_by = models.ForeignKey(User)

        class ModelB(models.Model):
            category = models.CharField(max_length=255)
            modela_link = models.ForeignKey(ModelA, 'modelb_link')
            functions = models.CharField(max_length=255)
            created_by = models.ForeignKey(User)

    Say ModelA has 100 records, all of which may or may not have links to ModelB. Now say I want to get a list of every ModelA record along with the data from ModelB. I would do:

        list_a = ModelA.objects.all()

    Then to get the data for ModelB I would have to do:

        for i in list_a:
            i.additional_data = i.modelb_link.all()

    However, this runs a query for every instance of i, making 101 queries in total. Is there any way of running this all in just one query, or at least fewer than 101 queries? I've tried ModelA.objects.select_related().all(), but this didn't seem to have any effect. Thanks
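
    select_related() only follows forward foreign keys, which is why it has no effect on this reverse relation. A sketch assuming Django 1.4 or later (and that 'modelb_link' really is the reverse accessor, as used in the question): prefetch_related does the whole thing in two queries.

        list_a = ModelA.objects.prefetch_related('modelb_link').all()   # 2 queries total

        for a in list_a:
            a.additional_data = a.modelb_link.all()   # served from the prefetch cache, no extra query

    On older Django versions the same effect needs a manual version: fetch all relevant ModelB rows in one query, group them by their ModelA id in Python, and attach each group to its ModelA instance.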

    Read the article

  • How to set the pointer to the current row in the Datagridview in C#

    - by user286546
    I have a DataGridView on a Windows Form in C#. I am updating and filling the table adapter on a cell event. The problem is that when I go to the lower half of the datagrid, which is not visible until I scroll down, and click any cell, the table adapter is filled and updated and the pointer jumps back to the very first row. Any suggestion on how to fix it? My idea is to record the top visible row of the DataGridView and set the pointer back to that row, but how do I do it?
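
    A sketch with assumed grid, adapter and dataset names: remember the scroll position and current cell before re-filling, then restore them afterwards.

        int topRow = dataGridView1.FirstDisplayedScrollingRowIndex;
        int curRow = dataGridView1.CurrentCell != null ? dataGridView1.CurrentCell.RowIndex : -1;
        int curCol = dataGridView1.CurrentCell != null ? dataGridView1.CurrentCell.ColumnIndex : 0;

        myTableAdapter.Update(myDataSet.MyTable);   // hypothetical adapter/dataset names
        myTableAdapter.Fill(myDataSet.MyTable);

        if (topRow >= 0 && topRow < dataGridView1.RowCount)
            dataGridView1.FirstDisplayedScrollingRowIndex = topRow;
        if (curRow >= 0 && curRow < dataGridView1.RowCount)
            dataGridView1.CurrentCell = dataGridView1[curCol, curRow];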

    Read the article

  • Elegant way to add functionality to previously defined functions

    - by Bastiaan
    How do I combine two functions together? I have a class controlling some hardware:

        class Heater():
            def set_power(self, dutycycle, period):
                ...
            def turn_on(self):
                ...
            def turn_off(self):
                ...

    And a class that connects to a database and handles all data logging functionality for an experiment:

        class DataLogger():
            def __init__(self):
                # Record measurements and controls in a database
            def start(self, t):
                # Starts a new thread to acquire and record measurements every t seconds

    Now, in my program recipe.py I want to do something like:

        log = DataLogger()

        @DataLogger_decorator
        H1 = Heater()

        log.start(60)
        H1.set_power(10, 100)
        H1.turn_on()
        sleep(10)
        H1.turn_off()
        etc

    where all actions on H1 are recorded by the data logger. I can change any of the classes involved; I'm just looking for an elegant way to do this. Ideally the hardware functions remain separate from the database and DataLogger functions, and ideally the DataLogger is reusable for other controls and measurements.
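
    One sketch that keeps the hardware class untouched (the proxy class name and the DataLogger.log_call method are made up): wrap the Heater in a generic proxy that reports every method call to the logger.

        import functools

        class LoggedProxy(object):
            """Wrap any object and report each method call to a logger."""

            def __init__(self, target, logger):
                self._target = target
                self._logger = logger

            def __getattr__(self, name):
                attr = getattr(self._target, name)
                if not callable(attr):
                    return attr

                @functools.wraps(attr)
                def wrapper(*args, **kwargs):
                    self._logger.log_call(name, args, kwargs)   # assumed DataLogger method
                    return attr(*args, **kwargs)
                return wrapper

        # usage in recipe.py: every call on H1 is recorded before it runs
        H1 = LoggedProxy(Heater(), log)
        H1.set_power(10, 100)
        H1.turn_on()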

    Read the article

  • delete all records except the id I have in a python list

    - by jay_t
    Hi all, I want to delete all records in a mysql db except the record ids I have in a list. The length of that list can vary and could easily contain 2000+ ids. Currently I convert my list to a string so it fits in something like this:

        cursor.execute("""delete from table where id not in (%s)""", (list))

    which doesn't feel right, and I have no idea how long the list is allowed to be... What's the most efficient way of doing this from Python? Altering the structure of the table with an extra field to mark/unmark records for deletion would be great but is not an option. Having a dedicated table storing the ids would indeed be helpful, since this could then be done with a single SQL query... but I would really like to avoid these options if possible. Thanks
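
    A minimal sketch (MySQLdb-style %s placeholders assumed): build one placeholder per id so the values are passed as parameters rather than pasted into the SQL string. MySQL copes with IN lists of a few thousand items; the practical ceiling is the max_allowed_packet size of the statement.

        def delete_all_except(cursor, table, keep_ids):
            if not keep_ids:
                return                      # refusing to delete everything is probably safer
            placeholders = ", ".join(["%s"] * len(keep_ids))
            # The table name cannot be a bound parameter, so it is interpolated here;
            # only do that with a trusted, hard-coded name.
            sql = "DELETE FROM %s WHERE id NOT IN (%s)" % (table, placeholders)
            cursor.execute(sql, tuple(keep_ids))

        delete_all_except(cursor, "mytable", ids_to_keep)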

    Read the article
