Search Results

Search found 17140 results on 686 pages for 'records management'.

Page 276 of 686

  • Get forum page by PostID

    - by cem
    I can't figure out how this works: how does a forum get the page number (and the records on that page) from a post id? My first thought is to declare an index/int column in the post table and increase or decrease it when posts are added or deleted, but what happens when I delete the first row and the table has one million records? Do you have any ideas about this? By the way, I'm using NHibernate and SQL Server 2005. Thank you
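
    A hedged sketch of one way this is often done without storing a per-post index column, assuming SQL Server 2005's ROW_NUMBER() over hypothetical Posts(PostId, TopicId, CreatedAt) columns and an assumed page size of 20; the position is computed at query time, so deletes never leave stale numbering:

        -- Hypothetical schema and page size; adjust names to the real post table.
        DECLARE @PostId int, @PageSize int
        SET @PostId = 12345
        SET @PageSize = 20

        SELECT (RowNum - 1) / @PageSize + 1 AS PageNumber
        FROM (
            SELECT PostId,
                   ROW_NUMBER() OVER (ORDER BY CreatedAt) AS RowNum
            FROM Posts
            WHERE TopicId = (SELECT TopicId FROM Posts WHERE PostId = @PostId)
        ) AS numbered
        WHERE PostId = @PostId

    NHibernate can run this as a native SQL query; the same ordering must be used when fetching the page of records themselves.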

    Read the article

  • SSIS - How do I see/set the field types in a Recordset?

    - by thursdaysgeek
    I'm looking at an inherited SSIS package, and a stored procedure is sending records to a recordset called USER:NEW_RECORDS. It's of type Object, and the value is System.Object. It is then used to insert that data into a SQL table. We're getting an error because it seems the numeric results of the stored procedure are being put into a DT_WSTR field, and then failing when they are put into a decimal field in the database. Most of the records work, but one, which happens to have more decimal digits, is failing. I want to see exactly what my SSIS recordset field types are, and probably change them, so I can force the data to be truncated properly and copied. Or perhaps I'm not even looking at this correctly. The data is put into the recordset using a SQL Task that executes the stored procedure.

    Read the article

  • Is there a Drupal module for Forms with powerful CRUD style behaviour?

    - by Petras
    We are building a website that contains a large number of database tables that need to be edited by the CMS administrator. Some of the tables are fed by form submissions from users on the front end of the website. Some of the tables are purely in the CMS and are used to create custom modules on the front end of the website. Although there is a forms module in Drupal, I think our requirements cannot be met by it. Does anyone know of a system that allows forms to be saved to a CRUD-style database with the following features: export of all database fields; a summary view of the records in a filterable table, with paging; and one-to-many relationships between records? To code this manually for 10 forms is A LOT of work, particularly the one-to-many relationships. If there is a powerful module available it would save us writing one.

    Read the article

  • How to add rows in the middle of a table with jQuery?

    - by understack
    I have a table of customer names along with the products they purchased and their prices, so there are multiple records for each customer. It's a simple 3-column table: name, product and price. What I want to do is group all records belonging to one customer together (I've done that) and, just after those rows, add one extra row showing the total price of all the products that customer purchased. This row would have empty cells in the name and product columns and the total in the price column.

    Read the article

  • SQL searching table fields with LIKE

    - by Tom Gullen
    Given the data stored somewhere in a database:

        Hello my name is Tom I like dinosaurs to talk about SQL. SQL is amazing. I really like SQL.

    We want to implement a site search, allowing visitors to enter terms and return related records. A user might search for:

        Dinosaurs

    And the SQL:

        WHERE articleBody LIKE '%Dinosaurs%'

    copes fine with returning the correct set of records. How would we cope, however, if a user misspells dinosaurs? E.g. Dinosores (poor sore dino). How can we search allowing for errors in spelling? We can associate common misspellings we see in searches with the correct spelling, and then search on the original term plus the corrected term, but this is time-consuming to maintain. Is there any way to do it programmatically?
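
    One programmatic option, sketched here under the assumption of SQL Server and a hypothetical ArticleKeywords(ArticleId, Keyword) table populated from each article's body, is to match search terms phonetically with SOUNDEX/DIFFERENCE instead of maintaining a misspelling list:

        -- DIFFERENCE compares SOUNDEX codes and returns 0-4 (4 = closest match).
        -- Articles and ArticleKeywords are illustrative names, not the real schema.
        DECLARE @Term varchar(100)
        SET @Term = 'Dinosores'

        SELECT DISTINCT a.ArticleId
        FROM Articles a
        INNER JOIN ArticleKeywords k ON k.ArticleId = a.ArticleId
        WHERE k.Keyword LIKE '%' + @Term + '%'
           OR DIFFERENCE(k.Keyword, @Term) >= 3

    Full-text search (CONTAINS/FREETEXT) handles word forms but not misspellings, so a phonetic comparison against a keyword table is a common compromise.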

    Read the article

  • Any simple approaches for managing customer data change requests for global reference files?

    - by Kelly Duke
    For the first time, I am developing in an environment in which there is a central repository for a number of different industry standard reference data tables and many different customers who need to select records from these industry standard reference data tables to fill in foreign key information for their customer specific records. Because these industry standard reference files are utilized by all customers, I want to reserve Create/Update/Delete access to these records for global product administrators. However, I would like to implement a (semi-)automated interface by which specific customers could request record additions, deletions or modifications to any of the industry standard reference files that are shared among all customers. I know I need something like a "data change request" table specifying: user id, user request datetime, request type (insert, modify, delete), a user entered text explanation of the change request, the user request's current status (pending, declined, completed), admin resolution datetime, admin id, an admin entered text description of the resolution, etc. What I can't figure out is how to elegantly handle the fact that these data change requests could apply to dozens of different tables with differing table column definitions. I would like to give the customer users making these data change requests a convenient way to enter their proposed record additions/modifications directly into CRUD screens that look very much like the reference table CRUD screens they don't have write/delete permissions for (with an additional text explanation and perhaps request priority field). I would also like to give the global admins a tool that allows them to view all the outstanding data change requests for the users they oversee sorted by date requested or user/date requested. Upon selecting a data change request record off the list, the admin would be directed to another CRUD screen that would be populated with the fields the customer users requested for the new/modified industry standard reference table record along with customer's text explanation, the request status and the text resolution explanation field. At this point the admin could accept/edit/reject the requested change and if accepted the affected industry standard reference file would be automatically updated with the appropriate fields and the data change request record's status, text resolution explanation and resolution datetime would all also be appropriately updated. However, I want to keep the actual production reference tables as simple as possible and free from these extraneous and typically null customer change request fields. I'd also like the data change request file to aggregate all data change requests across all the reference tables yet somehow "point to" the specific reference table and primary key in question for modification & deletion requests or the specific reference table and associated customer user entered field values in question for record creation requests. Does anybody have any ideas of how to design something like this effectively? Is there a cleaner, simpler way I am missing? Thank you so much for reading.
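
    One rough way to keep the production reference tables clean while still aggregating requests across tables is a generic request header plus a per-column child table. This is only a sketch in SQL Server syntax with illustrative names, not a prescribed design:

        CREATE TABLE DataChangeRequest (
            RequestId       int IDENTITY(1,1) PRIMARY KEY,
            RequestedBy     int            NOT NULL,              -- customer user id
            RequestedAt     datetime       NOT NULL DEFAULT GETDATE(),
            RequestType     char(1)        NOT NULL,              -- 'I' insert, 'U' update, 'D' delete
            TargetTable     sysname        NOT NULL,              -- which reference table is affected
            TargetKey       int            NULL,                  -- PK of the existing row for update/delete requests
            RequestReason   nvarchar(max)  NOT NULL,
            Status          char(1)        NOT NULL DEFAULT 'P',  -- 'P'ending, 'D'eclined, 'C'ompleted
            ResolvedBy      int            NULL,
            ResolvedAt      datetime       NULL,
            ResolutionNotes nvarchar(max)  NULL
        );

        -- Proposed field values live in a child table, one row per column, so the header
        -- table stays generic across reference tables with differing column lists.
        CREATE TABLE DataChangeRequestField (
            RequestId     int           NOT NULL REFERENCES DataChangeRequest(RequestId),
            ColumnName    sysname       NOT NULL,
            ProposedValue nvarchar(max) NULL,
            PRIMARY KEY (RequestId, ColumnName)
        );

    The admin screen can pivot the RequestField rows back into the target table's columns; for update/delete requests TargetKey points at the existing row, and for inserts it stays NULL.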

    Read the article

  • Fetching rows from a MySQL table and displaying them as jGrowl notifications

    - by jeansymolanza
    Hey guys, I'm having problems hooking my PHP/MySQL code up to jGrowl. If you're familiar with jGrowl, you'll know it delivers notifications like Growl does for OS X. I'm trying to get it to read all the records from my table, but at the moment it only displays one record as a notification and loops through it 4 times. Another problem is that if I have 5 rows in the table, only 4 notifications are displayed. How do I get it to show all the records in the table as notifications, display the total number of records (5), and account for the missing one? Thanking you guys in advance... God bless

        <script type="text/javascript">
        // In case you don't have firebug...
        if (!window.console || !console.firebug) {
            var names = ["log", "debug", "info", "warn", "error", "assert", "dir", "dirxml",
                         "group", "groupEnd", "time", "timeEnd", "count", "trace", "profile", "profileEnd"];
            window.console = {};
            for (var i = 0; i < names.length; ++i)
                window.console[names[i]] = function() {};
        }

        (function($){
            $(document).ready(function(){
                // This specifies how many messages can be pooled out at any given time.
                // If there are more notifications raised than the pool, the others are
                // placed into a queue and rendered after the others have disappeared.
                $.jGrowl.defaults.pool = 5;

                var i = 1;
                var y = 1;
                setInterval(function() {
                    if (i < <?php echo $totalRows_comment; ?>) {
                        <?php echo '$.jGrowl("'.$row_comment['comment'].'",'; ?> {
                            sticky: true,
                            log: function() {
                                console.log("Creating message " + i + "...");
                            },
                            beforeOpen: function() {
                                console.log("Rendering message " + y + "...");
                                y++;
                            }
                        });
                    }
                    i++;
                }, 1000);
            });
        })(jQuery);
        </script>

    Read the article

  • LINQ to SQL: combine .Any expressions

    - by Victor
    I need to filter parent records by a property value of a child collection. I am doing something like this:

        var results = (from c in db.Customers
                       where c.Orders.Any(o => o.Status == (int)Status.Ordered)
                       select c);

    That works fine, but now I need to filter by 2 values, i.e. take all parent records that have any child records with BOTH values:

        var results = (from c in db.Customers
                       where c.Orders.Any(o => o.Status == (int)Status.Ordered)
                             && (o.Status == (int)Status.Shipped)
                       select c);

    Trying something obvious like this doesn't work.

    Read the article

  • Why is Tokyo Tyrant so slow?

    - by Tantra
    I have the following situation: a Tyrant server launched on a FreeBSD host like this:

        ttserver -uas -log /data/tyrant/1.log -sid 1 -thnum 8 -tout 5 /data/tyrant/data/1.tct

    And I try to talk to this server from Windows using Python and pyrant-0.3.5, like this:

        import pyrant
        import time

        t = pyrant.Tyrant(host="192.168.0.220", port=1978)
        tbegin = time.time()
        for i in xrange(4000000):
            if i and ((i % 10000) == 0):
                print time.time() - tbegin
                tbegin = time.time()
            t[i] = {"text": "ruslan text", "value": i}

    Performance is, I think, very slow - about 5-6 seconds per 10,000 records. But if I run the same code on the same machine as the server (ttserver), performance is good - about 0.5 sec per 10,000 records. What can I do to work around this problem?

    Read the article

  • Creating items in groups in Rails

    - by LearnRails
    My product table is:

        id  type   price  location
        1   chips  $3     aisle3

    I have a question about adding products in groups. There is a non-model quantity field where the user can enter a quantity. While adding a new product, if the user enters type: soda and quantity: 3, then 3 records should be created in the product model with type = soda, like the following:

        id  type
        2   soda
        3   soda
        4   soda

    If the user enters location: aisle4 and quantity: 2, then:

        id  location
        5   aisle4
        6   aisle4

    Can you tell me how to pass the non-model field 'quantity' to Rails (model or controller) and how to use it to add the products in groups as described above? Or should I create a column called quantity in my product table? Will the history be updated for all these new records by the after_create filter I already have? Is there a good tutorial or book that shows how to pass such non-model HTML/JavaScript fields from the view to Rails and back to the view? Any help will be greatly appreciated. Thanks

    Read the article

  • ASP.NET MVC: making a delete user control and passing information on and off

    - by mazhar kaunain baig
    I am creating a generalized delete user control. My aim is that on the listing page, where all the records are listed, when delete is pressed I want to display an acknowledgment on the same page above the list. I have little idea how to do that. Q1) First of all, where should I place my delete user control (in the Shared folder?). Q2) How do I toggle the delete user control on and off when delete is pressed, since the acknowledgment will not be there all the time? I don't want to pass any data in the query string. Q3) How would I pass the record list's id and list name to the general delete user control, since it would be the same for all the listings?

    Read the article

  • Monitor what users are doing in a .NET application and trigger application functionality changes

    - by Jamie Clayton
    I need some suggestions for how to implement a very basic mechanism that logs what multiple users are doing in an application. When another feature is running, I then need to change the application to restrict functionality. Use case example: a user can normally edit unpaid records. If the application then runs a payrun process (long-running), I need to change parts of the application to restrict functionality for a short period of time (e.g. make existing unpaid records read-only). Any suggestions on how I can do this in a .NET application?

    Read the article

  • SaaS Architecture Question from Newbie

    - by user226767
    I have developed a number of departmental client-server applications, and am now ready to begin working on moving one of these applications to a SaaS model. I have done some basic web development, but I'm a newbie when it comes to SaaS architectures. One of the first questions that comes to mind as I try to design the architecture is the question of single vs. multi tenancy. The pros and cons of each vary significantly depending on the type of application and scale required, so I'd like to describe my application and scale needs below, and hope others can comment on how I should get started with the architecture. The client-server application currently consists of a Firebird database and a Windows application. The database contains about 20 tables containing a few thousand records in 4 primary tables, and a few hundred records in various lookup and related tables. Although the number of records is small, the size can get large, as the database can contain large BLOBS. Each customer sets up their own database and has a handful of users within the organization connected to it. When I update the db schema, a new windows application is released, and it checks the db schema and then applies the updates as needed. For the SaaS application, I am designing for 100's (not 1000's or millions) of new customers per year. My first thought was to go with a multi tenancy model to make updates easy (shut down apply the updates to one database, and then start up). On the other hand, a single tenancy model would provide a means to roll updates out to a group of customers at a time, and spread the risk of data corruption - i.e. if something goes wrong with a database, it will impact one customer instead of all customers. With this idea, I was thinking of having a single web front-end which would connect to a single customer database upon login. Thus, when a new customer creates an account, a new database would be created (each customer would have their own db with multiple users as needed for the customer). In this model, a db update would require either a process to go through each db to apply schema changes, or a trigger upon logging in to initiate a schema update similar to the client-server model currently in use. Can anyone point me to information for similar applications which have been ported from client-server to SaaS? Or provide any pointers to consider? Basically I'm looking for architecture examples of taking a departmental application and making it available as a self service website for multiple customers. Thanks for any suggestions, resources, etc.

    Read the article

  • Recreation of MySQL DB using "mysql mydb < mydb.sql" is really slow when the table has tens of millions of records

    - by Jian Lin
    It seems that a MySQL database with a table holding tens of millions of records gets one very large INSERT INTO statement when mysqldump some_db > some_db.sql is run to back up the database (is it a single INSERT statement that handles all the records?). So when reconstructing the DB using mysql some_db < some_db.sql, the CPU is hardly busy (about 1.8% usage by the mysql process... I don't see a mysqld either?) and the hard disk doesn't seem to be too busy either... Last time, the whole restore process took 5 hours. Is there a way to make it faster? For example, when doing mysqldump, can it break the INSERT statement into shorter ones, so that mysql doesn't have to parse such long lines when restoring the DB?
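
    As a hedged note: mysqldump's default --extended-insert batches many rows into each INSERT (--skip-extended-insert writes one row per INSERT, which is usually slower to restore, and --net_buffer_length caps the batch size). Slow restores of large InnoDB tables are often dominated by per-statement commits and key checks, so one common approach is to wrap the import in session settings like the sketch below (illustrative only; adjust the dump file name):

        -- Run in the mysql client session used for the restore; SOURCE is a mysql
        -- client command, not server SQL.
        SET autocommit = 0;
        SET unique_checks = 0;
        SET foreign_key_checks = 0;

        SOURCE some_db.sql;

        COMMIT;
        SET foreign_key_checks = 1;
        SET unique_checks = 1;
        SET autocommit = 1;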

    Read the article

  • How to get all visible markers on current zoom level

    - by nefo_x
    Here are some points: 1) I have markers on the map and records associated with them on a panel to the right of the map. They are connected via a numeric id, which is stored as a property of each marker. 2) All the markers are stored in an array. 3) When the user zooms the map, only the records associated with visible markers should be shown on the right panel. So, how do I get the list of all markers visible at the current zoom level? I've searched the internet and didn't find anything useful. Something like what I'm trying to achieve can be found here

    Read the article

  • How to perform a many-to-many LINQ query with Include in the EF

    - by despart
    Hi, I don't know how to perform this query using LINQ and the EF. Imagine I have three tables A, B and C. A and B have a many-to-many relationship. B and C have a 1-to-many relationship. I want to obtain records from B, including C, but filtering by A's Id. I can easily get the records from B:

        var b = Context.A.Where(x => x.Id.Equals(aId)).SelectMany(x => x.B);

    but when I try to include C I don't know how to do it:

        // This doesn't work
        var b = Context.A.Where(x => x.Id.Equals(aId)).SelectMany(x => x.B.Include("C"));

    I've also tried this with no luck (it is equivalent to the above):

        // Not working
        var b = (from a in Context.A.Where(x => x.Id.Equals(aId))
                 from b in a.B.Include("C")
                 select b);

    Thanks for your help.

    Read the article

  • Why won't my anonymous function fire on grid.prerender?

    - by adam0101
    In my GridView I have fields in the footer for inserting a new record. In my ObjectDataSource's Selecting event, if no records came back I bind a single mock row to force the footer to show so users can still add records. Since the row does not contain real data, I hide it.

        ...
        If result.ItemCount = 0 Then
            result = mockRow
            AddHandler mygridview.PreRender, AddressOf HideRow
        End If
        End Sub

        Private Sub HideRow(ByVal sender As Object, ByVal e As EventArgs)
            mygridview.Rows(0).Visible = False
        End Sub

    This works fine. However, I'd like to condense it like this:

        ...
        If result.ItemCount = 0 Then
            result = mockRow
            AddHandler mygridview.PreRender, Function() mygridview.Rows(0).Visible = False
        End If
        End Sub

    This compiles fine, but the row doesn't get hidden. Can anyone tell me why my anonymous function isn't getting hit?

    Read the article

  • SQL Query to delete oldest rows over a certain row count?

    - by Casey
    I have a table that contains log entries for a program I'm writing. I'm looking for ideas on an SQL query (I'm using SQL Server Express 2005) that will keep the newest X number of records and delete the rest. I have a datetime column that is a timestamp for the log entry. I figure something like the following would work, but I'm not sure of the performance of the IN clause for larger numbers of records. Performance isn't critical, but I might as well do the best I can the first time.

        DELETE FROM MyTable
        WHERE PrimaryKey NOT IN (SELECT TOP 10000 PrimaryKey
                                 FROM MyTable
                                 ORDER BY TimeStamp DESC)

    Read the article

  • SQL Paging/Sorting - Efficient method with dynamic sort?

    - by dmose
    I'm trying to implement this style of paging: http://www.4guysfromrolla.com/webtech/042606-1.shtml

        CREATE PROCEDURE [dbo].[usp_PageResults_NAI]
        (
            @startRowIndex int,
            @maximumRows int
        )
        AS

        DECLARE @first_id int, @startRow int

        -- A check can be added to make sure @startRowIndex isn't > count(1)
        -- from employees before doing any actual work unless it is guaranteed
        -- the caller won't do that

        -- Get the first employeeID for our page of records
        SET ROWCOUNT @startRowIndex
        SELECT @first_id = employeeID FROM employees ORDER BY employeeid

        -- Now, set the row count to MaximumRows and get
        -- all records >= @first_id
        SET ROWCOUNT @maximumRows

        SELECT e.*, d.name as DepartmentName
        FROM employees e
             INNER JOIN Departments D ON e.DepartmentID = d.DepartmentID
        WHERE employeeid >= @first_id
        ORDER BY e.EmployeeID

        SET ROWCOUNT 0
        GO

    This method works great; however, is it possible to use this method and have dynamic field sorting? If we change the final query to

        SELECT e.*, d.name as DepartmentName
        FROM employees e
             INNER JOIN Departments D ON e.DepartmentID = d.DepartmentID
        WHERE employeeid >= @first_id
        ORDER BY e.FirstName DESC

    it breaks the sorting... Is there any way to combine this method of paging with the ability to sort on different fields?
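
    A hedged sketch of an alternative that supports dynamic sorting on SQL Server 2005+, using ROW_NUMBER() instead of SET ROWCOUNT; the CASE picks the sort column (add one branch per sortable field), and the @sortColumn parameter is an assumption added for the sketch, not part of the original procedure:

        DECLARE @startRowIndex int, @maximumRows int, @sortColumn varchar(20)
        SET @startRowIndex = 1
        SET @maximumRows = 25
        SET @sortColumn = 'FirstName'

        SELECT *
        FROM (
            SELECT e.*, d.name AS DepartmentName,
                   ROW_NUMBER() OVER (
                       ORDER BY CASE WHEN @sortColumn = 'FirstName' THEN e.FirstName END DESC,
                                e.EmployeeID
                   ) AS RowNum
            FROM employees e
                 INNER JOIN Departments d ON e.DepartmentID = d.DepartmentID
        ) AS paged
        WHERE RowNum BETWEEN @startRowIndex AND @startRowIndex + @maximumRows - 1
        ORDER BY RowNum

    Because the row numbers are assigned in the same order the page is returned, the WHERE employeeid >= @first_id trick is no longer needed and the sort column can vary per call.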

    Read the article

  • Correct Way to Get Date Between Dates In SQL Server

    - by Chuck Haines
    I have a table in SQL Server which has a DATETIME field called Date_Printed. I am trying to get all records in the table which lie within a specified date range. Currently I am using the following SQL:

        DECLARE @StartDate DATETIME
        DECLARE @EndDate DATETIME
        SET @StartDate = '2010-01-01'
        SET @EndDate = '2010-06-18 12:59:59 PM'

        SELECT * FROM table WHERE Date_Printed BETWEEN @StartDate AND @EndDate

    I have an index on the Date_Printed column. I was wondering if this is the best way to get the rows in the table which lie between those dates, or if there is a faster way. The table has about 750,000 records in it right now and it will continue to grow. The query is pretty fast, but I'd like to make it faster if possible.

    Read the article

  • Subprocess fails to catch the standard output

    - by user343934
    I am trying to generate a tree from a FASTA input file and an alignment with MuscleCommandline:

        import sys, os, subprocess
        from Bio import AlignIO
        from Bio.Align.Applications import MuscleCommandline

        cline = MuscleCommandline(input="c:\Python26\opuntia.fasta")
        child = subprocess.Popen(str(cline),
                                 stdout=subprocess.PIPE,
                                 stderr=subprocess.PIPE,
                                 shell=(sys.platform != "win32"))
        align = AlignIO.read(child.stdout, "fasta")
        outfile = open('c:\Python26\opuntia.phy', 'w')
        AlignIO.write([align], outfile, 'phylip')
        outfile.close()

    I always encounter this problem:

        Traceback (most recent call last):
          File "<string>", line 244, in run_nodebug
          File "C:\Python26\muscleIO.py", line 11, in <module>
            align=AlignIO.read(child.stdout,"fasta")
          File "C:\Python26\Lib\site-packages\Bio\AlignIO\__init__.py", line 423, in read
            raise ValueError("No records found in handle")
        ValueError: No records found in handle

    Read the article

  • MySQL Export with Column Headings

    - by st4nt0n
    Hello - I am very, very new to MySQL. I've got general technical experience, but not with the syntax or concepts of MySQL. I have been tasked with exporting a table from MySQL into a pipe-delimited .txt or .xls that I can use to manually add 7500 more records to, then import back into the table. I tried to use INTO OUTFILE, but I don't get column headings, which I need as a reference for merging the new records. Is there a good resource that can explain this to a complete novice? I would usually go down to my bookstore and start learning, but I'm on a bit of a time crunch. Thanks all!
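
    A hedged sketch of the usual workaround: UNION a literal row of column names onto the data so the OUTFILE starts with a header line. The table, column and file names here are placeholders for the real ones:

        SELECT 'id', 'name', 'email'
        UNION ALL
        SELECT id, name, email
        FROM customers
        INTO OUTFILE '/tmp/customers.txt'
        FIELDS TERMINATED BY '|'
        LINES TERMINATED BY '\n';

    Keeping the same column order makes re-importing with LOAD DATA INFILE straightforward; the header line can be skipped with IGNORE 1 LINES.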

    Read the article

  • How to suppress a group header based on running-count results?

    - by Sarah
    Hi, I'm trying to suppress a group header when there are no detail results in another, following group. I have added a manual running count total which shows the correct numbers (such as 0 when no records appear on the report). I've taken this approach since I have various items suppressed within the detail section and don't want them counted. In the header I'm trying to say: don't show the header if there are no corresponding records showing in the detail section. But it's not working. When I say suppress if the display count is 0, it suppresses all of the headers instead of just the ones that should be hidden. How can I fix this? Thanks... Sarah

    Read the article

  • What are some best practices and "rules of thumb" for creating database indexes?

    - by Ash
    I have an app which cycles through a huge number of records in a database table and performs a number of SQL and .NET operations on records within that database (currently I am using Castle.ActiveRecord on PostgreSQL). I added some basic btree indexes on a couple of the fields and, as you would expect, the performance of the SQL operations increased substantially. Wanting to make the most of DBMS performance, I want to make some better educated choices about what I should index on all my projects. I understand that there is a detriment to performance when doing inserts (as the database needs to update the index as well as the data), but what suggestions and best practices should I consider when creating database indexes? How do I best select the fields/combination of fields for a set of database indexes (rules of thumb)? Also, how do I best select which index to use as a clustered index? And when it comes to the access method, under what conditions should I use a btree over a hash or a GiST or a GIN index (what are they, anyway)?
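
    For flavour, a few illustrative PostgreSQL index definitions of the kinds being asked about; the table and column names are made up for the sketch, and note that PostgreSQL offers a one-off CLUSTER command rather than a SQL Server-style maintained clustered index:

        -- Composite btree: put the column used in equality filters first,
        -- then the column used for range filters or sorting.
        CREATE INDEX idx_orders_customer_created ON orders (customer_id, created_at);

        -- Partial index: only covers the rows a hot query touches, so it stays small
        -- and cheap to maintain on insert.
        CREATE INDEX idx_orders_open ON orders (created_at) WHERE status = 'open';

        -- GIN index: suited to "contains" lookups on arrays or full-text tsvector columns;
        -- GiST covers geometric and other extensible types.
        CREATE INDEX idx_docs_tsv ON documents USING gin (tsv);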

    Read the article

  • Build an unbound form in Access 2007

    - by DoubleJ92
    I have an Access application that has a form that allows the user to enter case notes. The main field of this form is tied to a SQL Server varchar(MAX) field in the source table. Since the users switched to Access 2007, the program keeps crashing when they are on the case notes form. As a possible solution, I would like to try unbinding this form and rebuilding it as an unbound form. This form needs to be able to add and update records in my SQL Server database. It also needs to be able to browse between records. I guess I am at a loss as to where to start. Any suggestions/code snippets are appreciated.

    Read the article
