Search Results

Search found 6399 results on 256 pages for 'a record'.


  • How does an object of a sub-class record information about its super-class in virtual inheritance?

    - by Summer_More_More_Tea
    Hi there: I encountered this problem while tackling virtual inheritance. I remember that in a non-virtual inheritance hierarchy, an object of a sub-class holds a subobject of its direct super-class. What about virtual inheritance? In this situation, does an object of the sub-class hold the super-class subobject directly, or just a pointer to it? By the way, why is the output of the following code:

        sizeof(A): 8
        sizeof(B): 20
        sizeof(C): 32

    Code:

        #include <iostream>
        using namespace std;

        class A {
            char k[3];
        public:
            virtual void a() {}
        };

        class B : public virtual A {
            char j[3];
        public:
            virtual void b() {}
        };

        class C : public virtual B {
            char i[3];
        public:
            virtual void c() {}
        };

        int main(int argc, char *argv[]) {
            cout << "sizeof(A): " << sizeof(A) << endl;
            cout << "sizeof(B): " << sizeof(B) << endl;
            cout << "sizeof(C): " << sizeof(C) << endl;
            return 0;
        }

    Thanks in advance. Kind regards.
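    A hedged way to see the effect (not from the original post; the struct names are hypothetical and the exact numbers are implementation-defined): compare a class that derives non-virtually with one that derives virtually from the same polymorphic base. The virtual variant is bigger because the derived object must carry the machinery (a vbptr, or a vbase offset in its vtable, depending on the ABI) needed to locate the shared base subobject at run time, rather than embedding it at a fixed offset known at compile time.

        #include <iostream>

        struct VBase                         { char k[3]; virtual void a() {} };
        struct PlainDerived :         VBase  { char j[3]; virtual void b() {} };
        struct VirtDerived  : virtual VBase  { char j[3]; virtual void b() {} };

        int main() {
            // On a typical 32-bit Itanium-ABI compiler the virtual version is larger,
            // because locating VBase from VirtDerived goes through the vtable.
            std::cout << "plain derived:   " << sizeof(PlainDerived) << '\n';
            std::cout << "virtual derived: " << sizeof(VirtDerived)  << '\n';
            return 0;
        }

    So the base subobject is still stored inside the complete object; what changes is that the derived part reaches it indirectly, which is where the extra bytes per level in the question's output come from.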

    Read the article

  • How do I select a random record efficiently in MySQL?

    - by user198729
    mysql> EXPLAIN SELECT * FROM urls ORDER BY RAND() LIMIT 1;
    +----+-------------+-------+------+---------------+------+---------+------+-------+---------------------------------+
    | id | select_type | table | type | possible_keys | key  | key_len | ref  | rows  | Extra                           |
    +----+-------------+-------+------+---------------+------+---------+------+-------+---------------------------------+
    |  1 | SIMPLE      | urls  | ALL  | NULL          | NULL | NULL    | NULL | 62228 | Using temporary; Using filesort |
    +----+-------------+-------+------+---------------+------+---------+------+-------+---------------------------------+

    The above doesn't qualify as efficient. How should I do it properly?
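    A common workaround (a hedged sketch, not from the original post; it assumes urls has an AUTO_INCREMENT primary key id without huge gaps) is to pick a random point in the id range and let the primary-key index find the row, instead of sorting the whole table:

        SELECT u.*
        FROM urls u
        JOIN (SELECT CEIL(RAND() * (SELECT MAX(id) FROM urls)) AS rid) r
          ON u.id >= r.rid
        ORDER BY u.id
        LIMIT 1;

    Rows that follow a gap in the id sequence are picked slightly more often; if uniformity matters more than speed, compute a random offset from COUNT(*) in the application and issue a plain LIMIT offset, 1 query instead.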

    Read the article

  • Setting a parent record's reference to null so that children are removed: how to?

    - by EugeneP
    How do I delete a child row (ON DELETE CASCADE?) when setting a null value on the parent? Here's the DB design:

        table A [id, b_id_1, b_id_2]
        table B [id, other fields...]

    b_id_1 and b_id_2 can be NULL. If either of them is null, it means there is NO B record for the corresponding FK (there are 2 of them), so (b_id_1, b_id_2) can be (null, null), (100, null), (null, 100_or_any_other_number), etc. How, in one SQL query, can I both set b_id_1 or b_id_2 to null and delete all rows from B that have this id? What FK design should be applied to the 2 tables? What foreign keys should be added? A - B (FK_1: A.b_id_1 references B.id, FK_2: A.b_id_2 references B.id) and also B - A (FK_3: B.id references A.b_id_1, FK_4: B.id references A.b_id_2)? But again, will setting A's b_id_1 or A's b_id_2 to null remove any of B's records? I don't think so. So how do I do that?
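    One direction that standard FK actions do support (a hedged sketch, not from the original post; table and column names are taken from the question): declare the two A-to-B foreign keys with ON DELETE SET NULL. A single DELETE on B then both removes the child row and nulls out whichever A column referenced it:

        ALTER TABLE A
          ADD CONSTRAINT fk_a_b1 FOREIGN KEY (b_id_1) REFERENCES B (id) ON DELETE SET NULL,
          ADD CONSTRAINT fk_a_b2 FOREIGN KEY (b_id_2) REFERENCES B (id) ON DELETE SET NULL;

        -- one statement: the B row disappears and A.b_id_1 / A.b_id_2 become NULL
        DELETE FROM B WHERE id = 100;

    The opposite direction (UPDATE A SET b_id_1 = NULL and have the database delete the now-orphaned B row) is not expressible with foreign-key actions alone; that needs a trigger or a second explicit DELETE in the same transaction. The FK_3/FK_4 keys in the other direction would force every B.id to also appear in both A columns, which is almost certainly not intended, so only the A-to-B keys should exist.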

    Read the article

  • How to fetch a random record from an SQLite database?

    - by Bruce
    I am working in PHP. I was working with MySQL before. Here is the code I used:

        $offset_result = mysqli_query($con, "SELECT FLOOR(RAND() * COUNT(*)) AS `offset` FROM students");
        $offset_row = mysqli_fetch_object($offset_result);
        $offset = $offset_row->offset;
        $result = mysqli_query($con, "SELECT name FROM students LIMIT $offset, 1");
        $row = mysqli_fetch_row($result);
        mysqli_free_result($result);

    What will be the corresponding set of statements for SQLite?
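    A hedged SQLite sketch (not from the original post): the simplest equivalent is to let SQLite order by its own random function, which, like the MySQL version, still scans the table:

        SELECT name FROM students ORDER BY RANDOM() LIMIT 1;

    The two-step offset approach also carries over almost verbatim; :offset below is a hypothetical bound parameter:

        -- step 1: pick a random offset in [0, COUNT(*))
        SELECT ABS(RANDOM()) % (SELECT COUNT(*) FROM students) AS offset;
        -- step 2: fetch one row at that offset
        SELECT name FROM students LIMIT 1 OFFSET :offset;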

    Read the article

  • How do I return the IDENTITY for an inserted record from a stored procedure?

    - by user54197
    I am adding data to my database, but would like to retrieve the UnitID that is auto-generated.

        using (SqlConnection connect = new SqlConnection(connections))
        {
            SqlCommand command = new SqlCommand("ContactInfo_Add", connect);
            command.Parameters.Add(new SqlParameter("name", name));
            command.Parameters.Add(new SqlParameter("address", address));
            command.Parameters.Add(new SqlParameter("Product", name));
            command.Parameters.Add(new SqlParameter("Quantity", address));
            command.Parameters.Add(new SqlParameter("DueDate", city));
            connect.Open();
            command.ExecuteNonQuery();
        }
        ...
        ALTER PROCEDURE [dbo].[Contact_Add]
            @name varchar(40),
            @address varchar(60),
            @Product varchar(40),
            @Quantity varchar(5),
            @DueDate datetime
        AS
        BEGIN
            SET NOCOUNT ON;
            INSERT INTO DBO.PERSON (Name, Address)
            VALUES (@name, @address)
            INSERT INTO DBO.PRODUCT_DATA (PersonID, Product, Quantity, DueDate)
            VALUES (@Product, @Quantity, @DueDate)
        END
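    A common pattern (a hedged sketch, not from the original post; it assumes PersonID is the IDENTITY column on PERSON and adds a hypothetical output parameter) is to capture SCOPE_IDENTITY() right after the first INSERT, reuse it for the child row, and hand it back to the caller:

        ALTER PROCEDURE [dbo].[Contact_Add]
            @name        varchar(40),
            @address     varchar(60),
            @Product     varchar(40),
            @Quantity    varchar(5),
            @DueDate     datetime,
            @NewPersonID int OUTPUT      -- hypothetical output parameter
        AS
        BEGIN
            SET NOCOUNT ON;

            INSERT INTO dbo.PERSON (Name, Address)
            VALUES (@name, @address);

            -- identity generated by the INSERT above, scoped to this procedure
            SET @NewPersonID = SCOPE_IDENTITY();

            INSERT INTO dbo.PRODUCT_DATA (PersonID, Product, Quantity, DueDate)
            VALUES (@NewPersonID, @Product, @Quantity, @DueDate);
        END

    On the C# side the extra SqlParameter would be created with Direction = ParameterDirection.Output and read after ExecuteNonQuery(); alternatively, end the procedure with SELECT SCOPE_IDENTITY() and call ExecuteScalar() instead.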

    Read the article

  • What's the easiest and safest way to record data entered by a user on a web site?

    - by fred august
    Apologies, this is a tragically simple question that will bore most of you. I need to implement the simplest "leave your email and we'll contact you" web page. The simplest thing I could think of is an HTML form that calls a PHP script, which appends the data to a file on the server. It's easy to implement, but now I'm wondering if it's totally hackable. Is it? Are there obviously better ways that are still simple? Thanks, f
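    A minimal sketch of such a handler (not from the original post; the field name and file path are assumptions): the main safety points are validating the input, writing to a file that is not web-readable, and never echoing the raw value back into a page.

        <?php
        // contact.php - hedged sketch of the "append to a file" approach
        $email = filter_input(INPUT_POST, 'email', FILTER_VALIDATE_EMAIL);
        if ($email === false || $email === null) {
            http_response_code(400);
            exit('Please enter a valid email address.');
        }
        // append outside the public web root so the list cannot be downloaded
        file_put_contents('/var/data/leads.txt', $email . PHP_EOL, FILE_APPEND | LOCK_EX);
        echo 'Thanks, we will be in touch.';

    Writing to a database table with a prepared statement (or simply mailing the address to yourself) works just as well; the flat file is only really "hackable" if it sits under the web root or if unvalidated input is later displayed or executed.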

    Read the article

  • Rails - HABTM Relationship -- How Can I Find A Record Based On An Attribute Of The Associated Model

    - by ChrisWesAllen
    I have set up this HABTM relationship in the past and it worked before... Now it isn't working and I'm at my wits' end trying to figure out what's wrong. I've been looking through the Rails guides all day and can't seem to figure out what I'm doing wrong, so help would really be appreciated. I have 2 models connected through a join table, and I'm trying to find records based on an attribute of the associated model.

        # Event.rb
        has_and_belongs_to_many :interests

        # Interest.rb
        has_and_belongs_to_many :events

    and a join table migration that was created like:

        create_table 'events_interests', :id => false do |t|
          t.column :event_id, :integer
          t.column :interest_id, :integer
        end

    I tried

        @events = Event.all(:include => :interest, :conditions => ["interest.id = ?", 4])

    but got the error "Association named 'interest' was not found; perhaps you misspelled it?"... which I didn't, of course. I tried

        @events = Event.interests.find(:all, :conditions => ["interest.id = ?", 4])

    but got the error "undefined method `interests' for #Class:0x4383348". How can I find the Events that have an interest id of 4? I'm definitely going bald from this, lol.
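    A hedged sketch of the usual fix (not from the original post): both the association name and the table name in the SQL fragment are plural, so the Rails 2.x finder from the question becomes:

        @events = Event.all(:include    => :interests,
                            :conditions => ["interests.id = ?", 4])

    The second attempt fails because interests is an instance association, not a class method: it is called on a single event (some_event.interests), never on the Event class itself.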

    Read the article

  • How do I delete records in a table while keeping certain data?

    - by mathew
    My site has lots of incoming searches which are stored in a database to show recent queries on my website. Due to the high volume of search queries, my database is getting bigger. What I want is to keep only the most recent queries in the database, say 10 records. This keeps my database small and queries will be faster. I am able to store the incoming queries in the database, but I don't know how to restrict or delete the excess/old data from the table. Any help? I am using PHP and MySQL.
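    A hedged MySQL sketch (not from the original post; the table and column names are assumptions): keep the 10 newest rows by id and delete the rest. The LIMITed subquery is wrapped in a derived table because MySQL refuses to delete from a table it is selecting from directly:

        DELETE FROM recent_searches
        WHERE id NOT IN (
            SELECT id FROM (
                SELECT id
                FROM recent_searches
                ORDER BY id DESC
                LIMIT 10
            ) AS keepers
        );

    Running this after each insert, or from a periodic cron job, keeps the table at roughly 10 rows; an alternative is to leave the data in place and simply SELECT ... ORDER BY id DESC LIMIT 10 when displaying recent queries.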

    Read the article

  • How to get only one record for each duplicated group id in Oracle?

    - by Psychocryo
    Suppose I have this table:

        group_id | image | image_id
        ---------+-------+---------
           23    | blob  |    1
           23    | blob  |    2
           23    | blob  |    3
           21    | blob  |    4
           21    | blob  |    5
           25    | blob  |    6
           25    | blob  |    7

    How do I get a result with only one row for each group_id? In this case there may be multiple images for one group_id, and I just want one result per group_id. I tried DISTINCT, but then I only get the group_id. MAX on the image column also would not work.
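    A common Oracle approach (a hedged sketch, not from the original post; the table name is hypothetical): number the rows within each group and keep only the first one.

        SELECT group_id, image, image_id
        FROM (
            SELECT t.*,
                   ROW_NUMBER() OVER (PARTITION BY group_id
                                      ORDER BY image_id) AS rn
            FROM group_images t
        )
        WHERE rn = 1;

    DISTINCT cannot do this on its own because image and image_id differ between the duplicate group_id rows, and aggregates such as MAX are not defined for BLOB columns, which is why those attempts came up short.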

    Read the article

  • In Django, using __init__() method of non-abstract parent model to record class name of child model

    - by k-g-f
    In my Django project, I have a non-abstract parent model defined as follows:

        class Parent(models.Model):
            classType = models.CharField(editable=False, max_length=50)

    and, say, two child models defined as follows:

        class ChildA(Parent):
            parent = models.OneToOneField(Parent, parent_link=True)

        class ChildB(Parent):
            parent = models.OneToOneField(Parent, parent_link=True)

    Each time I create an instance of ChildA or of ChildB, I'd like the classType attribute to be set to the string "ChildA" or "ChildB" respectively. What I have done is add an __init__() method to Parent as follows:

        class Parent(models.Model):
            classType = models.CharField(editable=False, max_length=50)

            def __init__(self, *args, **kwargs):
                super(Parent, self).__init__(*args, **kwargs)
                self.classType = self.__class__.__name__

    Is there a better way to implement and achieve my desired result? One downside of this implementation is that when I have an instance of Parent, say "parent", and I want to get the type of the child object linked with "parent", calling "parent.classType" gives me "Parent". In order to get the appropriate "ChildA" or "ChildB" value, I need to write a "_getClassType()" method that wraps a custom SQL query.
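    One hedged alternative (an illustration, not from the original post): record the name at save time instead of in __init__. The value is then only set when the concrete child first persists itself, so loading the same row later through Parent returns whatever string the child stored, without any custom SQL:

        class Parent(models.Model):
            classType = models.CharField(editable=False, max_length=50)

            def save(self, *args, **kwargs):
                # only fill it in on first save; re-loading via Parent leaves it alone
                if not self.classType:
                    self.classType = self.__class__.__name__
                super(Parent, self).save(*args, **kwargs)

    For heavier use, Django's contenttypes framework (a ForeignKey to ContentType populated the same way) is the more idiomatic route, since it hands back the actual model class rather than a bare string.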

    Read the article

  • :from parameter in active record find not well designed?

    - by potlee
    I got this error:

        SQLite3::SQLException: no such column: apis.name: SELECT * FROM examples WHERE ("apis"."name" = 'deep')

    My code:

        Api.find :all, :from => params[:table_name], :conditions => { :name => 'deep' }

    I need to build a back-end Rails application which will be used by a Silverlight application. One of the requirements is to fetch simple data from the database, and I need to be able to query different tables with the same code (my app has 2000 tables!). I think it does not make sense for Rails to put "apis" in the WHERE clause. Is there any specific reason for this?
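    A hedged sketch of two workarounds (not from the original post): hash conditions are always qualified with the model's own table_name, which is why "apis" shows up even when :from points elsewhere. Either pass an unqualified string condition, or retarget the model before querying (Rails 2.x style, matching the question):

        # option 1: plain string condition, no table qualifier
        Api.find :all, :from => params[:table_name],
                       :conditions => ["name = ?", 'deep']

        # option 2: point the model at the requested table
        Api.set_table_name params[:table_name]
        Api.all :conditions => { :name => 'deep' }

    In both cases params[:table_name] should be validated against a whitelist of known table names before it gets anywhere near the SQL.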

    Read the article

  • How do I compare 2 fields and return the lowest value of each record?

    - by BigRob
    I'm slowly learning Access to make a database of products and suppliers for my parents' business. What I've got is a table of products indexed by our product reference and 2 more tables for 2 different suppliers that contain the supplier's product reference and price, linked by our reference. I've made a query that performs a left outer join such that it returns a table of our products with each supplier's reference and price, i.e.:

        Ref | Product Name | Supplier 1 Ref | Supplier 1 Price | Supplier 2 Ref | Supplier 2 Price

    Here's the query I used:

        SELECT Catalog.Ref, Catalog.[Product Name], Catalog.Price,
               [D Products].[Supplier Ref], [D Products].Cost,
               [GS Products].[Supplier Ref], [GS Products].Cost
        FROM ([Catalog]
        LEFT JOIN [D Products] ON Catalog.Ref = [D Products].Ref)
        LEFT JOIN [GS Products] ON Catalog.Ref = [GS Products].Ref;

    Not all products are available from both suppliers, hence the outer join. What I want to do (with a query?) is to take the table produced by the query above and simply show the product reference, cheapest supplier reference and cheapest supplier price, i.e.:

        Ref | Cheapest Supplier Ref | Cheapest Supplier Price

    Unfortunately my SQL knowledge isn't quite good enough to figure this out, but if anyone can help I'd really appreciate it. Thanks, Rob
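    One hedged way to finish this in Access SQL (a sketch, not from the original post; it assumes the join query above is saved as a named query with its duplicate columns aliased, e.g. [D Cost], [GS Cost], [D Supplier Ref], [GS Supplier Ref], and it treats a missing price as "more expensive" via Nz):

        SELECT q.Ref,
               IIf(Nz(q.[D Cost], 1E30) <= Nz(q.[GS Cost], 1E30),
                   q.[D Supplier Ref], q.[GS Supplier Ref]) AS [Cheapest Supplier Ref],
               IIf(Nz(q.[D Cost], 1E30) <= Nz(q.[GS Cost], 1E30),
                   q.[D Cost], q.[GS Cost]) AS [Cheapest Supplier Price]
        FROM [SupplierComparison] AS q;

    [SupplierComparison] is a hypothetical name for the saved join query; the same IIf/Nz expressions can also be added to the original query as extra computed columns.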

    Read the article

  • Remotely Schedule and Stream Recorded TV in Windows 7 Media Center

    - by DigitalGeekery
    Have you ever been away from home and suddenly realized you forgot to record your favorite program? Now Windows 7 Media Center, users can schedule recordings remotely from their phones or mobile devices with Remote Potato. How it Works Remote Potato installs server software on the host computer running Windows 7 Media Center. Once the software is installed, we’ll need to do some port forwarding on the router and setup an optional dynamic DNS address. When setup is completed, we will access the application through a web based interface. Silverlight is required for Streaming recorded TV, but scheduling recordings can be done through an HTML interface. Installing Remote Potato Download and install Remote Potato on the Media Center PC. (See download link below) If you plan to stream any Recorded TV, you’ll also want to install the streaming pack located on the same page. It isn’t required to stream all shows, only shows that require the AC3 audio codec. Click Yes to allow Remote Potato to add rules to the Windows Firewall for remote access. You’ll likely need to accept a few UAC prompts. When notified that the rules were added, click OK. Remote Potato will then prompt you to allow administrator privileges to reserve a URL for it’s web server. Click Yes. Remote Potato server will start. Click on the configuration button at the right to to reveal the settings tabs.   One the General tab, you’ll have the option to run Remote Potato on startup and minimized in the System Tray. If you’re running Media Center on a dedicated HTPC, you’ll probably want to enable both startup options. Forwarding Ports on Your Router You’ll need to forward a couple ports on your router. By default, these will be ports 9080 and 9081. In this example we’re using a Linksys WRT54GL router, however, the steps for port forwarding will vary from router to router. On the Linksys configuration page, click on the Applications & Gaming Tab, and then the Port Range Forward tab. Under Application, type in a name of your choosing. In both the Start and End boxes, type the port number 9080. Enter the local IP address of your Media Center computer in the IP address column. Click the check box under Enable. Repeat the process on the next line, but this time use port 9081. When finished, click the Save Settings button. Note: It’s highly recommended that you configure the home computer running Media Center & Remote Potato with a static IP address.   Find your IP Address You’ll need to find the IP address assigned to your router from your ISP. There are many ways to do this but a quick and easy way is to visit a site like checkip.dyndns.org (link available below) The current external IP address of your router will be displayed in the browser.   Dynamic DNS This is an optional step, but  it’s highly recommended. Many routers, such as the Linksys WRT54GL we are using, support Dynamic DNS (DDNS). What Dynamic DNS allows you to do is affiliate your home router’s external IP address to a domain name. Every time your home router is assigned a a new IP address by your ISP, the domain name is updated to point to your new IP address. Remote Potato’s user interface is accessed over the Internet is by connecting to your router’s IP address followed by a colon and the port number. 
(Ex: XXX.XXX.XXX.XXX:9080) Instead of constantly having to look up and remember an IP address, you can use DDNS along with a 3rd party provider like DynDNS.com, to sign up for a free domain name and configure it to be updated each time your router is assigned a new IP address. Go to the DynDNS.com website (See link at the end of the article) and sign up for a free Domain name. You’ll need to register and confirm by email.   Once you’ve signed in and selected your domain name click Activate Services. You’ll get a confirmation message that your domain name has been activated.    On the Linksys WRT54GL click on the Setup tab an then DDNS. Select DynDNS.org, or TZO.com if you prefer to use their service, from the drop down list.   With DynDNS, you’ll need to fill in your username and password you signed up with at the DynDNS website and the hostname you chose. Note: You can connect over your local network with the IP Address of the computer running Remote Potato followed by a colon and the port number. Ex: 192.168.1.2:9080 Logging in Remote Potato and Recording a Show Once you connect, you’ll see the start page. To view the TV listings, click on TV Guide. You’ll then see your guide listings. There are a few ways to navigate the listings. At the top left, you can click on any of the preset time buttons to jump to  the listings at that time of the day.  Click on the arrows to the right and left of the day and date at the top center to proceed to the previous or next day. Or, jump to a specific day with the date and date buttons at the top right.   To setup a recording, click on a program.   You can choose to record the individual show or the entire series by clicking on Record Show or Record Series.   Remote Potato on Mobile Devices Perhaps the coolest feature of Remote Potato is the ability to schedule recording from your phone or mobile device. Note: For any devices or computers without Silverlight, you will be prompted to view the HTML page. Select Browse Listings. Select your program to record. In the Program Details, select Record Show to record the single episode or Record Series to record all instances of the series. You will then see a red dot on the program listing to indicate that the show is scheduled for recording.   Streaming Recorded TV Click on Recorded TV from the home screen to access your previously recorded TV programs. Click on the selection you wish to stream. Click on Play. If you receive this error message, you’ll need to install the streaming pack for Remote Potato. This is found on the same download page as installation files. (See link below) The Begin from slider allows you to start playback from the start (by default) or a different time of the program by moving the slider. The Quality (bitrate) setting  allows you to choose the quality of the playback. We found the video quality on the Normal setting to be pretty lousy, and Low was just pointless. High was the best overall viewing experience as it provided smooth quality video playback. We experienced significant stuttering during playback using the Ultra High setting.   Click Start when you are ready to begin. When playback begins you’ll see a slider at the top right.   Move the slider left or right to increase or decrease the size of the video. There’s also a button to switch to full screen.   Media Center users who travel frequently or are always on the go will likely find Remote Potato to be a blessing. Since being released earlier this year, updates for Remote Potato have come fast and furious. 
The latest beta release includes support for streaming music and photos. If you like those nice network TV logos, check out our article on adding TV channel logos to Windows Media Center. Downloads and Links: Download Remote Potato and Streaming Pack | Find your IP address | Sign Up for a Domain Name at DynDNS.com

    Read the article

  • PostgreSQL 8.4 won't start after a blackout

    - by RiZe
    I have problem with starting PostgreSQL 8.4 on Ubuntu 9.10 Server after blackout. When I try to connect to the database it says: psql: server closed the connection unexpectedly This probably means the server terminated abnormally before or while processing the request. When I try to start it by using command sudo -u postgres /etc/init.d/postgresql-8.4 start * Starting PostgreSQL 8.4 database server [ OK ] Netstat output netstat -tulp (No info could be read for "-p": geteuid()=1000 but you should be root.) Active Internet connections (only servers) Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name tcp 0 0 localhost:postgresql *:* LISTEN - tcp 0 0 192.168.1.35:svn *:* LISTEN - tcp 0 0 192.168.1.35:http-alt *:* LISTEN - tcp 0 0 *:ssh *:* LISTEN - tcp6 0 0 localhost:postgresql [::]:* LISTEN - tcp6 0 0 [::]:ssh [::]:* LISTEN - udp 0 0 *:bootpc *:* - But still don't work so lets restart it sudo -u postgres /etc/init.d/postgresql-8.4 restart * Restarting PostgreSQL 8.4 database server * The PostgreSQL server failed to start. Please check the log output: 2009-11-30 13:39:37 CET LOG: database system was shut down at 2009-11-30 13:39:33 CET 2009-11-30 13:39:37 CET LOG: autovacuum launcher started 2009-11-30 13:39:37 CET LOG: database system is ready to accept connections 2009-11-30 13:39:37 CET LOG: incomplete startup packet 2009-11-30 13:39:38 CET LOG: server process (PID 2240) was terminated by signal 11: Segmentation fault 2009-11-30 13:39:38 CET LOG: terminating any other active server processes 2009-11-30 13:39:38 CET LOG: all server processes terminated; reinitializing 2009-11-30 13:39:38 CET LOG: database system was interrupted; last known up at 2009-11-30 13:39:37 CET 2009-11-30 13:39:38 CET LOG: database system was not properly shut down; automatic recovery in progress 2009-11-30 13:39:38 CET LOG: record with zero length at 0/11D464C 2009-11-30 13:39:38 CET LOG: redo is not required 2009-11-30 13:39:38 CET LOG: autovacuum launcher started 2009-11-30 13:39:38 CET LOG: database system is ready to accept connections 2009-11-30 13:39:38 CET LOG: server process (PID 2248) was terminated by signal 11: Segmentation fault 2009-11-30 13:39:38 CET LOG: terminating any other active server processes 2009-11-30 13:39:38 CET LOG: all server processes terminated; reinitializing 2009-11-30 13:39:38 CET LOG: database system was interrupted; last known up at 2009-11-30 13:39:38 CET 2009-11-30 13:39:38 CET LOG: database system was not properly shut down; automatic recovery in progress 2009-11-30 13:39:38 CET LOG: record with zero length at 0/11D4690 2009-11-30 13:39:38 CET LOG: redo is not required 2009-11-30 13:39:39 CET LOG: autovacuum launcher started 2009-11-30 13:39:39 CET LOG: database system is ready to accept connections 2009-11-30 13:39:39 CET LOG: server process (PID 2256) was terminated by signal 11: Segmentation fault 2009-11-30 13:39:39 CET LOG: terminating any other active server processes 2009-11-30 13:39:39 CET LOG: all server processes terminated; reinitializing 2009-11-30 13:39:39 CET LOG: database system was interrupted; last known up at 2009-11-30 13:39:38 CET 2009-11-30 13:39:39 CET LOG: database system was not properly shut down; automatic recovery in progress 2009-11-30 13:39:39 CET LOG: record with zero length at 0/11D46D4 2009-11-30 13:39:39 CET LOG: redo is not required 2009-11-30 13:39:39 CET LOG: autovacuum launcher started 2009-11-30 13:39:39 CET LOG: database system is ready to accept connections 2009-11-30 13:39:39 CET LOG: server process (PID 2264) 
was terminated by signal 11: Segmentation fault 2009-11-30 13:39:39 CET LOG: terminating any other active server processes 2009-11-30 13:39:39 CET LOG: all server processes terminated; reinitializing 2009-11-30 13:39:39 CET LOG: database system was interrupted; last known up at 2009-11-30 13:39:39 CET 2009-11-30 13:39:39 CET LOG: database system was not properly shut down; automatic recovery in progress 2009-11-30 13:39:40 CET LOG: record with zero length at 0/11D4718 2009-11-30 13:39:40 CET LOG: redo is not required 2009-11-30 13:39:40 CET LOG: autovacuum launcher started 2009-11-30 13:39:40 CET LOG: database system is ready to accept connections 2009-11-30 13:39:40 CET LOG: server process (PID 2272) was terminated by signal 11: Segmentation fault 2009-11-30 13:39:40 CET LOG: terminating any other active server processes 2009-11-30 13:39:40 CET LOG: all server processes terminated; reinitializing 2009-11-30 13:39:40 CET LOG: database system was interrupted; last known up at 2009-11-30 13:39:40 CET 2009-11-30 13:39:40 CET LOG: database system was not properly shut down; automatic recovery in progress 2009-11-30 13:39:40 CET LOG: record with zero length at 0/11D475C 2009-11-30 13:39:40 CET LOG: redo is not required 2009-11-30 13:39:40 CET LOG: autovacuum launcher started 2009-11-30 13:39:40 CET LOG: database system is ready to accept connections 2009-11-30 13:39:41 CET LOG: server process (PID 2280) was terminated by signal 11: Segmentation fault 2009-11-30 13:39:41 CET LOG: terminating any other active server processes 2009-11-30 13:39:41 CET LOG: all server processes terminated; reinitializing 2009-11-30 13:39:41 CET LOG: database system was interrupted; last known up at 2009-11-30 13:39:40 CET 2009-11-30 13:39:41 CET LOG: database system was not properly shut down; automatic recovery in progress 2009-11-30 13:39:41 CET LOG: record with zero length at 0/11D47A0 2009-11-30 13:39:41 CET LOG: redo is not required 2009-11-30 13:39:41 CET LOG: autovacuum launcher started 2009-11-30 13:39:41 CET LOG: database system is ready to accept connections 2009-11-30 13:39:41 CET LOG: server process (PID 2288) was terminated by signal 11: Segmentation fault 2009-11-30 13:39:41 CET LOG: terminating any other active server processes 2009-11-30 13:39:41 CET LOG: all server processes terminated; reinitializing 2009-11-30 13:39:41 CET LOG: database system was interrupted; last known up at 2009-11-30 13:39:41 CET 2009-11-30 13:39:41 CET LOG: database system was not properly shut down; automatic recovery in progress 2009-11-30 13:39:41 CET LOG: record with zero length at 0/11D47E4 2009-11-30 13:39:41 CET LOG: redo is not required 2009-11-30 13:39:41 CET LOG: autovacuum launcher started 2009-11-30 13:39:41 CET LOG: database system is ready to accept connections 2009-11-30 13:39:42 CET LOG: server process (PID 2296) was terminated by signal 11: Segmentation fault 2009-11-30 13:39:42 CET LOG: terminating any other active server processes 2009-11-30 13:39:42 CET LOG: all server processes terminated; reinitializing 2009-11-30 13:39:42 CET LOG: database system was interrupted; last known up at 2009-11-30 13:39:41 CET 2009-11-30 13:39:42 CET LOG: database system was not properly shut down; automatic recovery in progress 2009-11-30 13:39:42 CET LOG: record with zero length at 0/11D4828 2009-11-30 13:39:42 CET LOG: redo is not required 2009-11-30 13:39:42 CET LOG: autovacuum launcher started 2009-11-30 13:39:42 CET LOG: database system is ready to accept connections 2009-11-30 13:39:42 CET LOG: server process (PID 2304) 
was terminated by signal 11: Segmentation fault 2009-11-30 13:39:42 CET LOG: terminating any other active server processes 2009-11-30 13:39:42 CET LOG: all server processes terminated; reinitializing 2009-11-30 13:39:42 CET LOG: database system was interrupted; last known up at 2009-11-30 13:39:42 CET 2009-11-30 13:39:42 CET LOG: database system was not properly shut down; automatic recovery in progress 2009-11-30 13:39:42 CET LOG: record with zero length at 0/11D486C 2009-11-30 13:39:42 CET LOG: redo is not required 2009-11-30 13:39:43 CET LOG: autovacuum launcher started 2009-11-30 13:39:43 CET LOG: database system is ready to accept connections 2009-11-30 13:39:43 CET LOG: server process (PID 2312) was terminated by signal 11: Segmentation fault 2009-11-30 13:39:43 CET LOG: terminating any other active server processes 2009-11-30 13:39:43 CET LOG: all server processes terminated; reinitializing 2009-11-30 13:39:43 CET LOG: database system was interrupted; last known up at 2009-11-30 13:39:42 CET 2009-11-30 13:39:43 CET LOG: database system was not properly shut down; automatic recovery in progress 2009-11-30 13:39:43 CET LOG: record with zero length at 0/11D48B0 2009-11-30 13:39:43 CET LOG: redo is not required 2009-11-30 13:39:43 CET LOG: autovacuum launcher started 2009-11-30 13:39:43 CET LOG: database system is ready to accept connections [fail] So what happened and what can I do to solve this? Thanks for replies

    Read the article

  • SOA Suite Integration: Part 3: Loading files

    - by Anthony Shorten
    One of the most common scenarios in SOA Integration is the loading of a file into the product from an external source. In Oracle SOA Suite there is a File Adapter that can process many file types into your BPEL process. For this example I will use the File Adapter to load a file of user and emails to update the user object within the Oracle Utilities Application Framework. Remember you can repeat this process with other objects and other file types. Again I am illustrating the ease of integration. The first thing is to create an empty BPEL process that will hold our flow. In Oracle JDeveloper this can be achieved by specifying the Define Service Later template (as other templates have predefined inputs and outputs and in this case we want to specify those). So I will create simpleFileLoad process to house our process. You will start with an empty canvas so you need to first specify the load part of the process using the File Adapter. Select the File Adapter from the Component Palette under BPEL Services and drag and drop it to the left side Partner Links (left is input). You name the Service. In this case I chose LoadFile. Press Next. We will define the interface as part of the wizard so select Define from operation and schema (specified later). Press Next. We are going to choose Read File to denote that we will read the file and specify the default Operation Name as Read. Press Next. The next step is to tell the Adapter the location of the files, how to process them and what to do with them after they have been processed. I am using hardcoded locations in this example but you can have logical locations as well. Press Next. I am now going to tell the adapter how to recognize the files I want to load. In my case I am using CSV files and more importantly I am tell the adapter to run the process for each record in the file it encounters. Press Next. Now, I tell the adapter how often I want to poll for the files. I have taken the defaults. Press Next. At this stage I have no explanation of the format of the input. So I am going to invoke the Native Format Wizard which will guide me through the process of creating the file input format. Clicking the purple cog icon will start the wizard. After an introduction screen (not shown), you specify the format of the input file. The File Adapter supports multiple format types. For this example, I will use Delimited as I am going to load a CSV file. Press Next. The best way for the wizard to work is with a sample. I have a sample file and the wizard will ask how much of the file to use as a template. I will use the defaults. Note: If you are using a language that has other languages other than US-ASCII, it is at this point you specify the character set to use.  Press Next. The sample contains multiple instances of a single record type. The wizard supports complex types as well. We will use the appropriate setting for our file. Press Next. You have to specify the file element and the record element. This will be used by the input wizard to translate the CSV data into an XML structure (this will make sense later). I am using LoadUsers as my file delimiter (root element) and User Record as my record root element. Press Next. As the file is CSV the delimiter is "," so I will also specify that the End Of Line (EOL) indicator indicates the end of a record. Press Next. Up until this point your have not given the columns their names. In my case my sample includes the column names in the first record. 
This is not always the case but you can specify the names and formats of columns in this dialog (not shown). Press Next. The wizard now generates the schema for the input file. You can specify a name for the schema. I have used userupdate.xsd. We want to verify the schema so press Test. You can test the schema by specifying an input sample. and pressing the green play button. You will see the delimiters you specified earlier for the file and the records. Press Ok to continue. A confirmation screen will be displayed showing you the location of the schema in your project. Press Finish to return to the File Adapter configuration. You will now see the schema and elements prepopulated from the wizard. Press Next. The File Adapter configuration is now complete. Press Finish. Now you need to receive the input from the LoadFile component so we need to place a Receive node in the BPEL process by drag and dropping the Receive component from the Component Palette under BPEL Constructs onto the BPEL process. We link the receive process with the LoadFile component by dragging the left most connect node of the Receive node to the LoadFile component. Once the link is established you need to name the Receive node appropriately and as in the post of the last part of this series you need to generate input variables for the BPEL process to hold the input records in. You need to now add the product Web Service. The process is the same as described in the post of the last part of this series. You drop the Web Service BPEL Service onto the right side of the process and fill in the details of the WSDL URL . You also have to add an Invoke node to call the service and generate the input and outputs variables for the call in the Invoke node. Now, to get the inputs from File to the service. You have to use a Transform (you can use an Assign action but a Transform action is more flexible). You drag and drop the Transform component from the Component Palette under Oracle Extensions and place it between the Receive and Invoke nodes. We name the Transform Node, Mapper File and associate the source of the mapping the schema from the Receive node and the output will be the input variable from the Invoke node. We now build the transform. We first map the user and email attributes by drag and drop the elements from the left to the right. The reason we needed to use the transform is that we will be telling the AS-User service that we want to issue an update action. Remember when we registered the service we actually used Read as the default. If we do not otherwise inform the service to use the Update action it will use the Read action instead (which is not desired). To specify the update action you need to click on the transactionType node on the right and select Set Text to set the action. You need to specify the transactionType of UPD (for update). The mapping is now complete. The final BPEL process is ready for deployment. You then deploy the BPEL process to the server and to test the service by simply dropping a file, in the same pattern/name as you specified, in the directory you specified in the File Adapter. You will see each record as a separate instance entry in the Fusion Middleware Control console. You can now load files into the product. You can repeat this process for each type of file to process. While this was a simple example it illustrates the method of loading data can be achieved using SOA Suite in conjunction with our products.

    Read the article

  • Passing SQL results to views hard-codes views to database column names

    - by Galen
    I just realized that I may not be following best practices with regard to the MVC pattern. My issue is that my views "know" information about my database. Here's my situation in pseudo-code.

    My controller invokes a method on my model and passes the result directly to the view:

        view.records = tableGateway.getRecords()
        view.display()

    In my view:

        each records as record
            print record.name
            print record.address
            ...

    In my view I have record.name and record.address, info that's hard-coded to my database. Is this bad? What other ways around it are there, other than iterating over everything in the controller and basically rewriting the records collection? And that just seems silly. Thanks
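    A hedged sketch of one middle ground (not from the original post; PHP is used only for illustration and all names are hypothetical): map rows to small view objects in a single mapper, so only that mapper knows the column names and the templates depend on the view object's fields instead.

        <?php
        // The mapper owns the column names; templates only know ContactView's fields.
        class ContactView {
            public $name;
            public $address;
        }

        function toContactViews(array $rows): array {
            return array_map(function (array $row) {
                $view = new ContactView();
                $view->name    = $row['name'];      // column names appear here only
                $view->address = $row['address'];
                return $view;
            }, $rows);
        }

        // controller: $view->records = toContactViews($tableGateway->getRecords());

    Renaming a column then touches one function instead of every template, which is usually considered an acceptable amount of "rewriting the records collection".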

    Read the article

  • DataMapper is only returning the last part of this query

    - by Josh K
    So I'm using the following:

        $r = new Record();
        $r->select('ip, count(*) as ipcount');
        $r->group_by('ip');
        $r->order_by('ipcount', 'desc');
        $r->limit(5);
        $r->get();

        foreach ($r->all as $record) {
            echo($record->ip . " ");
            echo($record->ipcount . " <br />");
        }

    And I only get the last (fifth) record echoed out, and no ipcount echoed. Is there a different way to go around this? I'm working on learning DataMapper (hence the questions) and need to figure some of this out. I haven't quite wrapped my head around the whole ORM thing.

    Read the article

  • UpdateModel prefix - ASP.NET MVC

    - by Kristoffer Ahl
    I'm having trouble with TryUpdateModel(). My form fields are named with a prefix, but I am using "-" as my separator rather than the default dot.

        <input type="text" id="Record-Title" name="Record-Title" />

    When I try to update the model, it does not get updated. If I change the name attribute to Record.Title it works perfectly, but that is not what I want to do.

        bool success = TryUpdateModel(record, "Record");

    Is it possible to use a custom separator?

    Read the article

  • Authlogic: Create records on other users' behalf

    - by Friðrik
    Hi. Using Authlogic, what is the best way to create a record in Rails on another user's behalf? Description: I have a C++ server which handles TCP connections from many C++ clients, and I want the C++ server to create a new record in the Rails database using its REST API. However, the C++ server needs to be authenticated before creating that record. What I want is to attach the original user ID (from the C++ client) to the record (not the server's) so I know which user the record came from. One way is for the C++ client to send its persistence token over to the C++ server, which sends that token as a parameter to the create action; does that make sense? Or are there maybe some better ways to do this? I have a Rails app which uses Authlogic for authentication. I also have another C++ client which logs in and provides I have a C++ server which uses
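    A hedged sketch of that idea (not from the original post; the param, column and association names are assumptions): Authlogic's single_access_token column is intended for token-in-the-params API access, so the C++ server can forward the client's token and the controller can resolve the originating user from it before building the record.

        class RecordsController < ApplicationController
          def create
            # resolve the user the C++ client authenticated as
            originating_user = User.find_by_single_access_token(params[:user_token])
            if originating_user.nil?
              render :text => "unknown user token", :status => :unauthorized
              return
            end

            record = originating_user.records.build(params[:record])   # assumes User has_many :records
            if record.save
              render :xml => record, :status => :created
            else
              render :xml => record.errors, :status => :unprocessable_entity
            end
          end
        end

    Forwarding a single-access token is generally preferable to forwarding the persistence token, since the persistence token is effectively the user's session cookie.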

    Read the article

  • Solr date range problem and specific date problem

    - by Hamid
    I index some data, including dates, in Solr, but when I search for a specific date I get some records (not all of them), including some records from the next day. For example:

        http://localhost:8080/solr/select/?q=pubdate:[2010-03-25T00:00:00Z TO 2010-03-25T23:59:59Z]&start=0&rows=10&indent=on&sort=pubdate desc

    I have 625000 records for 2010-03-25, but the query above returns 325412 results, including 14 records from 2010-03-26. I also tried the query below, but did not get the right result either:

        http://localhost:8080/solr/select/?q=pubdate:"2010-03-25T00:00:00Z"&start=0&rows=10&indent=on&sort=pubdate desc

    How do I get the right result for a specific date? Could you please help me? Thanks in advance, Hamid
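    A hedged guess at the usual cause (not from the original post): symptoms like this typically mean the pubdate values were not normalized to UTC at index time, so late-evening local times end up in the "wrong" UTC day; the fix is to convert to UTC (or round the dates) before indexing. For the query itself, Solr date math keeps the range aligned to whole days without hand-writing the end timestamp:

        q=pubdate:[2010-03-25T00:00:00Z TO 2010-03-25T00:00:00Z+1DAY]

    If documents stamped exactly at the following midnight must be excluded, index pubdate rounded to the day (e.g. 2010-03-25T00:00:00Z/DAY) and query that single rounded value instead of a range. The exact-match query in the question only returns documents whose pubdate is exactly midnight, which is why it also looked wrong.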

    Read the article
