Search Results

Search found 31311 results on 1253 pages for 'oracle product info'.


  • Cannot use a Like query in a JSP prepared statement?

    - by SeerUK
    OK, first the query code and query: ps = conn.prepareStatement("select instance_id, ? from eam_measurement where resource_id in (select RESOURCE_ID from eam_res_grp_res_map where resource_group_id = ?) and DSN like '?' order by 2"); ps.setString(1,"SUBSTR(DSN,27,16)"); ps.setInt(2,defaultWasGroup); ps.setString(3,"%Module=jvmRuntimeModule:freeMemory%"); rs = ps.executeQuery(); while (rs.next()) { bla blah blah blah ... This returns an empty ResultSet. Through basic debugging I have found it's the 3rd bind that is the problem, i.e. DSN like '?' I have tried all kinds of variations, the most sensible of which seemed to be using: DSN like concat('%',?,'%') but that doesn't work, as I am missing the ' ' either side of the concatenated string, so I try DSN like ' concat('%',Module=P_STAG_JDBC01:poolSize,'%') ' order by 2 but I just can't seem to find a way to get them in that works. What am I missing? :) Thanks!
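
    Two things defeat the binds here: a bound value is always treated as data, so binding "SUBSTR(DSN,27,16)" selects that literal string rather than evaluating the expression, and '?' inside quotes is a literal question mark, not a placeholder. A minimal sketch of how the statement itself might look (assuming Oracle), with the SUBSTR written into the SQL text and the wildcards concatenated around an unquoted placeholder:
      -- Sketch only: bind 1 = resource_group_id, bind 2 = the search fragment
      -- (alternatively keep "dsn like ?" and include the % signs in the bound value).
      select instance_id,
             substr(dsn, 27, 16)
        from eam_measurement
       where resource_id in (select resource_id
                               from eam_res_grp_res_map
                              where resource_group_id = ?)
         and dsn like '%' || ? || '%'
       order by 2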

    Read the article

  • Begin Viewing Query Results Before Query Ends

    - by Frank Developer
    OK, so say I have a table with 500K rows, and I run an ad-hoc query with no supporting index, which requires a full table scan. I would like to immediately view the first rows returned while the full table scan continues. Then I want to scroll through the next results. In the meantime, I would like to display the progress of the table scan, for example: "SEARCHING.. FOUND 23 OF 500,000 ROWS SO FAR". If I scroll too far ahead, I want to display a message like: "REACHED LAST ROW IN LOOK-AHEAD BUFFER.. QUERY HAS NOT COMPLETED". Can this be done? Maybe like: spawn/exec, declare scroll cursor, open, fetch, etc.?

    Read the article

  • Loop to check all 14 days in the pay period

    - by Rachel Ann Arndt
    Name: Calc_Anniversary Input: Pay_Date, Hire_Date, Termination_Date Output: "Y" if is the anniversary of the employee's Hire_Date, "N" if it is not, and "T" if he has been terminated before his anniversary. Description: Create local variables to hold the month and day of the employee's Date_of_Hire, Termination_Date, and of the processing date using the TO_CHAR function. First check to see if he was terminated before his anniversary. The anniversary could be on any day during the pay period, so there will be a loop to check all 14 days in the pay period to see if one was his anniversary. CREATE OR replace FUNCTION Calc_anniversary( incoming_anniversary_date IN VARCHAR2) RETURN BOOLEAN IS hiredate VARCHAR2(20); terminationdate VARCHAR(20); employeeid VARCHAR2(38); paydate NUMBER := 0; BEGIN SELECT Count(arndt_raw_time_sheet_data.pay_date) INTO paydate FROM arndt_raw_time_sheet_data WHERE paydate = incoming_anniversary_date; WHILE paydate <= 14 LOOP SELECT To_char(employee_id, '999'), To_char(hire_date, 'DD-MON'), To_char(termination_date, 'DD-MON') INTO employeeid, hiredate, terminationdate FROM employees, time_sheet WHERE employees.employee_id = time_sheet.employee_id AND paydate = pay_date; IF terminationdate > hiredate THEN RETURN 'T'; ELSE IF To_char(SYSDATE, 'DD-MON') = To_char(hiredate, 'DD-MON')THEN RETURN 'Y'; ELSE RETURN 'N'; END IF; END IF; paydate := paydate + 1; END LOOP; END; Tables I am using CREATE TABLE Employees ( EMPLOYEE_ID INTEGER, FIRST_NAME VARCHAR2(15), LAST_NAME VARCHAR2(25), ADDRESS_LINE_ONE VARCHAR2(35), ADDRESS_LINE_TWO VARCHAR2(35), CITY VARCHAR2(28), STATE CHAR(2), ZIP_CODE CHAR(10), COUNTY VARCHAR2(10), EMAIL VARCHAR2(16), PHONE_NUMBER VARCHAR2(12), SOCIAL_SECURITY_NUMBER VARCHAR2(11), HIRE_DATE DATE, TERMINATION_DATE DATE, DATE_OF_BIRTH DATE, SPOUSE_ID INTEGER, MARITAL_STATUS CHAR(1), ALLOWANCES INTEGER, PERSONAL_TIME_OFF FLOAT, CONSTRAINT pk_employee_id PRIMARY KEY (EMPLOYEE_ID), CONSTRAINT fk_spouse_id FOREIGN KEY (SPOUSE_ID) REFERENCES EMPLOYEES (EMPLOYEE_ID)) / CREATE TABLE Arndt_Raw_Time_Sheet_data ( EMPLOYEE_ID INTEGER, PAY_DATE DATE, HOURS_WORKED FLOAT, SALES_AMOUNT FLOAT, CONSTRAINT pk_employee_id_pay_date_time PRIMARY KEY (EMPLOYEE_ID, PAY_DATE), CONSTRAINT fk_employee_id_time FOREIGN KEY (EMPLOYEE_ID) REFERENCES EMPLOYEES (EMployee_ID)); error FUNCTION Calc_Anniversary compiled Warning: execution completed with warning

    Read the article

  • Are conditional subqueries optimized away?

    - by Tobias Schulte
    I have a table foo and a table bar, where each foo might have a bar (and a bar might belong to multiple foos). Now I need to select all foos with a bar. My sql looks like this SELECT * FROM foo f WHERE [...] AND ($param IS NULL OR (SELECT ((COUNT(*))>0) FROM bar b WHERE f.bar = b.id)) with $param being replaced at runtime. The question is: Will the subquery be executed even if param is null, or will the dbms optimize the subquery out?
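
    Whether the inner query is evaluated when $param is null is up to the optimizer, so it varies by DBMS and version; the execution plan is the only reliable answer. For what it's worth, a sketch of an equivalent shape using EXISTS, which lets the planner stop at the first matching bar row instead of counting them all (table names as in the question, the other WHERE conditions elided):
      select f.*
        from foo f
       where ( :param is null
               or exists (select 1
                            from bar b
                           where b.id = f.bar) )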

    Read the article

  • DJANGO complex modelling

    - by SledgehammerPL
    Hello. My model currently looks like this: a receipt contains components, and a component contains a product. The difference between a component and a product is that a component has a quantity and a measure unit: e.g. a component is 100g of sugar - sugar is the product. So I need to create lots of components to satisfy different recipes - 100g of sugar is not equal to 200g of sugar. I wonder if I can remodel this to get rid of components - in pure SQL it's rather easy, but I'm trying to USE Django - not make workarounds. class Receipt(models.Model): name = models.CharField(max_length=128) (...) components = models.ManyToManyField(Component) class Component(models.Model): quantity = models.FloatField(max_length=9) unit = models.ForeignKey(Unit) product = models.ForeignKey(Product) class Product(models.Model): name = models.CharField(max_length = 128) TIA

    Read the article

  • odp.net SQL query: retrieve a set of rows from two input arrays

    - by Karl Trumstedt
    I have a table with a primary key consisting of two columns. I want to retrieve a set of rows based on two input arrays, each corresponding to one primary key column. select pkt1.id, pkt1.id2, ... from PrimaryKeyTable pkt1, table(:1) t1, table(:2) t2 where pkt1.id = t1.column_value and pkt1.id2 = t2.column_value I then bind the values with two int[] in odp.net. This returns all different combinations of my resulting rows. So if I am expecting 13 rows I receive 169 rows (13*13). The problem is that each value in t1 and t2 should be linked. Value t1[4] should be used with t2[4] and not all the different values in t2. Using distinct solves my problem, but I'm wondering if my approach is wrong. Anyone have any pointers on how to solve this the best way? One way might be to use a for-loop accessing each index in t1 and t2 sequentially, but I wonder what will be more efficient. Edit: actually distinct won't solve my problem, it just did it based on my input-values (all values in t2 = 0)
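
    The two collections are unnested independently, so the join sees every combination of the two arrays rather than the matched pairs. One way around that is to pass the keys as a single collection of pairs and join on both attributes; a sketch with hypothetical type names (the ODP.NET side would then bind one UDT collection instead of two int arrays):
      -- Hypothetical types, not part of the original schema
      create type key_pair     as object (id number, id2 number);
      /
      create type key_pair_tab as table of key_pair;
      /
      select pkt1.id, pkt1.id2
        from PrimaryKeyTable pkt1
        join table(:1) t
          on pkt1.id  = t.id
         and pkt1.id2 = t.id2;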

    Read the article

  • Cost of logic in a query

    - by FrustratedWithFormsDesigner
    I have a query that looks something like this: select xmlelement("rootNode", (case when XH.ID is not null then xmlelement("xhID", XH.ID) else xmlelement("xhID", xmlattributes('true' AS "xsi:nil"), XH.ID) end), (case when XH.SER_NUM is not null then xmlelement("serialNumber", XH.SER_NUM) else xmlelement("serialNumber", xmlattributes('true' AS "xsi:nil"), XH.SER_NUM) end), /*repeat this pattern for many more columns from the same table...*/ FROM XH WHERE XH.ID = 'SOMETHINGOROTHER' It's ugly and I don't like it, and it is also the slowest executing query (there are others of similar form, but much smaller and they aren't causing any major problems - yet). Maintenance is relatively easy as this is mostly a generated query, but my concern now is for performance. I am wondering how much of an overhead there is for all of these case expressions. To see if there was any difference, I wrote another version of this query as: select xmlelement("rootNode", xmlforest(XH.ID, XH.SER_NUM,... (I know that this query does not produce exactly the same, thing, my plan was to move the logic to PL/SQL or XSL) I tried to get execution plans for both versions, but they are the same. I'm guessing that the logic does not get factored into the execution plan. My gut tells me the second version should execute faster, but I'd like some way to prove that (other than writing a PL/SQL test function with timing statements before and after the query and running that code over and over again to get a test sample). Is it possible to get a good idea of how much the case-when will cost? Also, I could write the case-when using the decode function instead. Would that perform better (than case-statements)?

    Read the article

  • Refactoring PL/SQL triggers - extract procedures

    - by Juraj
    Hello, we have an application where the database contains large parts of the business logic in triggers, with an update subsequently firing triggers on several other tables. I want to refactor the mess and wanted to start by extracting procedures from the triggers, but can't find any reliable tool to do this. Using "Extract procedure" in both SQL Developer and Toad failed to properly handle the :new and :old trigger variables. If you have had a similar problem with triggers, did you find a way around it? EDIT: Ideally, only columns that are referenced by the extracted code would be sent as in/out parameters, like: Example of original code to be extracted from trigger: ..... if :new.col1 = some_var then :new.col1 := :old.col1 end if ..... would become : procedure proc(in old_col1 varchar2, in out new_col1 varchar2, some_var varchar2) is begin if new_col1 = some_var then new_col1 := old_col1 end if; end; ...... proc(:old.col1,:new.col1, some_var);

    Read the article

  • Persist data when the table was not mapped (JPA EclipseLink)

    - by enrique
    Hi everybody, I need some help persisting data into a table that has not been mapped... The issue is that our database has a table in which all of the columns are foreign keys, so when the whole database is mapped, all of the other tables come out correctly mapped, but that table, called "category", is not. The way we can browse the data is by going through the table I mentioned, using the @JoinTable annotation that the system set on the other tables with which "category" has a relation. So we can use the collections and perform a query. But the problem comes when I want to persist data into that table, because there's no entity for it. We tried to persist through the collections, but no luck. Then I tried creating the entity, with its PK and facade, all by hand. However, when I try to persist using the merge method, the system tries to perform an Insert when it is supposed to perform an Update, so obviously it returns an error. Does anybody have an idea about this situation? Thanks.

    Read the article

  • 600 tables in the database

    - by Michael
    Hi all, I'm a very young software architect. I'm now working at a very large bank, and I have to lead a group of developers in rewriting the bank's entire mortgage system. Looking at the database tables, I realize there is no data model and no documentation. The worst part is that there are about 1000 tables in the dev environment, and about 600 in production. I trust the production environment more, but either way, what can I do? Short of despairing, is there any good reverse engineering tool, so that at least I could get the schema definition, with the relations between tables and the comments extracted from the fields? Can you advise me? Thanks in advance.
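
    If this is an Oracle database, a useful first pass can come straight out of the data dictionary before reaching for a modelling tool; a sketch, run as the schema owner:
      -- Foreign-key relationships between tables
      select c.table_name,
             c.constraint_name,
             r.table_name as referenced_table
        from user_constraints c
        join user_constraints r
          on r.constraint_name = c.r_constraint_name
       where c.constraint_type = 'R';

      -- Column comments, where anyone bothered to write them
      select table_name, column_name, comments
        from user_col_comments
       where comments is not null;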

    Read the article

  • Fill in missing values in a SELECT statement

    - by benjamin button
    If I have a table with two fields, customer id and order id, and let's say I have order IDs 1, 2, 3 and 4 in total, every customer can have all four orders, like below: 1234 1 1234 2 1234 3 1234 4 3245 3 3245 4 5436 2 5436 4 You can see above that customer 3245 doesn't have order IDs 1 and 2. How could I print the missing combinations in the query output, like: 3245 1 3245 2 5436 1 5436 3 EDIT: I don't have an orders table, but I do have the list of order IDs, so we can hard-code it in the query (1,2,3,4).
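
    Since the order IDs can be hard-coded, one approach is to generate the list 1-4, cross it with the distinct customers, and keep only the combinations that are missing from the table. A sketch assuming Oracle, with orders_t as a stand-in name for the two-column table:
      with all_orders as (select level as order_id
                            from dual
                         connect by level <= 4)
      select c.customer_id, a.order_id
        from (select distinct customer_id from orders_t) c
       cross join all_orders a
       where not exists (select 1
                           from orders_t o
                          where o.customer_id = c.customer_id
                            and o.order_id    = a.order_id)
       order by c.customer_id, a.order_id;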

    Read the article

  • How should I migrate DDL changes from one environment to the next?

    - by Rl
    I make DDL changes using SQL Developer's GUI. Problem is, I need to apply those same changes to the test environment. I'm wondering how others handle this issue. Currently I'm having to manually write ALTER statements to bring the test environment into alignment with the development environment, but this is prone to error (doing the same thing twice). In cases where there's no important data in the test environment I usually just blow everything away, export the DDL scripts from dev and run them from scratch in test. I know there are triggers that can store each DDL change, but this is a heavily shared environment and I would like to avoid that if possible. Maybe I should just write the DDL stuff manually rather than using the GUI?
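
    Rather than hand-writing the ALTERs, one option (assuming Oracle) is to script the dev definitions with DBMS_METADATA, keep those scripts under version control, and replay or diff them against test; a minimal sketch:
      -- Generates a CREATE TABLE statement for every table in the current schema
      -- (the same call works for 'INDEX', 'SEQUENCE', 'VIEW', and so on)
      select dbms_metadata.get_ddl('TABLE', table_name) as ddl
        from user_tables;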

    Read the article

  • drop-down combo box-like functionality for queries.

    - by Frank Developer
    Is it possible to provide the following type of functionality with Informix client tools? As the user types the first two characters of a name, the drop-down list is empty. At the third character, the list fills with just the names beginning with those three characters. At the fourth character, MS-Access completes the first matching name (assuming the combo's AutoExpand is on). Once enough characters are typed to identify the customer, the user tabs to the next field. The time taken to load the combo between keystrokes is minimal. This occurs once only for each entry, unless the user backspaces through the first three characters again. If your list still contains too many records, you can reduce them by another order of magnitude by changing the value of constant conSuburbMin from 3 to 4.

    Read the article

  • Determining a database's OS with a SQL query?

    - by KaizenSoze
    I'm writing a tool to gather customer configuration information. One of the questions I want to answer is: what OS is the customer's database running on? I haven't found a generic way to find the OS with SQL, and I can't create stored procedures on the customer's database. If there is a way, it's probably vendor specific. Suggestions? Thanks in advance.
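
    It is indeed vendor specific, so the tool will need one query per supported DBMS. For the Oracle case, for example, two plain queries return the platform (SQL Server has SELECT @@VERSION, which includes the OS in its banner):
      -- Oracle 10g and later (requires access to V$DATABASE)
      select platform_name from v$database;
      -- Older Oracle releases
      select dbms_utility.port_string from dual;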

    Read the article

  • How can I store a large amount of data from a database to XML (speed problem, part three)?

    - by Andrija
    After getting some responses, the current situation is that I'm using this tip: http://www.ibm.com/developerworks/xml/library/x-tipbigdoc5.html (Listing 1. Turning ResultSets into XML), and XMLWriter for Java from http://www.megginson.com/downloads/ . Basically, it reads date from the database and writes them to a file as characters, using column names to create opening and closing tags. While doing so, I need to make two changes to the input stream, namely to the dates and numbers. // Iterate over the set while (rs.next()) { w.startElement("row"); for (int i = 0; i < count; i++) { Object ob = rs.getObject(i + 1); if (rs.wasNull()) { ob = null; } String colName = meta.getColumnLabel(i + 1); if (ob != null ) { if (ob instanceof Timestamp) { w.dataElement(colName, Util.formatDate((Timestamp)ob, dateFormat)); } else if (ob instanceof BigDecimal){ w.dataElement(colName, Util.transformToHTML(new Integer(((BigDecimal)ob).intValue()))); } else { w.dataElement(colName, ob.toString()); } } else { w.emptyElement(colName); } } w.endElement("row"); } The SQL that gets the results has the to_number command (e.g. to_number(sif.ID) ID ) and the to_date command (e.g. TO_DATE (sif.datum_do, 'DD.MM.RRRR') datum_do). The problems are that the returning date is a timestamp, meaning I don't get 14.02.2010 but rather 14.02.2010 00:00:000 so I have to format it to the dd.mm.yyyy format. The second problem are the numbers; for some reason, they are in database as varchar2 and can have leading zeroes that need to be stripped; I'm guessing I could do that in my SQL with the trim function so the Util.transformToHTML is unnecessary (for clarification, here's the method): public static String transformToHTML(Integer number) { String result = ""; try { result = number.toString(); } catch (Exception e) {} return result; } What I'd like to know is a) Can I get the date in the format I want and skip additional processing thus shortening the processing time? b) Is there a better way to do this? We're talking about XML files that are in the 50 MB - 250 MB filesize category.
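
    For (a): yes - if the SQL already hands back strings in the target format, both the Timestamp and the BigDecimal branches disappear and ob.toString() is all the loop needs. A sketch of the relevant select-list expressions, keeping the question's sif alias (the rest of the original FROM/WHERE clauses stay as they are):
      select to_char(to_date(sif.datum_do, 'DD.MM.RRRR'), 'DD.MM.YYYY') as datum_do,
             to_char(to_number(sif.id)) as id    -- TO_NUMBER drops the leading zeroes
        from some_table sif                      -- placeholder; use the original FROM clause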

    Read the article

  • how to optimize this query?

    - by jest
    Hi! The query is: select employee_id, last_name, salary, round((salary+(salary*0.15)), 0) as NewSalary, (round((salary+(salary*0.15)), 0) - salary) as "IncreaseAmount" from employees; Can I optimize the round((salary+(salary*0.15)), 0) part in any way, so that it doesn't appear twice? I tried giving it an alias, but that didn't work :(
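
    A column alias can't be referenced elsewhere in the same SELECT list, which is why the alias attempt failed. Wrapping the calculation in an inline view lets it be written once; a sketch:
      select employee_id,
             last_name,
             salary,
             NewSalary,
             NewSalary - salary as IncreaseAmount
        from (select employee_id,
                     last_name,
                     salary,
                     round((salary + (salary * 0.15)), 0) as NewSalary
                from employees);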

    Read the article

  • How do I insert data into an object-relational table with multiple REFs in the schema?

    - by Yiling
    I have a table with a schema of Table(number, ref, ref, varchar2, varchar2,...). How would I insert a row of data into this table? When I do: "insert into table values (1, select ref(p), ref(d), '239 F.3d 1343', '35 USC § 283', ... from plaintiff p, defendant d where p.name='name1' and d.name='name2');" I get a "missing expression" error. If I do: "insert into table 1, select ref(p), ref(d), ... from plaintiff p, defendant where p.name=...;" I get a "missing keyword VALUES" error.
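
    When the row comes from a query, the statement takes the INSERT ... SELECT form and the VALUES keyword is dropped entirely; literal values simply ride along in the SELECT list. A sketch, with case_table standing in for the real target table name:
      insert into case_table
      select 1,
             ref(p),
             ref(d),
             '239 F.3d 1343',
             '35 USC § 283'
        from plaintiff p, defendant d
       where p.name = 'name1'
         and d.name = 'name2';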

    Read the article

  • Error in connecting Eclipse to SQL Server

    - by user3721900
    This is the syntax error Jun 10, 2014 5:15:51 PM org.apache.catalina.core.AprLifecycleListener init INFO: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: C:\Program Files (x86)\Java\jre7\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:/Program Files (x86)/Java/jre7/bin/client;C:/Program Files (x86)/Java/jre7/bin;C:/Program Files (x86)/Java/jre7/lib/i386;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files (x86)\Intel\iCLS Client\;C:\Program Files\Intel\iCLS Client\;C:\Program Files (x86)\AMD APP\bin\x86_64;C:\Program Files (x86)\AMD APP\bin\x86;C:\Program Files\Common Files\Microsoft Shared\Windows Live;C:\Program Files (x86)\Common Files\Microsoft Shared\Windows Live;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files (x86)\Windows Live\Shared;C:\Program Files (x86)\ATI Technologies\ATI.ACE\Core-Static;C:\Program Files\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files\Microsoft\Web Platform Installer\;C:\Program Files (x86)\Microsoft ASP.NET\ASP.NET Web Pages\v1.0\;C:\Program Files (x86)\Windows Kits\8.0\Windows Performance Toolkit\;C:\Program Files\Microsoft SQL Server\110\Tools\Binn\;C:\Program Files (x86)\Microchip\MPLAB C32 Suite\bin;C:\Program Files\Java\jdk1.7.0_25\bin;C:\Program Files (x86)\Java\jdk1.7.0_03\bin;c:\Program Files (x86)\Microsoft SQL Server\100\Tools\Binn\VSShell\Common7\IDE\;c:\Program Files (x86)\Microsoft SQL Server\100\Tools\Binn\;c:\Program Files\Microsoft SQL Server\100\Tools\Binn\;c:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\;c:\Program Files\Microsoft SQL Server\100\DTS\Binn\;C:\Program Files (x86)\Google\google_appengine\;C:\Users\Patrick\Desktop\2013-2014 2nd Sem Files\Eclipsee\eclipse;;. Jun 10, 2014 5:15:51 PM org.apache.tomcat.util.digester.SetPropertiesRule begin WARNING: [SetPropertiesRule]{Server/Service/Engine/Host/Context} Setting property 'source' to 'org.eclipse.jst.jee.server:B2B' did not find a matching property. Jun 10, 2014 5:15:51 PM org.apache.coyote.AbstractProtocol init INFO: Initializing ProtocolHandler ["http-bio-8080"] Jun 10, 2014 5:15:51 PM org.apache.coyote.AbstractProtocol init INFO: Initializing ProtocolHandler ["ajp-bio-8009"] Jun 10, 2014 5:15:51 PM org.apache.catalina.startup.Catalina load INFO: Initialization processed in 544 ms Jun 10, 2014 5:15:51 PM org.apache.catalina.core.StandardService startInternal INFO: Starting service Catalina Jun 10, 2014 5:15:51 PM org.apache.catalina.core.StandardEngine startInternal INFO: Starting Servlet Engine: Apache Tomcat/7.0.42 Jun 10, 2014 5:15:52 PM org.apache.coyote.AbstractProtocol start INFO: Starting ProtocolHandler ["http-bio-8080"] Jun 10, 2014 5:15:52 PM org.apache.coyote.AbstractProtocol start INFO: Starting ProtocolHandler ["ajp-bio-8009"] Jun 10, 2014 5:15:52 PM org.apache.catalina.startup.Catalina start INFO: Server startup in 374 ms com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near '`'. 
This is my code package b2b.fishermall; public class ConnectionString extends SqlStringCommands { public String getDriver(){ return "com.microsoft.sqlserver.jdbc.SQLServerDriver"; } public String getURL() { return "jdbc:sqlserver://localhost:1433;databaseName=B2B;integratedSecurity=true;"; } public String getUsername() { return ""; } public String getDbPassword() { return ""; } }

    Read the article

  • Proxy Service: Is Response Required?

    - by hakish
    Hi, I want to know what exactly the isResponseRequired (MQ Transport) option does in a proxy service and in a business service. My understanding was that a proxy can read from a queue and a business service can write to a queue. Is it possible for a proxy to write the response to a queue?

    Read the article

  • SQL query showing missing expression

    - by Ashok Dasari
    Query to pull the data between yesterday 6 AM and today 6 AM ... SELECT lot_id, log_time, batch_no, eqp_id, STATION_ID, EXTRACTVALUE (META_DATA, '/lot_info/A3') AS A3, EXTRACTVALUE (META_DATA, '/lot_info/A3Info') AS A3Info, EXTRACTVALUE ( META_DATA, '/lot_info/apc_status_info' ) AS apc_status_info FROM t_dlis_log_history WHERE ( (EQP_ID = 'ALC4360') OR (EQP_ID = 'ALC4361') OR (EQP_ID = 'ALC1360') OR (EQP_ID = 'ALC1361') OR (EQP_ID = 'ALC1362') OR (EQP_ID = 'ALC1363') OR (EQP_ID = 'ALC1364') OR (EQP_ID = 'ALC1365') OR (EQP_ID = 'ALC355') OR (EQP_ID = 'ALC353') OR (EQP_ID = 'ALC4350') OR (EQP_ID = 'ALC354') ) AND (( log_time >= DATEADD ( HOUR, 6, CONVERT(VARCHAR (10), GETDATE (), 110) ) AND ( log_time <= DATEADD ( HOUR, 6, CONVERT(VARCHAR (10), GETDATE () + 1, 110) ) ) ) It is showing the error "missing expression" ...
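
    DATEADD, CONVERT(VARCHAR(10), ...) and GETDATE() are SQL Server constructs; if this runs against Oracle (which the EXTRACTVALUE calls suggest), they won't parse, which is a likely source of the "missing expression" error. A sketch of the same yesterday-06:00-to-today-06:00 window in Oracle syntax, with the OR chain collapsed into an IN list:
      -- EXTRACTVALUE columns omitted for brevity
      select lot_id, log_time, batch_no, eqp_id, station_id
        from t_dlis_log_history
       where eqp_id in ('ALC4360', 'ALC4361', 'ALC1360', 'ALC1361', 'ALC1362', 'ALC1363',
                        'ALC1364', 'ALC1365', 'ALC355', 'ALC353', 'ALC4350', 'ALC354')
         and log_time >= trunc(sysdate) - 1 + 6/24   -- yesterday 06:00
         and log_time <  trunc(sysdate) + 6/24;      -- today 06:00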

    Read the article

  • Indexing XMLType columns

    - by Chris
    Hello, I am working with a XMLType and currently experiencing significant performance issues and would like to incorporate indexing to the column type. Currently I am taking the approach of using the XMLTable() and XQuery functions to create a virtual table. I would like to use this Virtual Table to create a function based index on the table containing the XMLType, but I am receiving this error: Error report: SQL Error: ORA-00907: missing right parenthesis 00907. 00000 - "missing right parenthesis" *Cause: *Action: This is the index.. any assistance would be greatly appreciated. CREATE INDEX indx_medicinalproduct ON d.ProductName XMLTable('for $i at $a in /safetyreport/patient//drug for $j in $i/medicinalproduct return element r { $i/medicinalproduct }' PASSING s.safetyreport COLUMNS ProductName varchar2(70) PATH 'medicinalproduct') d;
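
    Setting the syntax error aside, a function-based index only works for single-valued paths, and /safetyreport/patient/drug/medicinalproduct repeats within a document, so an XMLIndex is usually the better fit. A sketch assuming Oracle 11g or later; the table name is a placeholder and the exact PARAMETERS clause varies by release (see the XML DB Developer's Guide):
      create index indx_medicinalproduct
          on safety_reports (safetyreport)    -- placeholder table name, XMLType column from the question
          indextype is xdb.xmlindex
          parameters ('PATHS (INCLUDE (/safetyreport/patient/drug/medicinalproduct))');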

    Read the article

  • How to get notified of changes on a read only table? (I.e., Price drop notifications.)

    - by mirthlab
    Let's say I have these tables/models: Product - id - last_updated_date - name - price User - id - name Wishlist - id - user_id - product_id The Product table has a few million records and is being updated automatically each night via a data import (inserting into a new table, dropping the old one). I basically have read-only access to that table/model. If a product is on a user's wishlist and the price drops, I'd like to be able to notify that user. What methods can be used to do this? I have a couple of ideas: Keep track of the Product.last_updated_date in the wishlist model and periodically poll the product table to see if it has been updated. This sounds like a horrible/non-scaleable solution. Some sort of Postgres View or Function that triggers when the Product table is updated? I'm new to postgres so I'm not yet sure if this is even possible. Something amazing that you will suggest that I haven't thought of :) Any help in the right direction is greatly appreciated!
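
    Since the Product table is rebuilt by the nightly import (so triggers defined on it would not survive the drop anyway), a third option is to keep a small price snapshot of just the wishlisted products and compare it once per import, right after the load finishes. A sketch in plain SQL, with product_price_snapshot as a hypothetical snapshot table that is refreshed at the end of each run:
      -- Run after each import, before refreshing the snapshot
      select w.user_id,
             w.product_id,
             s.price as old_price,
             p.price as new_price
        from wishlist w
        join product p                on p.id = w.product_id
        join product_price_snapshot s on s.product_id = w.product_id
       where p.price < s.price;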

    Read the article

  • Performing Inner Join for Multiple Columns in the Same Table

    - by frankiefrank
    I have a scenario which I'm a bit stuck on. Let's say I have a survey about colors, and I have one table for the color data, and another for people's answers. tbColors color_code , color_name 1 , 'blue' 2 , 'green' 3 , 'yellow' 4 , 'red' tbAnswers answer_id , favorite_color , least_favorite_color , color_im_allergic_to 1 , 1 , 2 3 2 , 3 , 1 4 3 , 1 , 1 2 4 , 2 , 3 4 For display I want to write a SELECT that presents the answers table but using the color_name column from tbColors. I understand the "most stupid" way to do it naming tbColors three times in the FROM section, using a different alias for each column to replace. How would a non-stupid way look?
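
    For what it's worth, the "name it three times" version is the standard relational shape for this - one join per color-valued column, each with its own alias - so it is less stupid than it looks. A sketch against the tables above:
      select a.answer_id,
             fav.color_name as favorite_color,
             lst.color_name as least_favorite_color,
             alg.color_name as color_im_allergic_to
        from tbAnswers a
        join tbColors fav on fav.color_code = a.favorite_color
        join tbColors lst on lst.color_code = a.least_favorite_color
        join tbColors alg on alg.color_code = a.color_im_allergic_to;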

    Read the article
