Search Results

Search found 409 results on 17 pages for 'delimiter'.


  • PHP regular expression for positive number with 0 or 2 decimal places

    - by Peter
    Hi, I am trying to use the following regular expression to check whether a string is a positive number with either zero or two decimal places:

        ^\d+(\.(\d{2}))?$

    When I try to match this using preg_match, I get the error:

        Warning: preg_match(): No ending delimiter '^' found in /Library/WebServer/Documents/lib/forms.php on line 862

    What am I doing wrong?
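
    A likely cause, hedged since it is not taken from the original thread: preg_match expects the pattern to be wrapped in a pair of delimiter characters (for example /^\d+(\.(\d{2}))?$/), so PHP reads the leading ^ as the intended delimiter and then complains that it never closes. The pattern itself looks fine; purely as an illustration of its behaviour, here is an equivalent check sketched in Python (the helper name is made up):

        import re

        # Matches "12" or "12.34" but not "12.3", "-5", or "" (hypothetical helper).
        AMOUNT_RE = re.compile(r'^\d+(\.(\d{2}))?$')

        def is_positive_amount(s):
            return AMOUNT_RE.match(s) is not None

        assert is_positive_amount("42")
        assert is_positive_amount("42.99")
        assert not is_positive_amount("42.9")
        assert not is_positive_amount("-1")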

    Read the article

  • android error NoSuchElementException

    - by Alexander
    I have returned a cursor string, but it contains a delimiter. The delimiter is <ENTER>. I have the string quest.setText(String.valueOf(c.getString(1))); and I want to turn the <ENTER> into a new line. What is the best method to achieve this in Android? I understand there is a way to get the delimiter, and I want this done for each record. I can iterate through records like so: Cursor c = db.getContact(2); I tried using a StringTokenizer but it doesn't seem to work. Here is the code for the tokenizer. I tested it in plain Java and it works without errors.

        String question = c.getString(1);
        // quest.setText(String.valueOf(c.getString(1)));
        // quest.setText(String.valueOf(question));
        StringTokenizer st = new StringTokenizer(question, "<ENTER>");
        // DisplayContact(c);
        // StringTokenizer st = new StringTokenizer(question, "=<ENTER>");
        while (st.hasMoreTokens()) {
            String key = st.nextToken();
            String val = st.nextToken();
            System.out.println(key + "\n" + val);
        }

    I then tried running it in Android. Here is the error log:

        06-06 22:31:55.251: E/AndroidRuntime(537): FATAL EXCEPTION: main
        06-06 22:31:55.251: E/AndroidRuntime(537): java.util.NoSuchElementException
        06-06 22:31:55.251: E/AndroidRuntime(537): at java.util.StringTokenizer.nextToken(StringTokenizer.java:208)
        06-06 22:31:55.251: E/AndroidRuntime(537): at alex.android.test.database.quiz.TestdatabasequizActivity$1.onClick(TestdatabasequizActivity.java:95)
        06-06 22:31:55.251: E/AndroidRuntime(537): at android.view.View.performClick(View.java:3511)
        06-06 22:31:55.251: E/AndroidRuntime(537): at android.view.View$PerformClick.run(View.java:14105)
        06-06 22:31:55.251: E/AndroidRuntime(537): at android.os.Handler.handleCallback(Handler.java:605)
        06-06 22:31:55.251: E/AndroidRuntime(537): at android.os.Handler.dispatchMessage(Handler.java:92)
        06-06 22:31:55.251: E/AndroidRuntime(537): at android.os.Looper.loop(Looper.java:137)
        06-06 22:31:55.251: E/AndroidRuntime(537): at android.app.ActivityThread.main(ActivityThread.java:4424)
        06-06 22:31:55.251: E/AndroidRuntime(537): at java.lang.reflect.Method.invokeNative(Native Method)
        06-06 22:31:55.251: E/AndroidRuntime(537): at java.lang.reflect.Method.invoke(Method.java:511)
        06-06 22:31:55.251: E/AndroidRuntime(537): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:784)
        06-06 22:31:55.251: E/AndroidRuntime(537): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:551)
        06-06 22:31:55.251: E/AndroidRuntime(537): at dalvik.system.NativeStart.main(Native Method)

    This is the database query:

        public Cursor getContact(long rowId) throws SQLException {
            Cursor mCursor = db.query(true, DATABASE_TABLE,
                    new String[] {KEY_ROWID, question, possibleAnsOne, possibleAnsTwo,
                                  possibleAnsThree, realQuestion, UR},
                    KEY_ROWID + "=" + rowId, null, null, null, null, null);
            if (mCursor != null) {
                mCursor.moveToFirst();
            }

    Read the article

  • Control characters as delimiters

    - by Gio Borje
    I have a Node.js TCP server and a client, and basic network communication happens. The client sends "data + STX_CHARACTER + data + ETX_CHARACTER" (just an example). How do I split the string using the STX control character as a delimiter, or how do I reference that character at all in JavaScript?
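
    A hedged aside, not taken from the original thread: control characters can be referenced by their escape codes (STX is code point 2, ETX is code point 3), which JavaScript writes as "\x02" and "\x03". Below is the same splitting idea sketched in Python, purely to illustrate the approach; the frame layout is assumed from the example above:

        STX = "\x02"  # start-of-text control character (code point 2)
        ETX = "\x03"  # end-of-text control character (code point 3)

        def split_frames(raw):
            """Split 'payload + STX + payload + ETX' style data on the STX marker."""
            return raw.rstrip(ETX).split(STX)   # drop a trailing ETX, then split on STX

        print(split_frames("hello" + STX + "world" + ETX))   # ['hello', 'world']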

    Read the article

  • Can a SQL Server aggregate UDF be passed multiple parameters?

    - by Burg
    I am trying to write an aggregate UDF, using SQL Server 2008 and C# 3.5, that implodes an aggregation of data. The kind of syntax I am looking for is:

        SELECT [dbo].[Implode]([Id], ',') FROM [dbo].[Table] GROUP BY [ForeignID]

    where the second parameter is the delimiter for the aggregate function. An example return value would be something like: 1,4,56. Is there a way to have multiple parameters in an aggregate UDF?

    Read the article

  • Writing csv header removes data from numpy array written below

    - by user338095
    I'm trying to export data to a CSV file. It should contain a header (from dataset) and restacked arrays with my data (from datastack). One line in datastack has the same length as dataset. The code below works, but it removes parts of the first line from datastack. Any ideas why that could be?

        s = ','.join(itertools.chain(dataset)) + '\n'
        newfile = 'export.csv'
        f = open(newfile,'w')
        f.write(s)
        numpy.savetxt(newfile, (numpy.transpose(datastack)), delimiter=', ')
        f.close()
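
    A likely explanation, hedged because dataset and datastack are not shown: f.write(s) is buffered, numpy.savetxt reopens export.csv by name and writes the data, and when f.close() finally flushes, the header lands on top of the start of the data, clobbering part of the first row. Writing everything through one handle avoids that. A minimal sketch with made-up sample arrays:

        import numpy

        dataset = ['col_a', 'col_b', 'col_c']        # hypothetical header fields
        datastack = numpy.arange(9).reshape(3, 3)    # hypothetical data

        with open('export.csv', 'w') as f:
            f.write(','.join(dataset) + '\n')        # header first
            # Pass the open file object so savetxt writes after the header
            # instead of reopening (and truncating) the file by name.
            numpy.savetxt(f, numpy.transpose(datastack), delimiter=', ', fmt='%g')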

    Read the article

  • Python equivalent of C++ getline()

    - by Arnab Sen Gupta
    In C++ we can read in multiple lines by giving our own choice of delimiting character to the getline() function; however, I am not able to do the same in Python. It only has raw_input() and sys.stdin.readline(), which read until I press Enter. Is there any way to customize this so that I can specify my own delimiter?
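
    One hedged workaround, not taken from the original thread: read sys.stdin one character at a time and stop at the chosen delimiter, roughly emulating C++ getline(stream, s, delim). A minimal sketch (the ';' delimiter is just an example):

        import sys

        def read_until(delimiter, stream=sys.stdin):
            """Read characters from stream until delimiter (or EOF) and return them."""
            chars = []
            while True:
                ch = stream.read(1)
                if not ch or ch == delimiter:   # EOF or delimiter reached
                    return ''.join(chars)
                chars.append(ch)

        # Reads one ';'-terminated record, even if it spans several lines.
        record = read_until(';')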

    Read the article

  • How to return the value from a MySQL stored proc?

    - by karthik
    I am using the stored procedure below to generate the INSERT statements of a specified table. It builds without any errors. Now I want to return the result set in v_string as the output of the SP.

        DELIMITER $$

        DROP PROCEDURE IF EXISTS `demo`.`InsertGenerator` $$
        CREATE DEFINER=`root`@`localhost` PROCEDURE `InsertGenerator`()
        SWL_return:
        BEGIN
        -- SQLWAYS_EVAL# to retrieve column specific information
        -- SQLWAYS_EVAL# table
        DECLARE v_string NATIONAL VARCHAR(3000);
        -- SQLWAYS_EVAL# first half
        -- SQLWAYS_EVAL# tement
        DECLARE v_stringData NATIONAL VARCHAR(3000);
        -- SQLWAYS_EVAL# data
        -- SQLWAYS_EVAL# statement
        DECLARE v_dataType NATIONAL VARCHAR(1000);
        -- SQLWAYS_EVAL#
        -- SQLWAYS_EVAL# columns
        DECLARE v_colName NATIONAL VARCHAR(50);
        DECLARE NO_DATA INT DEFAULT 0;
        DECLARE cursCol CURSOR FOR
            SELECT column_name,data_type FROM `columns`
            -- WHERE table_name = v_tableName;
            WHERE table_name = 'tbl_users';
        DECLARE CONTINUE HANDLER FOR SQLEXCEPTION
        BEGIN
            SET NO_DATA = -2;
        END;
        DECLARE CONTINUE HANDLER FOR NOT FOUND SET NO_DATA = -1;
        OPEN cursCol;
        SET v_string = CONCAT('INSERT ',v_tableName,'(');
        SET v_stringData = '';
        SET NO_DATA = 0;
        FETCH cursCol INTO v_colName,v_dataType;
        IF NO_DATA <> 0 then
            -- NOT SUPPORTED print CONCAT('Table ',@tableName, ' not found, processing skipped.')
            close cursCol;
            LEAVE SWL_return;
        end if;
        WHILE NO_DATA = 0 DO
            IF v_dataType in('varchar','char','nchar','nvarchar') then
                SET v_stringData = CONCAT(v_stringData,'SQLWAYS_EVAL# ll(',v_colName,'SQLWAYS_EVAL# ''+');
            ELSE
                if v_dataType in('text','ntext') then
                    -- SQLWAYS_EVAL#
                    -- SQLWAYS_EVAL#
                else
                    SET v_stringData = CONCAT(v_stringData,'SQLWAYS_EVAL# ll(cast(',v_colName,'SQLWAYS_EVAL# 00)),'''')+'''''',''+');
                ELSE IF v_dataType = 'money' then
                    -- SQLWAYS_EVAL# doesn't get converted
                    -- SQLWAYS_EVAL# implicitly
                    SET v_stringData = CONCAT(v_stringData,'SQLWAYS_EVAL# y,''''''+ isnull(cast(',v_colName,'SQLWAYS_EVAL# 0)),''0.0000'')+''''''),''+');
                ELSE IF v_dataType = 'datetime' then
                    SET v_stringData = CONCAT(v_stringData,'SQLWAYS_EVAL# time,''''''+ isnull(cast(',v_colName, 'SQLWAYS_EVAL# 0)),''0'')+''''''),''+');
                ELSE IF v_dataType = 'image' then
                    SET v_stringData = CONCAT(v_stringData,'SQLWAYS_EVAL# ll(cast(convert(varbinary,',v_colName, 'SQLWAYS_EVAL# 6)),''0'')+'''''',''+');
                ELSE
                    SET v_stringData = CONCAT(v_stringData,'SQLWAYS_EVAL# ll(cast(',v_colName,'SQLWAYS_EVAL# 0)),''0'')+'''''',''+');
                end if;
                end if;
                end if;
                end if;
            end if;
            SET v_string = CONCAT(v_string,v_colName,',');
            SET NO_DATA = 0;
            FETCH cursCol INTO v_colName,v_dataType;
        END WHILE;
        END $$

        DELIMITER ;

    Read the article

  • Does UrlDecode handle plus (+) correctly?

    - by harpo
    According to RFC 2396:

        The plus "+", dollar "$", and comma "," characters have been added to those in the "reserved" set, since they are treated as reserved within the query component.

    Indeed, search this site for "plus + comma , dollar $", and you get http://stackoverflow.com/search?q=plus+%2B+comma+,+dollar+$

    Plus is only encoded (by the application) when it's not being used as a delimiter. But as others have observed, .NET's UrlDecode function converts plus to space. Where is this behavior specified?

    Read the article

  • Formatting Strings in a GridView Cell

    - by Coesy
    I pass text into a GridView cell with a pipe delimiter, for example "4|31.99|3". What I'd like to be able to do is format this text to show as:

        -------------
        |     4     |
        |£31.99 / 3%|
        -------------

    As you can see, I need the 4 to be bold and on a line of its own, the 31.99 to be a currency and the 3 to be a percentage. Can this be done in code-behind using a converter or something?

    Read the article

  • Getting n-th line of text output

    - by syker
    I have a script that generates two lines of output each time. I'm really just interested in the second line. Moreover, I'm only interested in the text that appears between a pair of #'s on the second line. Additionally, between the hashes another delimiter is used: ^A. It would be great if I could also break apart each piece of text that is ^A-delimited. (Note that ^A is the SOH control character and can be typed with Ctrl-A.)
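
    A hedged sketch of the post-processing, not taken from the original thread, in Python; the script name is a placeholder and the layout is assumed from the description above (second line, text between the first pair of #'s, fields separated by ^A, i.e. "\x01"):

        import subprocess

        # Placeholder for the real command that produces the two lines of output.
        output = subprocess.run(['./myscript.sh'], capture_output=True, text=True).stdout

        second_line = output.splitlines()[1]         # index 1 == second line
        between_hashes = second_line.split('#')[1]   # text between the first pair of '#'
        fields = between_hashes.split('\x01')        # '\x01' is SOH, typed as Ctrl-A
        print(fields)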

    Read the article

  • Handling extra newlines in csv files parsed with Python?

    - by rmihalyi
    I have a CSV file that contains extra newlines in some fields, e.g.:

        A, B, C, D, E, F
        123, 456, tree
         , very, bla, indigo

    I tried the following:

        import csv
        catalog = csv.reader(open('test.csv', 'rU'), delimiter=",", dialect=csv.excel_tab)
        for row in catalog:
            print "Length: ", len(row), row

    and the result I got was this:

        Length: 6 ['A', ' B', ' C', ' D', ' E', ' F']
        Length: 3 ['123', ' 456', ' tree']
        Length: 4 [' ', ' very', ' bla', ' indigo']

    Does anyone have any idea how I can quickly remove extraneous newlines? Thanks!
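
    One hedged approach, not from the original thread: since the fields are unquoted, the csv module sees the broken rows as separate short rows, so you can glue physical rows back together until each logical record has the expected number of columns (6 here, taken from the header). A minimal sketch:

        import csv

        def merged_rows(path, expected=6):
            """Yield logical rows, re-joining rows that were split by stray newlines."""
            with open(path, newline='') as f:
                pending = None
                for row in csv.reader(f):
                    if pending is None:
                        pending = row
                    else:
                        # The first cell of a continuation line belongs to the
                        # field that the stray newline split in two.
                        pending[-1] += ' ' + row[0]
                        pending.extend(row[1:])
                    if len(pending) >= expected:
                        yield [cell.strip() for cell in pending]
                        pending = None

        for row in merged_rows('test.csv'):
            print(row)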

    Read the article

  • As3 split strangeness

    - by coulix
    Hello coders, super simple example:

        var Path:String="E:\SWF Security\Acess Current Path\Access SWF URL.swf"
        var Path1:Array = Path.split("\\") // Split using the backslash as delimiter (no limit on the number of returned tokens)
        trace(Path1)

    What do you expect Path1 to be? E:? No, it's E:SWF SecurityAcess Current PathAccess SWF URL.swf and I have no idea why.

    Read the article

  • HTML tags in Objective-C programming

    - by user1661826
    Is there any function in Objective-C to print a list of elements as an unordered HTML list? For example, [@"a--b-c---d" htmlList:@"--"] should return @"<ul><li>a</li><li>b-c</li><li>-d</li></ul>"; the "--" is the delimiter that separates the elements of the string. I have used an NSArray to store the input string's parts and componentsSeparatedByString:@"--" to get the elements. How do I go ahead with adding the HTML tags <li> and </li> around the elements and <ul> and </ul> around the entire list? Please help, I'm a beginner in Objective-C.
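
    The question asks for Objective-C, where componentsSeparatedByString: and an NSMutableString would be natural tools; purely to illustrate the split-and-wrap logic, here is a hedged sketch in Python with a made-up helper name:

        def html_list(s, delimiter="--"):
            """Split s on delimiter and wrap the pieces as an unordered HTML list."""
            items = s.split(delimiter)
            return "<ul>" + "".join("<li>%s</li>" % item for item in items) + "</ul>"

        print(html_list("a--b-c---d"))
        # <ul><li>a</li><li>b-c</li><li>-d</li></ul>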

    Read the article

  • MYSQL trigger to select from and update the same table gives error #1241 - Operand should contain 1 column(s)

    - by Bipin
    When I try to select from and update the same table, MySQL gives error #1241 - Operand should contain 1 column(s). The trigger is:

        DELIMITER $$
        CREATE TRIGGER visitor_validation
        BEFORE INSERT ON ratingsvisitors
        FOR EACH ROW
        BEGIN
            SET @ifexists = (SELECT * FROM ratingcounttracks WHERE userid=New.vistorid AND likedate=New.likevalidation AND countfor=New.likeordislike);
            IF (@ifexists = NULL) THEN
                INSERT INTO ratingcounttracks(userid, likedate, clickcount, countfor) values (New.vistorid, New.likevalidation, '1', New.likeordislike);
            ELSE
                UPDATE ratingcounttracks SET clickcount=clickcount+1 WHERE userid=New.vistorid AND likedate=New.likevalidation AND countfor=New.likeordislike;
            END IF;
        END$$

    Read the article

  • create procedure fails!?

    - by Mark
    Hi, when trying to create a simple procedure in MySQL 5.1.47-community it fails every time. I've tried everything, even simple things like this:

        DELIMITER //
        CREATE PROCEDURE two ()
        begin
            SELECT 1+1;
        end;
        //

    Read the article

  • .NET dynamic listbox

    - by Mike
    What I wanted to do was create a listbox from a delimited text file. The listbox would be populated with X number of rows based on the rows of the text file, and it would have 3 columns, each populated from a specific delimited field. Is this possible in C#? Any starting point would be great!

    Read the article

  • SOA Suite Integration: Part 3: Loading files

    - by Anthony Shorten
    One of the most common scenarios in SOA integration is the loading of a file into the product from an external source. In Oracle SOA Suite there is a File Adapter that can process many file types into your BPEL process. For this example I will use the File Adapter to load a file of users and emails to update the user object within the Oracle Utilities Application Framework. Remember you can repeat this process with other objects and other file types. Again I am illustrating the ease of integration.

    The first thing is to create an empty BPEL process that will hold our flow. In Oracle JDeveloper this can be achieved by specifying the Define Service Later template (as other templates have predefined inputs and outputs, and in this case we want to specify those). So I will create a simpleFileLoad process to house our process.

    You will start with an empty canvas, so you need to first specify the load part of the process using the File Adapter. Select the File Adapter from the Component Palette under BPEL Services and drag and drop it to the left side Partner Links (left is input). You name the service; in this case I chose LoadFile. Press Next.

    We will define the interface as part of the wizard, so select Define from operation and schema (specified later). Press Next.

    We are going to choose Read File to denote that we will read the file, and specify the default Operation Name as Read. Press Next.

    The next step is to tell the adapter the location of the files, how to process them and what to do with them after they have been processed. I am using hardcoded locations in this example but you can have logical locations as well. Press Next.

    I am now going to tell the adapter how to recognize the files I want to load. In my case I am using CSV files and, more importantly, I am telling the adapter to run the process for each record in the file it encounters. Press Next.

    Now I tell the adapter how often I want to poll for the files. I have taken the defaults. Press Next.

    At this stage I have no explanation of the format of the input, so I am going to invoke the Native Format Wizard which will guide me through the process of creating the file input format. Clicking the purple cog icon will start the wizard.

    After an introduction screen (not shown), you specify the format of the input file. The File Adapter supports multiple format types. For this example, I will use Delimited as I am going to load a CSV file. Press Next.

    The best way for the wizard to work is with a sample. I have a sample file and the wizard will ask how much of the file to use as a template. I will use the defaults. Note: if you are using a character set other than US-ASCII, it is at this point you specify the character set to use. Press Next.

    The sample contains multiple instances of a single record type. The wizard supports complex types as well. We will use the appropriate setting for our file. Press Next.

    You have to specify the file element and the record element. This will be used by the input wizard to translate the CSV data into an XML structure (this will make sense later). I am using LoadUsers as my file delimiter (root element) and User Record as my record root element. Press Next.

    As the file is CSV the delimiter is ",", so I will also specify that the End Of Line (EOL) indicator indicates the end of a record. Press Next.

    Up until this point you have not given the columns their names. In my case my sample includes the column names in the first record. This is not always the case, but you can specify the names and formats of columns in this dialog (not shown). Press Next.

    The wizard now generates the schema for the input file. You can specify a name for the schema; I have used userupdate.xsd. We want to verify the schema, so press Test. You can test the schema by specifying an input sample and pressing the green play button. You will see the delimiters you specified earlier for the file and the records. Press Ok to continue.

    A confirmation screen will be displayed showing you the location of the schema in your project. Press Finish to return to the File Adapter configuration. You will now see the schema and elements prepopulated from the wizard. Press Next.

    The File Adapter configuration is now complete. Press Finish.

    Now you need to receive the input from the LoadFile component, so we place a Receive node in the BPEL process by dragging and dropping the Receive component from the Component Palette under BPEL Constructs onto the BPEL process. We link the Receive node with the LoadFile component by dragging the left-most connect node of the Receive node to the LoadFile component. Once the link is established you need to name the Receive node appropriately and, as in the previous post of this series, you need to generate input variables for the BPEL process to hold the input records in.

    You now need to add the product Web Service. The process is the same as described in the previous post of this series. You drop the Web Service BPEL Service onto the right side of the process and fill in the details of the WSDL URL. You also have to add an Invoke node to call the service and generate the input and output variables for the call in the Invoke node.

    Now, to get the inputs from the file to the service, you have to use a Transform (you can use an Assign action, but a Transform action is more flexible). You drag and drop the Transform component from the Component Palette under Oracle Extensions and place it between the Receive and Invoke nodes. We name the Transform node Mapper File and associate the source of the mapping with the schema from the Receive node; the output will be the input variable from the Invoke node.

    We now build the transform. We first map the user and email attributes by dragging and dropping the elements from the left to the right. The reason we needed to use the transform is that we will be telling the AS-User service that we want to issue an update action. Remember, when we registered the service we actually used Read as the default; if we do not otherwise inform the service to use the Update action, it will use the Read action instead (which is not desired). To specify the update action you need to click on the transactionType node on the right and select Set Text to set the action. You need to specify the transactionType of UPD (for update). The mapping is now complete.

    The final BPEL process is ready for deployment. You then deploy the BPEL process to the server and test the service by simply dropping a file, with the same pattern/name as you specified, in the directory you specified in the File Adapter. You will see each record as a separate instance entry in the Fusion Middleware Control console.

    You can now load files into the product. You can repeat this process for each type of file to process. While this was a simple example, it illustrates how easily loading data can be achieved using SOA Suite in conjunction with our products.

    Read the article

  • notepad sql Unicode and Non Unicode

    - by RBrattas
    Hi, I have a Microsoft Notepad flat file with data and a vertical bar as the column delimiter. I get the following message: cannot convert between unicode and non-unicode string data types. It seems it is my nvarchar(max) that creates the problem. I changed to varchar(max), but still the same problem. How do I insert my flat file into SQL Server 2005? And in the SQL Server 2005 import and export wizard, on the flat file source advanced tab, the OutputColumnWidth is 50. Does that mean my flat file column is max 50? I hope not, because my column is longer than 50... Thank you, Rune

    Read the article

  • IOError: [Errno 32] Broken pipe

    - by khati
    I got "IOError: [Errno 32] Broken pipe" while writing files in linux. I am using python to read each line a of csv file and then write into a database table. My code is f = open(path,'r') command = command to connect to database p = Popen(command, shell=True, stdin=PIPE, stdout=PIPE, stderr=PIPE, env=env) query = " COPY myTable( id, name, address) FROM STDIN WITH DELIMITER ';' CSV QUOTE '"'; " p.stdin.write(query.encode('ascii')) *-->(Here exactly I got the error, p.stdin.write(query.encode('ascii')) IOError: [Errno 32] Broken pipe )* So when I run this program in linux, I got error "IOError: [Errno 32] Broken pipe" . However this works fine when I run in windows7. Do I need to do some configuration in Linux sever? Any suggestions will be appreciated. Thank you.

    Read the article

  • Windows Handling Piped Commands Error Redirection

    - by jpmartins
    Warning: I am no expert at building scripts, and sorry for the lousy English. To generate a CSV from a database query I'm using the following commands:

        ...
        CALL java.exe -classpath ... com.xigole.util.sql.Jisql -user dmfodbc -pf pwd.file -driver com.sybase.jdbc3.jdbc.SybDriver -cstring %constr% -c ; -input 42.sql -formatter csv -delimiter ; 2>>%LOGFILE% | CALL grep -v -e "SELECT right" -e "executing: " -e " rows affect" %FicheiroR% 2>>%LOGFILE%
        ...

    I'm using the Windows implementation of grep. The 2>>%LOGFILE% in both the java and grep commands is causing an error message indicating that the file is being used by another process. The ugly workaround I have come up with is to redirect grep's errors to a temporary %LOGFILE%.aux:

        java ... | grep ... 2>>%LOGFILE%.aux
        type %LOGFILE%.aux >> %LOGFILE%
        del %LOGFILE%.aux

    What is a better solution?

    Read the article

  • using sed to print next line after match

    - by smokinguns
    I came across various examples of printing the next line after a match that use awk and sed. I'm using it in my code and it works fine. Now I want to use a variable instead of a hardcoded value for the pattern match. The search pattern string includes forward slashes "/". How do I use a variable that has "/" in it to print the next line after the match? The following doesn't seem to work:

        var="/somePath/to/my/home"
        val=`echo -e "$someStr" | sed -n ':$var:{n;p;}'`

    In this case, val is always blank. I'm using ":" as the delimiter instead of "/". I'm on Mac OS X.
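
    Two hedged observations, not taken from the original thread: inside single quotes the shell never expands $var, and a sed address written with a custom delimiter needs a leading backslash (\:pattern: rather than :pattern:). If the quoting gets too fiddly, the same "print the line after a match" idea is easy to sketch in Python:

        def line_after(text, pattern):
            """Return the line that follows the first line containing pattern."""
            lines = text.splitlines()
            for i, line in enumerate(lines[:-1]):
                if pattern in line:      # plain substring match; slashes need no escaping
                    return lines[i + 1]
            return ''

        some_str = "header\n/somePath/to/my/home\nthe line I want\n"
        print(line_after(some_str, "/somePath/to/my/home"))   # -> the line I want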

    Read the article

  • MySQL Stored Procedure Parameters are NULL

    - by emmilely
    I am stumped, and was hoping someone here would have a quick and easy answer. I did a fresh install of MySQL 5.5 and am trying to pass parameters into a stored procedure. The parameter values aren't being read by the stored procedure. MySQL doesn't throw an error and processes the code with null parameters. Here is the code:

        DELIMITER $$
        CREATE DEFINER=`root`@`%` PROCEDURE `testing`(IN parameter INTEGER)
        BEGIN
            UPDATE table_name
            SET valueToChange = 'Test'
            WHERE mainID = @parameter;
        END

    And here is the query I'm using to call it:

        USE database_name;
        CALL testing(72);

    Can anyone help?

    Read the article

  • Importing from CSV and sorting by Date

    - by Andrew Rice
    I have the following script that parses an HR output file looking for employees and outputs information such as Hire Date, First Name, Last Name, Supervisor etc. The problem I have is that in the current format I think the Hire Date column is being treated as a string, so in effect it orders the output by month (i.e. 1/1/01 comes before 2/2/98). Is there a way to map that column to a date/time so it sorts properly?

        Import-CSV -delimiter "`t" Output.tab | Where-Object {$_.'First Name' -like '*And*'} | Sort-Object 'Hire Date' | ft 'Hire Date', 'First Name'
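
    The usual fix is to convert the column to a real date before sorting rather than sorting the raw strings. Purely as a hedged illustration of that idea (the file name is taken from the command above, the M/D/YY format is assumed), a sketch in Python:

        import csv
        from datetime import datetime

        with open('Output.tab', newline='') as f:
            rows = list(csv.DictReader(f, delimiter='\t'))

        # Parse 'Hire Date' (assumed M/D/YY) so 2/2/98 sorts before 1/1/01.
        rows.sort(key=lambda r: datetime.strptime(r['Hire Date'].strip(), '%m/%d/%y'))

        for r in rows:
            print(r['Hire Date'], r['First Name'])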

    Read the article
