Search Results

Search found 3935 results on 158 pages for 'garreth 00'.


  • mysql alter table

    - by user485783
    Hi, I ran the MySQL ALTER statements below against the database via phpMyAdmin one by one and they worked fine. Could anyone show me how to run them all together at once, or point me to sample PHP code that could execute them? Thanks in advance.
      ALTER TABLE user ADD title varchar(16) COLLATE utf8_bin NOT NULL DEFAULT '' AFTER user_id
      ALTER TABLE customer ADD title varchar(16) COLLATE utf8_bin NOT NULL DEFAULT '' AFTER customer_id
      ALTER TABLE customer ADD date_birtdate datetime NOT NULL DEFAULT '0000-00-00 00:00:00' AFTER lastname
      ALTER TABLE customer ADD security_question varchar(96) COLLATE utf8_bin NOT NULL DEFAULT '' AFTER fax
      ALTER TABLE customer ADD security_answer varchar(96) COLLATE utf8_bin NOT NULL DEFAULT '' AFTER fax
      ALTER TABLE customer ADD pin_number text COLLATE utf8_bin AFTER password
      ALTER TABLE customer ADD notes text COLLATE utf8_bin AFTER bank_number
      ALTER TABLE customer ADD last_active datetime NOT NULL DEFAULT '0000-00-00 00:00:00' AFTER date_added
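    A minimal sketch of running the statements from a script (Python with the mysql-connector-python package here purely for illustration, since the question also asks about PHP; the connection details are placeholders):

      import mysql.connector  # assumes the mysql-connector-python package is installed

      STATEMENTS = [
          "ALTER TABLE user ADD title varchar(16) COLLATE utf8_bin "
          "NOT NULL DEFAULT '' AFTER user_id",
          # ... the remaining ALTER statements from the question ...
      ]

      conn = mysql.connector.connect(host="localhost", user="root",
                                     password="secret", database="mydb")  # placeholders
      cur = conn.cursor()
      for stmt in STATEMENTS:
          cur.execute(stmt)  # ALTER is DDL, so MySQL commits each one implicitly
      conn.close()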

    Read the article

  • Saving Blob data from SQLite database to a file

    - by Felipe
    Hello, I'm trying to save blob data from a SQLite database (Safari cache: Cache.db) to a file, but for some reason sqlite won't read the whole blob. I eventually would like to do this in Ruby, but for now something that works directly at the sqlite command prompt is fine. Also, I've read all of the entries that discuss this here on Stack Overflow, but most of them only discuss the performance of saving images in blobs, and the one entry that does show how to save blobs to a file is in C#, which does not help me. Here is what I've tried:
      sqlite> select * from cfurl_cache_response limit 1;
      3501|0|945281827|0|http://www.gospelz.com/components/com_jomcomment/smilies/guest.gif|2010-02-24 16:20:07
      sqlite> select receiver_data from cfurl_cache_blob_data where entry_ID = 3501;
      GIF89a(
    A hexdump of the original (guest.gif) file shows that sqlite stops reading the blob after the first null value:
      $ hexdump -C guest.gif
      00000000  47 49 46 38 39 61 28 00  28 00 f7 00 00 f1 f5 fd  |GIF89a(.(.......|
      sqlite> .output test.gif
      sqlite> select receiver_data from cfurl_cache_blob_data where entry_ID = 3501;
      $ hexdump -C test.gif
      00000000  47 49 46 38 39 61 28 0a                           |GIF89a(.|
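    A minimal sketch of the same extraction in Python (the sqlite3 module returns the full blob as a bytes object, so the embedded NUL byte is not a problem; the database path is a placeholder):

      import sqlite3

      conn = sqlite3.connect("Cache.db")  # placeholder path to Safari's cache DB
      row = conn.execute(
          "SELECT receiver_data FROM cfurl_cache_blob_data WHERE entry_ID = ?",
          (3501,),
      ).fetchone()
      with open("test.gif", "wb") as f:
          f.write(row[0])   # row[0] is bytes containing the whole blob
      conn.close()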

    Read the article

  • OpenVPN Server Ethernet Bridging Question

    - by Hooplad
    Hello All, I am having a difficult time properly configuring an Ethernet bridge using an OpenVPN 2.0.9 install on CentOS 5 (the VPN server). The goal I am trying to accomplish is to connect a VM (an instance running on the same CentOS machine) acting as a Microsoft Business Contact Manager server. I would then like this "BCM server" to serve Windows XP clients on the 192.168.1.0/24 network as well as clients connecting from the VPN (10.8.0.0/24). The setup as it is now was based on a known working configuration. The problem with the working configuration was that it allowed the client to connect and access everything running on the VPN server (SVN, Samba, VM Server) but not any computers on the 192.168.1.0/24 network. I must disclose that the VPN server is behind a router/firewall. Ports are being forwarded correctly (again, clients were able to connect to the VPN server with no problem; netcat confirms the UDP port is open as well).
    Current ifconfig output:
      br0     Link encap:Ethernet  HWaddr 00:21:5E:4D:3A:C2
              inet addr:192.168.1.169  Bcast:192.168.1.255  Mask:255.255.255.0
              inet6 addr: fe80::221:5eff:fe4d:3ac2/64 Scope:Link
              UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
              RX packets:846890 errors:0 dropped:0 overruns:0 frame:0
              TX packets:3072351 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:0
              RX bytes:42686842 (40.7 MiB)  TX bytes:4540654180 (4.2 GiB)
      eth0    Link encap:Ethernet  HWaddr 00:21:5E:4D:3A:C2
              UP BROADCAST RUNNING SLAVE MULTICAST  MTU:1500  Metric:1
              RX packets:882641 errors:0 dropped:0 overruns:0 frame:0
              TX packets:1781383 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:1000
              RX bytes:82342803 (78.5 MiB)  TX bytes:2614727660 (2.4 GiB)
              Interrupt:169
      eth1    Link encap:Ethernet  HWaddr 00:21:5E:4D:3A:C3
              UP BROADCAST RUNNING SLAVE MULTICAST  MTU:1500  Metric:1
              RX packets:650 errors:0 dropped:0 overruns:0 frame:0
              TX packets:1347223 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:1000
              RX bytes:67403 (65.8 KiB)  TX bytes:1959529142 (1.8 GiB)
              Interrupt:233
      lo      Link encap:Local Loopback
              inet addr:127.0.0.1  Mask:255.0.0.0
              inet6 addr: ::1/128 Scope:Host
              UP LOOPBACK RUNNING  MTU:16436  Metric:1
              RX packets:17452058 errors:0 dropped:0 overruns:0 frame:0
              TX packets:17452058 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:0
              RX bytes:94020256229 (87.5 GiB)  TX bytes:94020256229 (87.5 GiB)
      tap0    Link encap:Ethernet  HWaddr DE:18:C6:D7:01:63
              inet6 addr: fe80::dc18:c6ff:fed7:163/64 Scope:Link
              UP BROADCAST RUNNING PROMISC MULTICAST  MTU:1500  Metric:1
              RX packets:0 errors:0 dropped:0 overruns:0 frame:0
              TX packets:3086 errors:0 dropped:166 overruns:0 carrier:0
              collisions:0 txqueuelen:100
              RX bytes:0 (0.0 b)  TX bytes:315099 (307.7 KiB)
      vmnet1  Link encap:Ethernet  HWaddr 00:50:56:C0:00:01
              inet addr:192.168.177.1  Bcast:192.168.177.255  Mask:255.255.255.0
              inet6 addr: fe80::250:56ff:fec0:1/64 Scope:Link
              UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
              RX packets:0 errors:0 dropped:0 overruns:0 frame:0
              TX packets:4224 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:1000
              RX bytes:0 (0.0 b)  TX bytes:0 (0.0 b)
      vmnet8  Link encap:Ethernet  HWaddr 00:50:56:C0:00:08
              inet addr:192.168.55.1  Bcast:192.168.55.255  Mask:255.255.255.0
              inet6 addr: fe80::250:56ff:fec0:8/64 Scope:Link
              UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
              RX packets:0 errors:0 dropped:0 overruns:0 frame:0
              TX packets:4226 errors:0 dropped:0 overruns:0 carrier:0
              collisions:0 txqueuelen:1000
              RX bytes:0 (0.0 b)  TX bytes:0 (0.0 b)
    Current route table:
      Kernel IP routing table
      Destination     Gateway         Genmask         Flags Metric Ref    Use Iface
      192.168.55.0    *               255.255.255.0   U     0      0        0 vmnet8
      192.168.177.0   *               255.255.255.0   U     0      0        0 vmnet1
      192.168.1.0     *               255.255.255.0   U     0      0        0 br0
    Current iptables output:
      Chain INPUT (policy ACCEPT)
      target     prot opt source               destination
      ACCEPT     all  --  anywhere             anywhere
      ACCEPT     all  --  anywhere             anywhere
      Chain FORWARD (policy ACCEPT)
      target     prot opt source               destination
      ACCEPT     all  --  anywhere             anywhere
      Chain OUTPUT (policy ACCEPT)
      target     prot opt source               destination
    server_known_working.conf:
      local banshee
      port 1194
      proto udp
      dev tap0
      ca ca.crt
      cert banshee_server.crt
      key banshee_server.key
      dh dh1024.pem
      server 10.8.0.0 255.255.255.0
      ifconfig-pool-persist ipp.txt
      push "route 192.168.1.0 255.255.255.0"
      client-to-client
      keepalive 10 120
      tls-auth ta.key 0
      user nobody
      group nobody
      persist-key
      persist-tun
      status openvpn-status.log
      verb 4
    The following is the current CentOS server config file, server_ethernet_bridged.conf:
      local 192.168.1.169
      port 1194
      proto udp
      dev tap0
      ca ca.crt
      cert server.crt
      key server.key
      dh dh1024.pem
      ifconfig-pool-persist ipp.txt
      server-bridge 192.168.1.169 255.255.255.0 192.168.1.200 192.168.1.210
      push "route 192.168.1.0 255.255.255.0 192.168.1.1"
      client-to-client
      keepalive 10 120
      tls-auth ta.key 0
      user nobody
      group nobody
      persist-key
      persist-tun
      status openvpn-status.log
      verb 6
    The following is one of the client config files that was used with the known working configuration, client.ovpn:
      client
      dev tap
      proto udp
      remote XXX.XXX.XXX 1194
      resolv-retry infinite
      nobind
      persist-key
      persist-tun
      ca client.crt
      cert client.crt
      key client.key
      tls-auth client.key 1
      verb 3
    I have tried the HOWTO provided by OpenVPN as well as others (http://www.thebakershome.net/openvpn%5Ftutorial?page=1) with no success. Any help or suggestions would be appreciated.

    Read the article

  • format result of date calc in linq

    - by d daly
    Hi, I have a calculation in a LINQ query. Although it brings back the correct number of days, I'm not sure how to format it better:
      select new
      {
          time = (System.DateTime.Today - cs.date_case_opened),
      };
    Just now it shows e.g. 4:00:00:00 if the difference is 4 days. Any ideas how I can present this better? Thanks.
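    For illustration, the same idea sketched in Python (in C# the analogous move is to take the .Days property of the resulting TimeSpan rather than printing the whole span):

      from datetime import datetime

      opened = datetime(2012, 2, 1)            # stands in for cs.date_case_opened
      today = datetime(2012, 2, 5)
      print(f"{(today - opened).days} days")   # -> "4 days" instead of "4:00:00:00"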

    Read the article

  • Find the Algorithm that generates the checksum

    - by knivmannen
    I have a sensing device that transmits a 6-byte message along with a 1-byte counter and supposedly a checksum. The data looks something like this:
      ------DATA------   Counter   Checksum?
      55 FF 00 00 EC FF    60         1F
    The last four bits in the counter are always set to 0, i.e. those bits are probably not used. The last byte is assumed to be the checksum, since it has a quite peculiar nature: it tends to change randomly as the data changes. Now what I need is to find the algorithm that computes this checksum based on the DATA. What I have tried is all possible CRC-8 polynomials; for each polynomial I have tried reflecting the data, toggling it, initializing it with non-zero values, etc. I've come to the conclusion that I am not dealing with a normal CRC algorithm. I have also tried some Fletcher and Adler methods without success, and XORed stuff back and forth, but still I have no clue how to generate the checksum. My biggest concern is: how is the counter used? The same data with a different counter value generates a different checksum. I have tried to include the counter in my computations but without any luck. Here are some other data samples:
      55 FF 00 00 F0 FF  A0  38
      66 0B EA FF BF FF  C0  CA
      5E 18 EA FF B7 FF  60  BD
      F6 30 16 00 FC FE  10  81
    One more thing that might be worth mentioning is that the last byte in the data only takes on the values FF or FE. If you have any tips or tricks that I may try, please post them here; I am truly desperate. Thanks.
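    A brute-force sketch in Python of the search the question describes: try every CRC-8 polynomial and initial value over the six data bytes plus the counter byte, keeping any combination that reproduces the checksum for all samples (samples taken from the question; reflected variants and other counter placements would be further axes to vary):

      def crc8(data, poly, init):
          # straightforward MSB-first CRC-8, no reflection
          crc = init
          for byte in data:
              crc ^= byte
              for _ in range(8):
                  crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
          return crc

      # (data bytes, counter, checksum) triples from the question
      samples = [
          ([0x55, 0xFF, 0x00, 0x00, 0xEC, 0xFF], 0x60, 0x1F),
          ([0x55, 0xFF, 0x00, 0x00, 0xF0, 0xFF], 0xA0, 0x38),
          ([0x66, 0x0B, 0xEA, 0xFF, 0xBF, 0xFF], 0xC0, 0xCA),
          ([0x5E, 0x18, 0xEA, 0xFF, 0xB7, 0xFF], 0x60, 0xBD),
          ([0xF6, 0x30, 0x16, 0x00, 0xFC, 0xFE], 0x10, 0x81),
      ]

      for poly in range(256):
          for init in range(256):
              if all(crc8(d + [c], poly, init) == want for d, c, want in samples):
                  print(f"match: poly=0x{poly:02X} init=0x{init:02X}")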

    Read the article

  • Retrieving NSDate from NSString

    - by Olivier de Jonge
    I have an iPhone app that is receiving data with an RFC 3339 timestamp format (e.g. @"2010-01-29T11:30:00.000+01:00"), as in GData. I want to convert the data to an NSDate:
      NSDateFormatter *inputFormatter = [[[NSDateFormatter alloc] init] autorelease];
      [inputFormatter setDateFormat:@"yyyy-MM-dd'T'HH:mm:ss.SSS"];
      [currentEntry setStartTime:[inputFormatter dateFromString:[currentEntry startTimeString]]];
    But I'm missing how to convert the last part of the string @"2010-01-29T11:30:00.000+01:00": the time offset. Does anyone know what I have to add to this format string to take the time offset into account too?
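    For comparison, a sketch of parsing the same timestamp in Python (3.7+, where %z accepts an offset with a colon); in NSDateFormatter the missing piece is likewise a time-zone specifier at the end of the format string:

      from datetime import datetime

      s = "2010-01-29T11:30:00.000+01:00"
      dt = datetime.strptime(s, "%Y-%m-%dT%H:%M:%S.%f%z")
      print(dt.utcoffset())   # -> 1:00:00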

    Read the article

  • How to avoid timestamp issue in a long query?

    - by pingi
    Hi, I have the following two tables:
      items:
        id int primary key
        bla text
      events:
        id_items int
        num int
        when timestamp without time zone
        ble text
        composite primary key: id_items, num
    I want to select, for each item, the most recent event (the newest 'when'). I wrote a request, but I don't know if it could be written more efficiently. Also, on PostgreSQL there is an issue with comparing timestamp values: 2010-05-08T10:00:00.123 == 2010-05-08T10:00:00.321, so I select with MAX(num). Any thoughts on how to make it better? Thanks.
      SELECT i.*, ea.*
      FROM items AS i
      JOIN (
          SELECT t.s AS t_s, t.c AS t_c, max(e.num) AS o
          FROM events AS e
          JOIN (
              SELECT DISTINCT id_item AS s, MAX(when) AS c
              FROM events
              GROUP BY s
              ORDER BY c
          ) AS t ON t.s = e.id_item AND e.when = t.c
          GROUP BY t.s, t.c
      ) AS tt ON tt.t_s = i.id
      JOIN events AS ea ON ea.id_item = tt.t_s AND ea.cas = tt.t_c AND ea.num = tt.o;

    Read the article

  • Any array function or user defined function to pick up only the first occurrence of a value from the array?

    - by OM The Eternity
    If I have an array:
      Array (
        [0] => Array ( [0] => 137 [id] => 137 [1] => 153 [trackid] => 153 [2] => jos_menu [table_name] => jos_menu [3] => UPDATE [operation] => UPDATE [4] => 0 [oldvalue] => 0 [5] => 62 [newvalue] => 62 [6] => checked_out [field] => checked_out [7] => 0 [live] => 0 [8] => 2010-05-11 17:46:28 [changedone] => 2010-05-11 17:46:28 )
        [1] => Array ( [0] => 138 [id] => 138 [1] => 153 [trackid] => 153 [2] => jos_menu [table_name] => jos_menu [3] => UPDATE [operation] => UPDATE [4] => 0000-00-00 00:00:00 [oldvalue] => 0000-00-00 00:00:00 [5] => 2010-05-11 12:16:28 [newvalue] => 2010-05-11 12:16:28 [6] => checked_out_time [field] => checked_out_time [7] => 0 [live] => 0 [8] => 2010-05-11 17:46:28 [changedone] => 2010-05-11 17:46:28 )
        [2] => Array ( [0] => 139 [id] => 139 [1] => 153 [trackid] => 153 [2] => jos_menu [table_name] => jos_menu [3] => UPDATE [operation] => UPDATE [4] => Subhash [oldvalue] => Subhash [5] => Subhashgfhfgh [newvalue] => Subhashgfhfgh [6] => name [field] => name [7] => 0 [live] => 0 [8] => 2010-05-11 17:46:35 [changedone] => 2010-05-11 17:46:35 )
        [3] => Array ( [0] => 140 [id] => 140 [1] => 153 [trackid] => 153 [2] => jos_menu [table_name] => jos_menu [3] => UPDATE [operation] => UPDATE [4] => subhash [oldvalue] => subhash [5] => subhashhfhf [newvalue] => subhashhfhf [6] => alias [field] => alias [7] => 0 [live] => 0 [8] => 2010-05-11 17:46:35 [changedone] => 2010-05-11 17:46:35 )
        [4] => Array ( [0] => 141 [id] => 141 [1] => 153 [trackid] => 153 [2] => jos_menu [table_name] => jos_menu [3] => UPDATE [operation] => UPDATE [4] => 62 [oldvalue] => 62 [5] => 0 [newvalue] => 0 [6] => checked_out [field] => checked_out [7] => 0 [live] => 0 [8] => 2010-05-11 17:46:35 [changedone] => 2010-05-11 17:46:35 )
        [5] => Array ( [0] => 142 [id] => 142 [1] => 153 [trackid] => 153 [2] => jos_menu [table_name] => jos_menu [3] => UPDATE [operation] => UPDATE [4] => 2010-05-11 12:16:28 [oldvalue] => 2010-05-11 12:16:28 [5] => 0000-00-00 00:00:00 [newvalue] => 0000-00-00 00:00:00 [6] => checked_out_time [field] => checked_out_time [7] => 0 [live] => 0 [8] => 2010-05-11 17:46:35 [changedone] => 2010-05-11 17:46:35 )
      )
    Here you can see that the index "field" has repeated values, i.e. "checked_out" and "checked_out_time"; the other indexes have a single occurrence. What should I do to select/grab only the first occurrence of the repeated values?
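    A sketch of the "first occurrence wins" idea in Python (the question is PHP, where the same effect comes from checking isset() on a seen-fields map while looping; the rows below stand in for the question's sub-arrays):

      rows = [
          {"id": 137, "field": "checked_out"},
          {"id": 138, "field": "checked_out_time"},
          {"id": 141, "field": "checked_out"},        # repeat: should be skipped
          {"id": 142, "field": "checked_out_time"},   # repeat: should be skipped
      ]

      first_by_field = {}
      for row in rows:
          first_by_field.setdefault(row["field"], row)   # keeps only the first row per field

      print([r["id"] for r in first_by_field.values()])  # -> [137, 138]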

    Read the article

  • SQL Server Query Question

    - by Lp1
    Running SQL Server 2008, and I am definitely a new SQL user. I have a table that has 4 columns: EmpNum, User, Action, Updatetime. When a user logs into, and out of, a system, it is registered in the database. For example, if User1 logs into the system, then out 5 minutes later, a simple query (select * from update) would look like:
      EmpNum  User   Action  Updatetime
      1       User1  I       2010-01-01 23:00:00:000
      1       User1  O       2010-01-01 23:05:00:000
    I'm trying to query the EmpNum, User, Action, I (in time), O (out time), and the total time.

    Read the article

  • Date conversion in Java

    - by llm
    How can I take a string in a format such as 2008-06-02 00:00:00.0 and convert it to 02-Jun-2008? Can I somehow take the original string, convert it to a Date object, and then use a formatter to get the final output (rather than parsing the string myself)? Thanks!
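    A sketch of the parse-then-format round trip in Python (Java's SimpleDateFormat follows the same two-step shape: one pattern object to parse, another to format):

      from datetime import datetime

      d = datetime.strptime("2008-06-02 00:00:00.0", "%Y-%m-%d %H:%M:%S.%f")
      print(d.strftime("%d-%b-%Y"))   # -> 02-Jun-2008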

    Read the article

  • SQLite REGEXP initializer not working in production on Heroku

    - by morcutt
    I am using this to create a REGEXP in SQLite with Rails, because SQLite does not support REGEXP. When running this app on Heroku, rather than on localhost, it does not work. Is the initializer not being run when the app launches? The log files show:
      2011-03-04T18:35:36-08:00 app[web.1]: ActiveRecord::StatementInvalid (PGError: ERROR: syntax error at or near "REGEXP"
      2011-03-04T18:35:36-08:00 app[web.1]: LINE 1: ... "posts".* FROM "posts" WHERE (message REGEXP '(?...
      2011-03-04T18:35:36-08:00 app[web.1]: ^
      2011-03-04T18:35:36-08:00 app[web.1]: : SELECT "posts".* FROM "posts" WHERE (message REGEXP '(?:^|\s+)/(\w+)' and user_id = 1)):
    This is similar to what the development logs produced when I had deleted the implementation code. It seems as though the REGEXP initializer is not being run at startup. (Note the PGError: the Heroku database is PostgreSQL rather than SQLite, so a SQLite-specific initializer would not apply there.)

    Read the article

  • SQL indexes for "not equal" searches

    - by bortzmeyer
    The SQL index makes it possible to quickly find a string which matches my query. Now, I have to search a big table for the strings which do not match. Of course, the normal index does not help and I have to do a slow sequential scan:
      essais=> \d phone_idx
      Index "public.phone_idx"
       Column | Type
      --------+------
       phone  | text
      btree, for table "public.phonespersons"
      essais=> EXPLAIN SELECT person FROM PhonesPersons WHERE phone = '+33 1234567';
                                        QUERY PLAN
      -------------------------------------------------------------------------------
       Index Scan using phone_idx on phonespersons  (cost=0.00..8.41 rows=1 width=4)
         Index Cond: (phone = '+33 1234567'::text)
      (2 rows)
      essais=> EXPLAIN SELECT person FROM PhonesPersons WHERE phone != '+33 1234567';
                                   QUERY PLAN
      ----------------------------------------------------------------------
       Seq Scan on phonespersons  (cost=0.00..18621.00 rows=999999 width=4)
         Filter: (phone <> '+33 1234567'::text)
      (2 rows)
    I understand (see Mark Byers' very good explanations) that PostgreSQL can decide not to use an index when it sees that a sequential scan would be faster (for instance, if almost all the tuples match). But, here, "not equal" searches are really slower. Is there any way to make these "is not equal to" searches faster? Here is another example, to address Mark Byers' excellent remarks. The index is used for the '=' query (which returns the vast majority of tuples) but not for the '!=' query:
      essais=> EXPLAIN ANALYZE SELECT person FROM EmailsPersons WHERE tld(email) = 'fr';
                                    QUERY PLAN
      ------------------------------------------------------------------------------------------------------------------------------------
       Index Scan using tld_idx on emailspersons  (cost=0.25..4010.79 rows=97033 width=4) (actual time=0.137..261.123 rows=97110 loops=1)
         Index Cond: (tld(email) = 'fr'::text)
       Total runtime: 444.800 ms
      (3 rows)
      essais=> EXPLAIN ANALYZE SELECT person FROM EmailsPersons WHERE tld(email) != 'fr';
                                    QUERY PLAN
      --------------------------------------------------------------------------------------------------------------------
       Seq Scan on emailspersons  (cost=0.00..27129.00 rows=2967 width=4) (actual time=1.004..1031.224 rows=2890 loops=1)
         Filter: (tld(email) <> 'fr'::text)
       Total runtime: 1037.278 ms
      (3 rows)
    The DBMS is PostgreSQL 8.3 (but I can upgrade to 8.4).

    Read the article

  • Calculate actual time from different timezones in php

    - by Thinker
    Hi, I would like to calculate the actual time for different timezones. I have two MySQL tables:
      1. Timezone table: contains two fields (offset value, country name); values are (-12:00, Eniwetok, Kwajalein), (-11:00, Midway Island, Samoa), and so on.
      2. Employee table: contains the employee details with two fields (created_on, timezone); the values are (current timestamp, -11:00), (current timestamp, -12:00), and so on.
    Now, if an employee logs in to my system, I want to display the time according to their timezone value from the tables. Basically, if my current timestamp is 2010-05-10 15:41:39 and the offset value is +5:30, my result should be: 10 May 2010 3:41pm. The same goes for different offset values with the same current timestamp. I believe I explained the issue correctly; if you need any more information, please ask me.
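    A sketch in Python of applying a stored "+05:30"-style offset to a timestamp (the question is PHP, where DateTime arithmetic plays the same role; the function name is made up for illustration):

      from datetime import datetime, timedelta

      def apply_offset(ts, offset):               # offset like "+05:30" or "-11:00"
          sign = -1 if offset.startswith("-") else 1
          hours, minutes = offset.lstrip("+-").split(":")
          return ts + sign * timedelta(hours=int(hours), minutes=int(minutes))

      created_on = datetime(2010, 5, 10, 15, 41, 39)   # timestamp from the question
      print(apply_offset(created_on, "+05:30").strftime("%d %b %Y %I:%M%p"))
      # -> "10 May 2010 09:11PM" after shifting by +05:30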

    Read the article

  • Exporting and reformatting data out of MS Excel

    - by Matt H
    I have a huge Excel spreadsheet containing telephone calling rates to a number of different countries. The format of the columns is: Country, RateLocality, Prefixes, Rate, Wholesale, e.g.:
      Afganistan, Default, 93;930;931;9321;9322;9323;9324;9325;9326;9327;9328;9329;9331;9332;9333;9334;9335;9336;9337;9338;9339;9341;9342;9343;9344;9345;9346;9347;9348;9349;9351;9352;9353;9354;9355;9356;9357;9358;9359;9361;9362;9363;9364;9365;9366;9367;9368;9369;9371;9372;9373;9374;9376;938;939;, $1.023, $0.455
    These rates change every so often, and I need to get them into another system that can import them using CSV. The eventual format is:
      LD PREPEND CODE ie. 00 or 011,CountryCode,Area Code,Comment,Connect Cost,Included Seconds,Per Minute Cost,Pricelist,Increment
    So to convert the above line I'd have:
      00,"Afganistan",93,"Default",1.023,60,1.023,10
      00,"Afganistan",931,"Default",1.023,60,1.023,10
      ...
      00,"Afganistan",939,"Default",1.023,60,1.023,10
    where 00, 60 and 10 are hard-coded and merged with the other data from Excel. How can I export this data into the required format, given that I need to reformat it as it goes? Should I export to XML and use XSLT or some other process to massage the data into CSV? If that is the case, how do I do it simply and quickly?
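    A sketch of the prefix-expansion step with Python's csv module, as one alternative to the XML/XSLT route (it assumes the sheet has first been saved as rates.csv with the five columns above; file names and the hard-coded 00/60/10 values follow the question):

      import csv

      with open("rates.csv", newline="") as src, open("import.csv", "w", newline="") as dst:
          writer = csv.writer(dst)
          for country, locality, prefixes, rate, _wholesale in csv.reader(src):
              cost = rate.replace("$", "").strip()
              for prefix in filter(None, prefixes.split(";")):   # one output row per prefix
                  writer.writerow(["00", country, prefix, locality, cost, 60, cost, 10])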

    Read the article

  • Effective way of String splitting

    - by openidsujoy
    I have a concatenated string like this:
      N:Pay in Cash++RGI:40++R:200++T:Purchase++IP:N++IS:N++PD:PC++UCP:598.80++UPP:0.00++TCP:598.80++TPP:0.00++QE:1++QS:1++CPC:USD++PPC:Points++D:Y++E:Y++IFE:Y++AD:Y++IR:++MV:++CP:~ ~N:ERedemption++RGI:42++R:200++T:Purchase++IP:N++IS:N++PD:PC++UCP:598.80++UPP:0.00++TCP:598.80++TPP:0.00++QE:1++QS:1++CPC:USD++PPC:Points++D:Y++E:Y++IFE:Y++AD:Y++IR:++MV:++CP:
    The string is structured like this: it's a list of POs (Payment Options) separated by ~~; the list may contain one or more POs; each PO contains only key-value pairs (Key:Value), which are separated by ++. I need to extract the values for the keys "RGI" and "N". I can do it via a for loop, but I want a more efficient way to do this. Any help on this?
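    A sketch of the split-and-dictionary approach in Python (the question is C#, where String.Split plus a Dictionary per PO has the same shape; the sample string is abbreviated):

      raw = ("N:Pay in Cash++RGI:40++R:200++T:Purchase~"
             "~N:ERedemption++RGI:42++R:200++T:Purchase")   # abbreviated sample

      pos = []
      for chunk in raw.split("~~"):
          pairs = (p.split(":", 1) for p in chunk.strip("~ ").split("++") if ":" in p)
          pos.append({k: v for k, v in pairs})              # one dict per PO

      print([(po["N"], po["RGI"]) for po in pos])
      # -> [('Pay in Cash', '40'), ('ERedemption', '42')]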

    Read the article

  • Count the number of dates between two dates

    - by Matt Mitchell
    I'm looking to count the dates covered (inclusive) between two DateTimes (not .TotalDays). For example:
      2012-2-1 14:00 to 2012-2-2 23:00 -> 2
      2012-2-1 14:00 to 2012-2-2 10:00 -> 2
      2012-2-1 14:00 to 2012-2-1 15:00 -> 1
      2012-1-1 00:00 to 2012-12-31 23:59 -> 366
    I can get this functionality with the code below:
      DateTime dt1 = new DateTime(2000,1,2,12,00,00);
      DateTime dt2 = new DateTime(2000,1,3,03,00,00);
      int count = 0;
      for (DateTime date = dt1; date.Date <= dt2.Date; date = date.AddDays(1))
          count++;
      return count;
    Is there a better way?
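    For comparison, a constant-time sketch in Python (in C# the equivalent would be subtracting the two .Date values and adding one to the .Days of the resulting TimeSpan, instead of looping):

      from datetime import datetime

      def dates_covered(dt1, dt2):
          return (dt2.date() - dt1.date()).days + 1   # inclusive count, no loop

      print(dates_covered(datetime(2012, 2, 1, 14, 0), datetime(2012, 2, 2, 23, 0)))    # 2
      print(dates_covered(datetime(2012, 2, 1, 14, 0), datetime(2012, 2, 1, 15, 0)))    # 1
      print(dates_covered(datetime(2012, 1, 1, 0, 0), datetime(2012, 12, 31, 23, 59)))  # 366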

    Read the article

  • JSON datetime to SQL Server database via WCF

    - by moikey
    I have noticed a problem over the past couple of days where my dates submitted to a SQL Server database are wrong. I have a webpage where users can book facilities. This webpage takes a name, a date, a start time and an end time (BookingID is required for transactions but generated by the database), which I format as a JSON string as follows:
      {"BookingEnd":"\/Date(2012-26-03 09:00:00.000)\/","BookingID":1,"BookingName":"client test 1","BookingStart":"\/Date(2012-26-03 10:00:00.000)\/","RoomID":4}
    This is then passed to a WCF service, which handles the database insert as follows:
      [WebInvoke(Method = "POST", RequestFormat = WebMessageFormat.Json, UriTemplate = "createbooking")]
      void CreateBooking(Booking booking);

      [DataContract]
      public class Booking
      {
          [DataMember]
          public int BookingID { get; set; }
          [DataMember]
          public string BookingName { get; set; }
          [DataMember]
          public DateTime BookingStart { get; set; }
          [DataMember]
          public DateTime BookingEnd { get; set; }
          [DataMember]
          public int RoomID { get; set; }
      }
    Booking.svc:
      public void CreateBooking(Booking booking)
      {
          BookingEntity bookingEntity = new BookingEntity()
          {
              BookingName = booking.BookingName,
              BookingStart = booking.BookingStart,
              BookingEnd = booking.BookingEnd,
              RoomID = booking.RoomID
          };
          BookingsModel model = new BookingsModel();
          model.CreateBooking(bookingEntity);
      }
    Booking model:
      public void CreateBooking(BookingEntity booking)
      {
          using (var conn = new SqlConnection("Data Source=cpm;Initial Catalog=BookingDB;Integrated Security=True"))
          using (var cmd = conn.CreateCommand())
          {
              conn.Open();
              cmd.CommandText = @"IF NOT EXISTS (
                  SELECT * FROM Bookings
                  WHERE BookingStart = @BookingStart AND BookingEnd = @BookingEnd AND RoomID = @RoomID
              )
              INSERT INTO Bookings (BookingName, BookingStart, BookingEnd, RoomID)
              VALUES (@BookingName, @BookingStart, @BookingEnd, @RoomID)";
              cmd.Parameters.AddWithValue("@BookingName", booking.BookingName);
              cmd.Parameters.AddWithValue("@BookingStart", booking.BookingStart);
              cmd.Parameters.AddWithValue("@BookingEnd", booking.BookingEnd);
              cmd.Parameters.AddWithValue("@RoomID", booking.RoomID);
              cmd.ExecuteNonQuery();
              conn.Close();
          }
      }
    This updates the database, but the time ends up as "1970-01-01 00:00:02.013" each time I submit the date in the above JSON format. However, when I do a query in SQL Server Management Studio with the above date format ("YYYY-MM-DD HH:MM:SS.mmm"), it inserts the correct values. Also, if I submit a millisecond datetime to the WCF service, the correct date is inserted. The problem seems to be with the format I am submitting. I am a little lost with this problem; I don't really see why it is doing this. Any help would be greatly appreciated. Thanks.
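    Since the millisecond form is the one that worked, here is a sketch (in Python, purely for illustration) of producing the \/Date(...)\/ value that WCF's JSON serializer expects: milliseconds since 1970-01-01 UTC, rather than a formatted date string:

      from datetime import datetime, timezone

      dt = datetime(2012, 3, 26, 9, 0, tzinfo=timezone.utc)   # intended BookingEnd
      ms = int(dt.timestamp() * 1000)
      print(f"\\/Date({ms})\\/")   # -> \/Date(1332752400000)\/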

    Read the article

  • Anyone Experiencing Slow Builds With VS2010?

    - by MrKWatkins
    Hi, we've recently upgraded to the final release of VS2010 and are experiencing very slow build times compared to the same code under 2008. I was wondering if anyone else is experiencing the same, so I can work out whether it's just our environment or not. A few details:
      - Using VS2010 Ultimate on Windows 7 with fairly beefy machines, talking to TFS 2010.
      - The solution has been upgraded from VS2008 but still builds against .NET 3.5 and ASP.NET MVC 1.0.
      - It doesn't seem to be the compilation itself taking long but something else in the build process. This is because even projects that are up to date and don't need compiling take a few seconds or so to process.
      - It's not due to a Visual Studio add-in, because a couple of guys in the team haven't installed any.
      - The first build after loading VS2010 is pretty quick; then they seem to slow down over time. For example, one of the projects in my solution just took 00:00:00.08 to process after a restart (the project was up to date and didn't need compiling). I then immediately hit rebuild and it jumps to 00:00:01.33.
      - We're also experiencing the problem with another solution that uses .NET 4.0 and was building perfectly fine under VS2010 RC.
      - There are no build events or anything like that I can blame, just straightforward assembly builds.
      - The IDE is not very responsive during the slow builds.
    Is anyone else having similar problems? Update: it looks like resolving assembly references is taking a long time. Looking at the MSBuild diagnostic output for the example above, the first build has 30ms for ResolveAssemblyReferences; the second build has 800ms. Subsequent builds also seem to take longer copying stuff around, e.g. CopyFilesToOutputDirectory jumps from 1ms to 27ms.

    Read the article

  • Convert date in text format to datetime format in T-SQL

    - by Rob
    I have a client-supplied file that is loaded into our SQL Server database. This file contains text-based date values, i.e. 05102010, and I need to read them from a db column and convert them to a normal datetime value, e.g. 2010-05-10 00:00:00.000, as part of a clean-up process. Any guidance would be greatly appreciated.
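    A sketch of the conversion in Python for illustration, assuming the digits are in MMDDYYYY order as the example suggests (in T-SQL the equivalent would rearrange the string with SUBSTRING into 'yyyy-mm-dd' form and then CAST/CONVERT it to datetime):

      from datetime import datetime

      raw = "05102010"                         # MMDDYYYY, per the question's example
      print(datetime.strptime(raw, "%m%d%Y"))  # -> 2010-05-10 00:00:00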

    Read the article

  • SQL Server query

    - by carrot_programmer_3
    Hi, I have a SQL Server DB containing a registrations table that I need to plot on a graph over time. The issue is that I need to break this down by where the user registered from (e.g. website, WAP site, or a mobile application). The resulting output data should look like this:
      [date]      [num_reg_website]  [num_reg_wap_site]  [num_reg_mobileapp]
      1 FEB 2010, 24, 35, 64
      2 FEB 2010, 23, 85, 48
      3 FEB 2010, 29, 37, 79
      etc...
    The source table is as follows:
      UUID (int), signupdate (datetime), requestsource (varchar(50))
    Some sample data in this table looks like this:
      1001, 2010-02-2:00:12:12, 'website'
      1002, 2010-02-2:00:10:17, 'app'
      1003, 2010-02-3:00:14:19, 'website'
      1004, 2010-02-4:00:16:18, 'wap'
      1005, 2010-02-4:00:18:16, 'website'
    Running the following query returns one data column, 'total registrations', for the website registrations, but I'm not sure how to do this for multiple columns, unfortunately:
      select CAST(FLOOR(CAST([signupdate] AS FLOAT)) AS DATETIME) as [signupdate],
             count(UUID) as 'total registrations'
      FROM [UserRegistrationRequests]
      WHERE requestsource = 'website'
      group by CAST(FLOOR(CAST([signupdate] AS FLOAT)) AS DATETIME)

    Read the article

  • Rails rabl json format

    - by brabertaser1992
    I'm displaying data from my db in JSON format, using rabl to prepare it for parsing on Android... My JSON looks like so:
      [
        {
          "bank": {
            "central_office_address": "ololo",
            "license": "12312312",
            "location_id": 3,
            "name": "Pbank",
            "tax_number": "12312312",
            "year_of_foundation": 1987
          }
        },
        {
          "bank": {
            "central_office_address": "sdfsdf sdf",
            "license": "321312",
            "location_id": 3,
            "name": "Bbank",
            "tax_number": "321321",
            "year_of_foundation": 1999
          }
        }
      ]
    I need my JSON in a format like:
      {
        "contacts": [
          {
            "id": "c200",
            "name": "Ravi Tamada",
            "email": "[email protected]",
            "address": "xx-xx-xxxx,x - street, x - country",
            "gender": "male",
            "phone": {
              "mobile": "+91 0000000000",
              "home": "00 000000",
              "office": "00 000000"
            }
          },
          {
            "id": "c201",
            "name": "Johnny Depp",
            "email": "[email protected]",
            "address": "xx-xx-xxxx,x - street, x - country",
            "gender": "male",
            "phone": {
              "mobile": "+91 0000000000",
              "home": "00 000000",
              "office": "00 000000"
            }
          }
        ]
      }
    Such data is normally parsed in Java... My rabl view:
      object @banks
      attributes :central_office_address, :license, :location_id, :name, :tax_number, :year_of_foundation
    How do I change its output to match the second example?

    Read the article
