Search Results

Search found 2739 results on 110 pages for 'reader'.

Page 23/110 | < Previous Page | 19 20 21 22 23 24 25 26 27 28 29 30  | Next Page >

  • What You Said: How Do You Set Reminders?

    - by Jason Fitzpatrick
    Earlier this week we asked you to share your favorite tricks for staying on top of your tasks with timely reminders. Now we’re back to highlight some great reader tips (including a bit of software older than some of our readers). Most of us have to-do lists longer than we can do in a given day (or week!) and a constantly changing set of demands and next-actions. Having a timely and effective reminder system is the difference between dropping the ball and getting things done; how exactly that reminder system plays out, however, varies greatly from reader to reader. OJMDC sticks with analog reminders: Sticky notes in the middle of my monitor and in my wallet. I’ve tried my phone apps but I typically disregard them.

    Read the article

  • How to simulate inner join on very large files in java (without running out of memory)

    - by Constantin
    I am trying to simulate SQL joins using java and very large text files (INNER, RIGHT OUTER and LEFT OUTER). The files have already been sorted using an external sort routine. The issue I have is I am trying to find the most efficient way to deal with the INNER join part of the algorithm. Right now I am using two Lists to store the lines that have the same key and iterate through the set of lines in the right file once for every line in the left file (provided the keys still match). In other words, the join key is not unique in each file so would need to account for the Cartesian product situations ... left_01, 1 left_02, 1 right_01, 1 right_02, 1 right_03, 1 left_01 joins to right_01 using key 1 left_01 joins to right_02 using key 1 left_01 joins to right_03 using key 1 left_02 joins to right_01 using key 1 left_02 joins to right_02 using key 1 left_02 joins to right_03 using key 1 My concern is one of memory. I will run out of memory if i use the approach below but still want the inner join part to work fairly quickly. What is the best approach to deal with the INNER join part keeping in mind that these files may potentially be huge public class Joiner { private void join(BufferedReader left, BufferedReader right, BufferedWriter output) throws Throwable { BufferedReader _left = left; BufferedReader _right = right; BufferedWriter _output = output; Record _leftRecord; Record _rightRecord; _leftRecord = read(_left); _rightRecord = read(_right); while( _leftRecord != null && _rightRecord != null ) { if( _leftRecord.getKey() < _rightRecord.getKey() ) { write(_output, _leftRecord, null); _leftRecord = read(_left); } else if( _leftRecord.getKey() > _rightRecord.getKey() ) { write(_output, null, _rightRecord); _rightRecord = read(_right); } else { List<Record> leftList = new ArrayList<Record>(); List<Record> rightList = new ArrayList<Record>(); _leftRecord = readRecords(leftList, _leftRecord, _left); _rightRecord = readRecords(rightList, _rightRecord, _right); for( Record equalKeyLeftRecord : leftList ){ for( Record equalKeyRightRecord : rightList ){ write(_output, equalKeyLeftRecord, equalKeyRightRecord); } } } } if( _leftRecord != null ) { write(_output, _leftRecord, null); _leftRecord = read(_left); while(_leftRecord != null) { write(_output, _leftRecord, null); _leftRecord = read(_left); } } else { if( _rightRecord != null ) { write(_output, null, _rightRecord); _rightRecord = read(_right); while(_rightRecord != null) { write(_output, null, _rightRecord); _rightRecord = read(_right); } } } _left.close(); _right.close(); _output.flush(); _output.close(); } private Record read(BufferedReader reader) throws Throwable { Record record = null; String data = reader.readLine(); if( data != null ) { record = new Record(data.split("\t")); } return record; } private Record readRecords(List<Record> list, Record record, BufferedReader reader) throws Throwable { int key = record.getKey(); list.add(record); record = read(reader); while( record != null && record.getKey() == key) { list.add(record); record = read(reader); } return record; } private void write(BufferedWriter writer, Record left, Record right) throws Throwable { String leftKey = (left == null ? "null" : Integer.toString(left.getKey())); String leftData = (left == null ? "null" : left.getData()); String rightKey = (right == null ? "null" : Integer.toString(right.getKey())); String rightData = (right == null ? 
"null" : right.getData()); writer.write("[" + leftKey + "][" + leftData + "][" + rightKey + "][" + rightData + "]\n"); } public static void main(String[] args) { try { BufferedReader leftReader = new BufferedReader(new FileReader("LEFT.DAT")); BufferedReader rightReader = new BufferedReader(new FileReader("RIGHT.DAT")); BufferedWriter output = new BufferedWriter(new FileWriter("OUTPUT.DAT")); Joiner joiner = new Joiner(); joiner.join(leftReader, rightReader, output); } catch (Throwable e) { e.printStackTrace(); } } } After applying the ideas from the proposed answer, I changed the loop to this private void join(RandomAccessFile left, RandomAccessFile right, BufferedWriter output) throws Throwable { long _pointer = 0; RandomAccessFile _left = left; RandomAccessFile _right = right; BufferedWriter _output = output; Record _leftRecord; Record _rightRecord; _leftRecord = read(_left); _rightRecord = read(_right); while( _leftRecord != null && _rightRecord != null ) { if( _leftRecord.getKey() < _rightRecord.getKey() ) { write(_output, _leftRecord, null); _leftRecord = read(_left); } else if( _leftRecord.getKey() > _rightRecord.getKey() ) { write(_output, null, _rightRecord); _pointer = _right.getFilePointer(); _rightRecord = read(_right); } else { long _tempPointer = 0; int key = _leftRecord.getKey(); while( _leftRecord != null && _leftRecord.getKey() == key ) { _right.seek(_pointer); _rightRecord = read(_right); while( _rightRecord != null && _rightRecord.getKey() == key ) { write(_output, _leftRecord, _rightRecord ); _tempPointer = _right.getFilePointer(); _rightRecord = read(_right); } _leftRecord = read(_left); } _pointer = _tempPointer; } } if( _leftRecord != null ) { write(_output, _leftRecord, null); _leftRecord = read(_left); while(_leftRecord != null) { write(_output, _leftRecord, null); _leftRecord = read(_left); } } else { if( _rightRecord != null ) { write(_output, null, _rightRecord); _rightRecord = read(_right); while(_rightRecord != null) { write(_output, null, _rightRecord); _rightRecord = read(_right); } } } _left.close(); _right.close(); _output.flush(); _output.close(); } UPDATE While this approach worked, it was terribly slow and so I have modified this to create files as buffers and this works very well. Here is the update ... 
private long getMaxBufferedLines(File file) throws Throwable { long freeBytes = Runtime.getRuntime().freeMemory() / 2; return (freeBytes / (file.length() / getLineCount(file))); } private void join(File left, File right, File output, JoinType joinType) throws Throwable { BufferedReader leftFile = new BufferedReader(new FileReader(left)); BufferedReader rightFile = new BufferedReader(new FileReader(right)); BufferedWriter outputFile = new BufferedWriter(new FileWriter(output)); long maxBufferedLines = getMaxBufferedLines(right); Record leftRecord; Record rightRecord; leftRecord = read(leftFile); rightRecord = read(rightFile); while( leftRecord != null && rightRecord != null ) { if( leftRecord.getKey().compareTo(rightRecord.getKey()) < 0) { if( joinType == JoinType.LeftOuterJoin || joinType == JoinType.LeftExclusiveJoin || joinType == JoinType.FullExclusiveJoin || joinType == JoinType.FullOuterJoin ) { write(outputFile, leftRecord, null); } leftRecord = read(leftFile); } else if( leftRecord.getKey().compareTo(rightRecord.getKey()) > 0 ) { if( joinType == JoinType.RightOuterJoin || joinType == JoinType.RightExclusiveJoin || joinType == JoinType.FullExclusiveJoin || joinType == JoinType.FullOuterJoin ) { write(outputFile, null, rightRecord); } rightRecord = read(rightFile); } else if( leftRecord.getKey().compareTo(rightRecord.getKey()) == 0 ) { String key = leftRecord.getKey(); List<File> rightRecordFileList = new ArrayList<File>(); List<Record> rightRecordList = new ArrayList<Record>(); rightRecordList.add(rightRecord); rightRecord = consume(key, rightFile, rightRecordList, rightRecordFileList, maxBufferedLines); while( leftRecord != null && leftRecord.getKey().compareTo(key) == 0 ) { processRightRecords(outputFile, leftRecord, rightRecordFileList, rightRecordList, joinType); leftRecord = read(leftFile); } // need a dispose for deleting files in list } else { throw new Exception("DATA IS NOT SORTED"); } } if( leftRecord != null ) { if( joinType == JoinType.LeftOuterJoin || joinType == JoinType.LeftExclusiveJoin || joinType == JoinType.FullExclusiveJoin || joinType == JoinType.FullOuterJoin ) { write(outputFile, leftRecord, null); } leftRecord = read(leftFile); while(leftRecord != null) { if( joinType == JoinType.LeftOuterJoin || joinType == JoinType.LeftExclusiveJoin || joinType == JoinType.FullExclusiveJoin || joinType == JoinType.FullOuterJoin ) { write(outputFile, leftRecord, null); } leftRecord = read(leftFile); } } else { if( rightRecord != null ) { if( joinType == JoinType.RightOuterJoin || joinType == JoinType.RightExclusiveJoin || joinType == JoinType.FullExclusiveJoin || joinType == JoinType.FullOuterJoin ) { write(outputFile, null, rightRecord); } rightRecord = read(rightFile); while(rightRecord != null) { if( joinType == JoinType.RightOuterJoin || joinType == JoinType.RightExclusiveJoin || joinType == JoinType.FullExclusiveJoin || joinType == JoinType.FullOuterJoin ) { write(outputFile, null, rightRecord); } rightRecord = read(rightFile); } } } leftFile.close(); rightFile.close(); outputFile.flush(); outputFile.close(); } public void processRightRecords(BufferedWriter outputFile, Record leftRecord, List<File> rightFiles, List<Record> rightRecords, JoinType joinType) throws Throwable { for(File rightFile : rightFiles) { BufferedReader rightReader = new BufferedReader(new FileReader(rightFile)); Record rightRecord = read(rightReader); while(rightRecord != null){ if( joinType == JoinType.LeftOuterJoin || joinType == JoinType.RightOuterJoin || joinType == JoinType.FullOuterJoin || 
joinType == JoinType.InnerJoin ) { write(outputFile, leftRecord, rightRecord); } rightRecord = read(rightReader); } rightReader.close(); } for(Record rightRecord : rightRecords) { if( joinType == JoinType.LeftOuterJoin || joinType == JoinType.RightOuterJoin || joinType == JoinType.FullOuterJoin || joinType == JoinType.InnerJoin ) { write(outputFile, leftRecord, rightRecord); } } } /** * consume all records having key (either to a single list or multiple files) each file will * store a buffer full of data. The right record returned represents the outside flow (key is * already positioned to next one or null) so we can't use this record in below while loop or * within this block in general when comparing current key. The trick is to keep consuming * from a List. When it becomes empty, re-fill it from the next file until all files have * been consumed (and the last node in the list is read). The next outside iteration will be * ready to be processed (either it will be null or it points to the next biggest key * @throws Throwable * */ private Record consume(String key, BufferedReader reader, List<Record> records, List<File> files, long bufferMaxRecordLines ) throws Throwable { boolean processComplete = false; Record record = records.get(records.size() - 1); while(!processComplete){ long recordCount = records.size(); if( record.getKey().compareTo(key) == 0 ){ record = read(reader); while( record != null && record.getKey().compareTo(key) == 0 && recordCount < bufferMaxRecordLines ) { records.add(record); recordCount++; record = read(reader); } } processComplete = true; // if record is null, we are done if( record != null ) { // if the key has changed, we are done if( record.getKey().compareTo(key) == 0 ) { // Same key means we have exhausted the buffer. // Dump entire buffer into a file. The list of file // pointers will keep track of the files ... processComplete = false; dumpBufferToFile(records, files); records.clear(); records.add(record); } } } return record; } /** * Dump all records in List of Record objects to a file. Then, add that * file to List of File objects * * NEED TO PLACE A LIMIT ON NUMBER OF FILE POINTERS (check size of file list) * * @param records * @param files * @throws Throwable */ private void dumpBufferToFile(List<Record> records, List<File> files) throws Throwable { String prefix = "joiner_" + files.size() + 1; String suffix = ".dat"; File file = File.createTempFile(prefix, suffix, new File("cache")); BufferedWriter writer = new BufferedWriter(new FileWriter(file)); for( Record record : records ) { writer.write( record.dump() ); } files.add(file); writer.flush(); writer.close(); }

    Read the article

  • How to mock an SqlDataReader using Moq - Update

    - by Simon G
    Hi, I'm new to Moq and setting up mocks, so I could do with a little help. The title says it all really - how do I mock up an SqlDataReader using Moq? Thanks Update After further testing this is what I have so far: private IDataReader MockIDataReader() { var moq = new Mock<IDataReader>(); moq.Setup( x => x.Read() ).Returns( true ); moq.Setup( x => x.Read() ).Returns( false ); moq.SetupGet<object>( x => x["Char"] ).Returns( 'C' ); return moq.Object; } private class TestData { public char ValidChar { get; set; } } private TestData GetTestData() { var testData = new TestData(); using ( var reader = MockIDataReader() ) { while ( reader.Read() ) { testData = new TestData { ValidChar = reader.GetChar( "Char" ).Value }; } } return testData; } The issue is that when I do reader.Read in my GetTestData() method it's always empty. I need to know how to do something like reader.Stub( x => x.Read() ).Repeat.Once().Return( true ) as per the Rhino Mocks example: http://stackoverflow.com/questions/1792984/mocking-a-datareader-and-getting-a-rhino-mocks-exceptions-expectationviolationexc
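
    One way to make Read() behave sequentially in Moq is sketched below; it stubs the indexer rather than GetChar (which takes an ordinal, not a name), and the counter-based Returns works even on older Moq versions. This is a sketch, not a drop-in replacement for your test:

        // Sketch: Read() returns true once, then false; the indexer supplies the value.
        // Requires: using Moq; using System.Data;
        private static IDataReader MockIDataReader()
        {
            var moq = new Mock<IDataReader>();

            // On newer Moq versions: moq.SetupSequence(x => x.Read()).Returns(true).Returns(false);
            // Version-agnostic alternative: compute the result per call.
            int readCalls = 0;
            moq.Setup(x => x.Read()).Returns(() => readCalls++ < 1);

            moq.Setup(x => x["Char"]).Returns('C');
            return moq.Object;
        }

        // The consuming code would then read the value through the indexer:
        //     ValidChar = (char)reader["Char"]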

    Read the article

  • Class Mapping Error: 'T' must be a non-abstract type with a public parameterless constructor

    - by Amit Ranjan
    Hi, While mapping class i am getting error 'T' must be a non-abstract type with a public parameterless constructor in order to use it as parameter 'T' in the generic type or method. Below is my SqlReaderBase Class public abstract class SqlReaderBase<T> : ConnectionProvider { #region Abstract Methods protected abstract string commandText { get; } protected abstract CommandType commandType { get; } protected abstract Collection<IDataParameter> GetParameters(IDbCommand command); **protected abstract MapperBase<T> GetMapper();** #endregion #region Non Abstract Methods /// <summary> /// Method to Execute Select Queries for Retrieveing List of Result /// </summary> /// <returns></returns> public Collection<T> ExecuteReader() { //Collection of Type on which Template is applied Collection<T> collection = new Collection<T>(); // initializing connection using (IDbConnection connection = GetConnection()) { try { // creates command for sql operations IDbCommand command = connection.CreateCommand(); // assign connection to command command.Connection = connection; // assign query command.CommandText = commandText; //state what type of query is used, text, table or Sp command.CommandType = commandType; // retrieves parameter from IDataParameter Collection and assigns it to command object foreach (IDataParameter param in GetParameters(command)) command.Parameters.Add(param); // Establishes connection with database server connection.Open(); // Since it is designed for executing Select statements that will return a list of results // so we will call command's execute reader method that return a Forward Only reader with // list of results inside. using (IDataReader reader = command.ExecuteReader()) { try { // Call to Mapper Class of the template to map the data to its // respective fields MapperBase<T> mapper = GetMapper(); collection = mapper.MapAll(reader); } catch (Exception ex) // catch exception { throw ex; // log errr } finally { reader.Close(); reader.Dispose(); } } } catch (Exception ex) { throw ex; } finally { connection.Close(); connection.Dispose(); } } return collection; } #endregion } What I am trying to do is , I am executine some command and filling my class dynamically. 
The class is given below: namespace FooZo.Core { public class Restaurant { #region Private Member Variables private int _restaurantId = 0; private string _email = string.Empty; private string _website = string.Empty; private string _name = string.Empty; private string _address = string.Empty; private string _phone = string.Empty; private bool _hasMenu = false; private string _menuImagePath = string.Empty; private int _cuisine = 0; private bool _hasBar = false; private bool _hasHomeDelivery = false; private bool _hasDineIn = false; private int _type = 0; private string _restaurantImagePath = string.Empty; private string _serviceAvailableTill = string.Empty; private string _serviceAvailableFrom = string.Empty; public string Name { get { return _name; } set { _name = value; } } public string Address { get { return _address; } set { _address = value; } } public int RestaurantId { get { return _restaurantId; } set { _restaurantId = value; } } public string Website { get { return _website; } set { _website = value; } } public string Email { get { return _email; } set { _email = value; } } public string Phone { get { return _phone; } set { _phone = value; } } public bool HasMenu { get { return _hasMenu; } set { _hasMenu = value; } } public string MenuImagePath { get { return _menuImagePath; } set { _menuImagePath = value; } } public string RestaurantImagePath { get { return _restaurantImagePath; } set { _restaurantImagePath = value; } } public int Type { get { return _type; } set { _type = value; } } public int Cuisine { get { return _cuisine; } set { _cuisine = value; } } public bool HasBar { get { return _hasBar; } set { _hasBar = value; } } public bool HasHomeDelivery { get { return _hasHomeDelivery; } set { _hasHomeDelivery = value; } } public bool HasDineIn { get { return _hasDineIn; } set { _hasDineIn = value; } } public string ServiceAvailableFrom { get { return _serviceAvailableFrom; } set { _serviceAvailableFrom = value; } } public string ServiceAvailableTill { get { return _serviceAvailableTill; } set { _serviceAvailableTill = value; } } #endregion public Restaurant() { } } } For filling my class properties dynamically i have another class called MapperBase Class with following methods: public abstract class MapperBase<T> where T : new() { protected T Map(IDataRecord record) { T instance = new T(); string fieldName; PropertyInfo[] properties = typeof(T).GetProperties(); for (int i = 0; i < record.FieldCount; i++) { fieldName = record.GetName(i); foreach (PropertyInfo property in properties) { if (property.Name == fieldName) { property.SetValue(instance, record[i], null); } } } return instance; } public Collection<T> MapAll(IDataReader reader) { Collection<T> collection = new Collection<T>(); while (reader.Read()) { collection.Add(Map(reader)); } return collection; } } There is another class which inherits the SqlreaderBaseClass called DefaultSearch. 
    The code is below: public class DefaultSearch: SqlReaderBase<Restaurant> { protected override string commandText { get { return "Select Name from vw_Restaurants"; } } protected override CommandType commandType { get { return CommandType.Text; } } protected override Collection<IDataParameter> GetParameters(IDbCommand command) { Collection<IDataParameter> parameters = new Collection<IDataParameter>(); parameters.Clear(); return parameters; } protected override MapperBase<Restaurant> GetMapper() { MapperBase<Restaurant> mapper = new RMapper(); return mapper; } } But whenever I try to build, I get the error 'T' must be a non-abstract type with a public parameterless constructor in order to use it as parameter 'T' in the generic type or method. Even though T here is Restaurant, which has a public parameterless constructor.
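
    The compiler error usually means the new() constraint required by MapperBase<T> is not repeated on the class that forwards T. A minimal, self-contained illustration of the fix (Reader and Mapper below are hypothetical stand-ins for SqlReaderBase and MapperBase):

        // Minimal illustration of propagating the constraint (hypothetical stand-in types).
        public abstract class Mapper<T> where T : new() { }

        // Without "where T : new()" here, the compiler raises the same error,
        // because T is forwarded to Mapper<T>, which requires it.
        public abstract class Reader<T> where T : new()
        {
            protected abstract Mapper<T> GetMapper();
        }

    Applied here, that means declaring SqlReaderBase<T> with where T : new(); Restaurant already satisfies the constraint.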

    Read the article

  • Exceptions with DateTime parsing in RSS feed in C#

    - by hIpPy
    I'm trying to parse Rss2, Atom feeds using SyndicationFeedFormatter and SyndicationFeed objects. But I'm getting XmlExceptions while parsing DateTime fields like pubDate and/or lastBuildDate. Wed, 24 Feb 2010 18:56:04 GMT+00:00 does not work; Wed, 24 Feb 2010 18:56:04 GMT works. So, it's throwing due to the timezone field. As a workaround, for familiar feeds I would manually fix those DateTime nodes - by catching the XmlException, loading the Rss into an XmlDocument, fixing those nodes' value, creating a new XmlReader and then returning the formatter from this new XmlReader object (code not shown). But for this approach to work, I need to know beforehand which nodes cause the exception. SyndicationFeedFormatter syndicationFeedFormatter = null; XmlReaderSettings settings = new XmlReaderSettings(); using (XmlReader reader = XmlReader.Create(url, settings)) { try { syndicationFeedFormatter = SyndicationFormatterFactory.CreateFeedFormatter(reader); syndicationFeedFormatter.ReadFrom(reader); } catch (XmlException xexp) { // fix those datetime nodes with exceptions and read again. } return syndicationFeedFormatter; } rss feed: http://news.google.com/news?pz=1&cf=all&ned=us&hl=en&q=test&cf=all&output=rss exception details: XmlException Error in line 1 position 376. An error was encountered when parsing a DateTime value in the XML. at System.ServiceModel.Syndication.Rss20FeedFormatter.DateFromString(String dateTimeString, XmlReader reader) at System.ServiceModel.Syndication.Rss20FeedFormatter.ReadXml(XmlReader reader, SyndicationFeed result) at System.ServiceModel.Syndication.Rss20FeedFormatter.ReadFrom(XmlReader reader) at ... cs:line 171 <rss version="2.0"> <channel> ... <pubDate>Wed, 24 Feb 2010 18:56:04 GMT+00:00</pubDate> <lastBuildDate>Wed, 24 Feb 2010 18:56:04 GMT+00:00</lastBuildDate> <-----exception ... <item> ... <pubDate>Wed, 24 Feb 2010 16:17:50 GMT+00:00</pubDate> <lastBuildDate>Wed, 24 Feb 2010 18:56:04 GMT+00:00</lastBuildDate> </item> ... </channel> </rss> Is there a better way to achieve this? Please help. Thanks.
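
    One possible workaround (a sketch, assuming it is acceptable to normalize every date field rather than only specific nodes) is to pre-fetch the feed, fix the offending offset with a regex, and load the corrected XML from memory:

        // Hedged sketch. Assumed: WebClient is acceptable for fetching, and replacing
        // "GMT+HH:MM" with "GMT" everywhere is safe for these feeds.
        // Requires: System.IO, System.Net, System.ServiceModel.Syndication,
        //           System.Text.RegularExpressions, System.Xml
        public static class FeedLoader
        {
            public static SyndicationFeed Load(string url)
            {
                string raw;
                using (var client = new WebClient())
                {
                    raw = client.DownloadString(url);
                }

                // Turn "GMT+00:00" (and similar offsets) into plain "GMT" so the RFC 822 parser accepts it.
                string fixedXml = Regex.Replace(raw, @"GMT[+-]\d{2}:\d{2}", "GMT");

                using (var reader = XmlReader.Create(new StringReader(fixedXml)))
                {
                    return SyndicationFeed.Load(reader);
                }
            }
        }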

    Read the article

  • StackOverflow in VB.NET SQLite query

    - by Majgel
    I have a StackOverflowException in one of my DB functions that I don't know how to deal with. I have a SQLite database with one table "tblEmployees" that holds a record for each employee (several thousand rows) and usually this function runs without any problem. But sometimes, after the function is called a thousand times, it breaks with a StackOverflowException at the line "ReturnTable.Load(reader)" with the message: An unhandled exception of type 'System.StackOverflowException' occurred in System.Data.SQLite.dll If I restart the application it has no problem continuing with the exact same record it last crashed on. I can also make the exact same DB call from SQLite Admin at the time of the crash without any problems. Here is the code: Public Function GetNextEmployeeInQueue() As String Dim NextEmployeeInQueue As String = Nothing Dim query As [String] = "SELECT FirstName FROM tblEmployees WHERE Checked=0 LIMIT 1;" Try Dim ReturnTable As New DataTable() Dim mycommand As New SQLiteCommand(cnn) mycommand.CommandText = query Dim reader As SQLiteDataReader = mycommand.ExecuteReader() ReturnTable.Load(reader) reader.Close() If ReturnTable.Rows.Count > 0 Then NextEmployeeInQueue = ReturnTable.Rows(0)("FirstName").ToString() Else MsgBox("No more employees found in queue") End If Catch fail As Exception MessageBox.Show("Error: " & fail.Message.ToString()) End Try If NextEmployeeInQueue IsNot Nothing Then Return NextEmployeeInQueue Else Return "No more records in queue" End If End Function When it crashes, the reader has "Property evaluation failed." in all values. I assume there is some problem with allocated memory that isn't released correctly, but I can't figure out which object it's about. The DB connection opens when the DB class object is created and closes on the main form's Dispose. Should I maybe dispose the mycommand object every time in a Finally block? But wouldn't that result in a closed DB connection?
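
    Since the function only needs one value, a per-call disposal pattern with ExecuteScalar avoids keeping commands, readers and DataTables around between calls while leaving the long-lived connection open. A hedged C# sketch (the VB translation is mechanical; "connection" stands in for the cnn field from the question):

        // Hedged C# sketch; requires System.Data.SQLite.
        private static string GetNextEmployeeInQueue(SQLiteConnection connection)
        {
            const string query = "SELECT FirstName FROM tblEmployees WHERE Checked = 0 LIMIT 1;";
            using (var command = new SQLiteCommand(query, connection))
            {
                // ExecuteScalar returns the first column of the first row, or null when there
                // are no rows, so no DataTable or reader is left behind between calls.
                object result = command.ExecuteScalar();
                return result != null ? result.ToString() : "No more records in queue";
            }
        }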

    Read the article

  • Using returned XML from an authorised HTTP request in vb.NET

    - by Nathan
    Hi, How can I use the returned XML from the reader in an XmlTextReader? ' Create the web request request = DirectCast(WebRequest.Create("https://mobilevikings.com/api/1.0/rest/mobilevikings/sim_balance.xml"), HttpWebRequest) ' Add authentication to request request.Credentials = New NetworkCredential("username", "password") ' Get response response = DirectCast(request.GetResponse(), HttpWebResponse) ' Get the response stream into a reader reader = New StreamReader(response.GetResponseStream()) Thanks in advance, Nathan.
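
    A C# sketch of one way to do it (straightforward to translate to VB.NET): skip the StreamReader and point an XmlReader straight at the response stream.

        // C# sketch; requires System.Net and System.Xml.
        var request = (HttpWebRequest)WebRequest.Create(
            "https://mobilevikings.com/api/1.0/rest/mobilevikings/sim_balance.xml");
        request.Credentials = new NetworkCredential("username", "password");

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var xmlReader = XmlReader.Create(response.GetResponseStream()))
        {
            while (xmlReader.Read())
            {
                // inspect xmlReader.NodeType / xmlReader.Name / xmlReader.Value here
            }
        }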

    Read the article

  • How to Send user to two different web pages when login

    - by Pradeep
    protected static Boolean Authentication(string username, string password) { string sqlstring; sqlstring = "Select Username, Password, UserType from Userprofile WHERE Username='" + username + "' and Password ='" + password + "'"; // create a connection with the sql database System.Data.SqlClient.SqlConnection con = new System.Data.SqlClient.SqlConnection( "Data Source=PRADEEP-LAPTOP\\SQLEXPRESS;Initial Catalog=BookStore;Integrated Security=True"); // create a sql command which will use the connection string and your select statement System.Data.SqlClient.SqlCommand comm = new System.Data.SqlClient.SqlCommand(sqlstring, con); // create a data reader which will execute the above command to get the values from the database System.Data.SqlClient.SqlDataReader reader; // open a connection with the database con.Open(); // execute the sql command and store the returned values in reader reader = comm.ExecuteReader(); // if the reader has any value then return true, otherwise return false if (reader.Read()) return true; else return false; } Boolean blnresult; blnresult = Authentication(Login2.UserName, Login2.Password); if (blnresult == true) { Session["User_ID"] = getIDFromName(Login2.UserName); Session["Check"] = true; Session["Username"] = Login2.UserName; Response.Redirect("Index.aspx"); } So a user like Staff or even Administrator logs in to the same Index.aspx. I want to send each user type to a different web page. How do I change the destination page for each user type? I have separate user types, and I already select UserType in the Authentication function.
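
    A hedged C# sketch of one way to branch by user type: have the authentication query return UserType instead of a Boolean, then redirect based on it. The target page names are placeholders, not from the question:

        // Hedged sketch; "AdminIndex.aspx" and "StaffIndex.aspx" are assumed page names.
        protected static string AuthenticateAndGetUserType(string username, string password)
        {
            const string sql =
                "SELECT UserType FROM Userprofile WHERE Username = @user AND Password = @pass";
            using (var con = new System.Data.SqlClient.SqlConnection(
                "Data Source=PRADEEP-LAPTOP\\SQLEXPRESS;Initial Catalog=BookStore;Integrated Security=True"))
            using (var comm = new System.Data.SqlClient.SqlCommand(sql, con))
            {
                comm.Parameters.AddWithValue("@user", username);   // parameters also avoid SQL injection
                comm.Parameters.AddWithValue("@pass", password);
                con.Open();
                return comm.ExecuteScalar() as string;             // null means the login failed
            }
        }

        // In the login handler:
        string userType = AuthenticateAndGetUserType(Login2.UserName, Login2.Password);
        if (userType == "Administrator")
            Response.Redirect("AdminIndex.aspx");
        else if (userType == "Staff")
            Response.Redirect("StaffIndex.aspx");
        else if (userType != null)
            Response.Redirect("Index.aspx");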

    Read the article

  • Parsing XML with the PHP XMLReader

    - by coffeecoder
    Hi Guys, I am writing a program that reads in some XML from the $_POST variable and then parses it using the PHP XMLReader; the extracted data is then inserted into a database. I am using the XMLReader as the XML supplied will more than likely be too big to place into memory. However I am having some issues; my XML and basic code are as follows: '<?xml version="1.0"?> <data_root> <data> <info>value</info> </data> <action>value</action> </data_root>' $request = $_REQUEST['xml']; $reader = new XMLReader(); $reader->XML($request); while($reader->read()){ //processing code } $reader->close(); My problem is that the code will work perfectly if the XML being passed does not have the <?xml version="1.0"?> line, but if I include it, and it will be included when the application goes into a live production environment, the $reader->read() code for the while loop does not work and the XML is not parsed within the while loop. Has anyone seen similar behaviour before or knows why this could be happening? Thanks in advance.

    Read the article

  • Reading a WAV file into VST.Net to process with a plugin

    - by Paul
    Hello, I'm trying to use the VST.Net and NAudio frameworks to build an application that processes audio using a VST plugin. Ideally, the application should load a wav or mp3 file, process it with the VST, and then write a new file. I have done some poking around with the VST.Net library and was able to compile and run the samples (specifically the VST Host one). What I have not figured out is how to load an audio file into the program and have it write a new file back out. I'd like to be able to configure the properties for the VST plugin via C#, and be able to process the audio with 2 or more consecutive VSTs. Using NAudio, I was able to create this simple script to copy an audio file. Now I just need to get the output from the WaveFileReader into the VST.Net framework somehow. private void processAudio() { reader = new WaveFileReader("c:/bass.wav"); writer = new WaveFileWriter("c:/bass-copy.wav", reader.WaveFormat); int read; while ((read = reader.Read(buffer, 0, buffer.Length)) > 0) { writer.WriteData(buffer, 0, read); } textBox1.Text = "done"; reader.Close(); reader.Dispose(); writer.Close(); writer.Dispose(); } Please help!! Thanks References: http://vstnet.codeplex.com (VST.Net) http://naudio.codeplex.com (NAudio)
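
    The bridging step a typical float-based plugin chain needs is converting the 16-bit PCM bytes NAudio returns into normalized floats for the plugin, and back again afterwards. A hedged sketch of just that conversion (the actual VST.Net processing call is omitted because its exact API depends on your VST.Net version):

        // Hedged sketch: assumes 16-bit PCM samples, as in a typical WAV file.
        static float[] BytesToFloats(byte[] buffer, int bytesRead)
        {
            var samples = new float[bytesRead / 2];
            for (int i = 0; i < samples.Length; i++)
            {
                short sample = BitConverter.ToInt16(buffer, i * 2);
                samples[i] = sample / 32768f;          // normalize to [-1, 1] for the plugin
            }
            return samples;
        }

        static byte[] FloatsToBytes(float[] samples)
        {
            var buffer = new byte[samples.Length * 2];
            for (int i = 0; i < samples.Length; i++)
            {
                // clamp and convert back to 16-bit PCM for WaveFileWriter
                short sample = (short)Math.Max(short.MinValue,
                                   Math.Min(short.MaxValue, samples[i] * 32768f));
                Buffer.BlockCopy(BitConverter.GetBytes(sample), 0, buffer, i * 2, 2);
            }
            return buffer;
        }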

    Read the article

  • Using generics with XmlSerializer

    - by MainMa
    Hi, When using XML serialization in C#, I use code like this: public MyObject LoadData() { XmlSerializer xmlSerializer = new XmlSerializer(typeof(MyObject)); using (TextReader reader = new StreamReader(settingsFileName)) { return (MyObject)xmlSerializer.Deserialize(reader); } } (and similar code for deserialization). It requires casting and is not really nice. Is there a way, directly in .NET Framework, to use generics with serialization? That is to say to write something like: public MyObject LoadData() { // Generics here. XmlSerializer<MyObject> xmlSerializer = new XmlSerializer(); using (TextReader reader = new StreamReader(settingsFileName)) { // No casts nevermore. return xmlSerializer.Deserialize(reader); } }
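
    The framework's XmlSerializer itself is non-generic, but the cast can be hidden once behind a small generic helper so call sites stay clean; a minimal sketch:

        // Minimal sketch; requires System.IO and System.Xml.Serialization.
        public static class Xml
        {
            public static T Load<T>(string fileName)
            {
                var serializer = new XmlSerializer(typeof(T));
                using (TextReader reader = new StreamReader(fileName))
                {
                    return (T)serializer.Deserialize(reader);
                }
            }
        }

        // Usage: no cast at the call site.
        // MyObject settings = Xml.Load<MyObject>(settingsFileName);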

    Read the article

  • Why does perl crash with "*** glibc detected *** perl: munmap_chunk(): invalid pointer"?

    - by sid_com
    #!/usr/bin/env perl use warnings; use strict; use 5.012; use XML::LibXML::Reader; my $reader = XML::LibXML::Reader->new( location => 'http://www.heise.de/' ) or die $!; while ( $reader->read ) { say $reader->name; } At the end of the output from this script I get this error-messages: * glibc detected * perl: munmap_chunk(): invalid pointer: 0x0000000000b362e0 * ======= Backtrace: ========= /lib64/libc.so.6[0x7fb84952fc76] ... ======= Memory map: ======== 00400000-0053d000 r-xp 00000000 08:01 182002 /usr/local/bin/perl ... Is this due a bug? perl -V: Summary of my perl5 (revision 5 version 12 subversion 0) configuration: Platform: osname=linux, osvers=2.6.31.12-0.2-desktop, archname=x86_64-linux uname='linux linux1 2.6.31.12-0.2-desktop #1 smp preempt 2010-03-16 21:25:39 +0100 x86_64 x86_64 x86_64 gnulinux ' config_args='-Dnoextensions=ODBM_File' hint=recommended, useposix=true, d_sigaction=define useithreads=undef, usemultiplicity=undef useperlio=define, d_sfio=undef, uselargefiles=define, usesocks=undef use64bitint=define, use64bitall=define, uselongdouble=undef usemymalloc=n, bincompat5005=undef Compiler: cc='cc', ccflags ='-fno-strict-aliasing -pipe -fstack-protector -I/usr/local/include -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64', optimize='-O2', cppflags='-fno-strict-aliasing -pipe -fstack-protector -I/usr/local/include' ccversion='', gccversion='4.4.1 [gcc-4_4-branch revision 150839]', gccosandvers='' intsize=4, longsize=8, ptrsize=8, doublesize=8, byteorder=12345678 d_longlong=define, longlongsize=8, d_longdbl=define, longdblsize=16 ivtype='long', ivsize=8, nvtype='double', nvsize=8, Off_t='off_t', lseeksize=8 alignbytes=8, prototype=define Linker and Libraries: ld='cc', ldflags =' -fstack-protector -L/usr/local/lib' libpth=/usr/local/lib /lib /usr/lib /lib64 /usr/lib64 /usr/local/lib64 libs=-lnsl -ldl -lm -lcrypt -lutil -lc perllibs=-lnsl -ldl -lm -lcrypt -lutil -lc libc=/lib/libc-2.10.1.so, so=so, useshrplib=false, libperl=libperl.a gnulibc_version='2.10.1' Dynamic Linking: dlsrc=dl_dlopen.xs, dlext=so, d_dlsymun=undef, ccdlflags='-Wl,-E' cccdlflags='-fPIC', lddlflags='-shared -O2 -L/usr/local/lib -fstack-protector' Characteristics of this binary (from libperl): Compile-time options: PERL_DONT_CREATE_GVSV PERL_MALLOC_WRAP USE_64_BIT_ALL USE_64_BIT_INT USE_LARGE_FILES USE_PERLIO USE_PERL_ATOF Built under linux Compiled at Apr 15 2010 13:25:46 @INC: /usr/local/lib/perl5/site_perl/5.12.0/x86_64-linux /usr/local/lib/perl5/site_perl/5.12.0 /usr/local/lib/perl5/5.12.0/x86_64-linux /usr/local/lib/perl5/5.12.0 .

    Read the article

  • Using StringBuilder to process csv files to save heap space

    - by portoalet
    I am reading a csv file that has about 50,000 lines and is 1.1 MiB in size (and can grow larger). In Code1, I use String to process the csv, while in Code2 I use StringBuilder (only one thread executes the code, so there are no concurrency issues). Using StringBuilder makes the code a little bit harder to read than using the normal String class. Am I prematurely optimizing things with StringBuilder in Code2 to save a bit of heap space and memory? Code1 fr = new FileReader(file); BufferedReader reader = new BufferedReader(fr); String line = reader.readLine(); while ( line != null ) { int separator = line.indexOf(','); String symbol = line.substring(0, separator); int begin = separator; separator = line.indexOf(',', begin+1); String price = line.substring(begin+1, separator); // Publish this update publisher.publishQuote(symbol, price); // Read the next line of fake update data line = reader.readLine(); } Code2 fr = new FileReader(file); StringBuilder stringBuilder = new StringBuilder(reader.readLine()); while( stringBuilder.toString() != null ) { int separator = stringBuilder.toString().indexOf(','); String symbol = stringBuilder.toString().substring(0, separator); int begin = separator; separator = stringBuilder.toString().indexOf(',', begin+1); String price = stringBuilder.toString().substring(begin+1, separator); publisher.publishQuote(symbol, price); stringBuilder.replace(0, stringBuilder.length(), reader.readLine()); }

    Read the article

  • Oracle Data Provider and casting

    - by mrjoltcola
    I use Oracle's specific data provider, not the Microsoft provider that is being discontinued. The thing I've found about ODP.NET is how picky it is with data types. Where JDBC and other ADO providers just convert and make things work, ODP.NET will throw an invalid cast exception unless you get it exactly right. Consider this code: String strSQL = "SELECT DOCUMENT_SEQ.NEXTVAL FROM DUAL"; OracleCommand cmd = new OracleCommand(strSQL, conn); reader = cmd.ExecuteReader(); if (reader != null && reader.Read()) { Int64 id = reader.GetInt64(0); return id; } Due to ODP.NET's pickiness on conversion, this doesn't work. My usual options are: 1) Retrieve into a Decimal and return it with a cast to an Int64 (I don't like this because Decimal is just overkill, and at least once I remember reading it was deprecated...) Decimal id = reader.GetDecimal(0); return (Int64)id; 2) Or cast in the SQL statement to make sure it fits into Int64, like NUMBER(18) String strSQL = "SELECT CAST(DOCUMENT_SEQ.NEXTVAL AS NUMBER(18)) FROM DUAL"; I do (2), because I feel it's just not clean pulling a number into a .NET Decimal when my domain types are Int32 or Int64. Other providers I've used are nice (smart) enough to do the conversion on the fly. Any suggestions from the ODP.NET gurus?
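
    A hedged sketch of a third option: read the value with ExecuteScalar and let Convert.ToInt64 deal with the decimal that ODP.NET hands back, so neither the SQL nor the domain type has to change:

        // Hedged sketch; OracleCommand here is the ODP.NET type (Oracle.DataAccess.Client),
        // and "conn" is the open connection from the question.
        const string sql = "SELECT DOCUMENT_SEQ.NEXTVAL FROM DUAL";
        using (var cmd = new OracleCommand(sql, conn))
        {
            // ExecuteScalar returns the Oracle NUMBER as a boxed decimal;
            // Convert.ToInt64 is indifferent to the exact numeric type underneath.
            long id = Convert.ToInt64(cmd.ExecuteScalar());
            return id;
        }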

    Read the article

  • PyQt4, QThread and opening big files without freezing the GUI

    - by jmrbcu
    Hi, I would like to ask how to read a big file from disk and keep the PyQt4 UI responsive (not blocked). I have moved the file loading to a QThread subclass, but my GUI thread gets frozen. Any suggestions? I think it must be something with the GIL but I don't know how to sort it out. EDIT: I am using vtkGDCMImageReader from the GDCM project to read a multiframe DICOM image and display it with vtk and pyqt4. I do this load in a different thread (QThread) but my app freezes until the image is loaded. Here is some example code: class ReadThread(QThread): def __init__(self, file_name): super(ReadThread, self).__init__() self.file_name = file_name self.reader = vtkgdcm.vtkGDCMImageReader() def run(self): self.reader.SetFileName(self.file_name) self.reader.Update() self.emit(QtCore.SIGNAL('image_loaded'), self.reader.GetOutput())

    Read the article

  • VisualBasic.net Database Boiler Plate

    - by Shiftbit
    Are there any built-in .NET classes to assist in the reduction of boilerplate code? I have numerous database operations going on, and I find that I am reproducing the connection, command, transaction and occasionally the data set. I am aware of the related Java question below; however, its solutions pertain to Java. I was wondering if anyone was aware of a .NET solution? http://stackoverflow.com/questions/1072925/remove-boilerplate-from-db-code Public Sub ReadData(ByVal connectionString As String) Dim queryString As String = "SELECT EmpNo, EName FROM Emp" Using connection As New OracleConnection(connectionString) Dim command As New OracleCommand(queryString, connection) connection.Open() Using reader As OracleDataReader = command.ExecuteReader() ' Always call Read before accessing data. While reader.Read() Console.WriteLine(reader.GetInt32(0).ToString() + ", " _ + reader.GetString(1)) End While End Using End Using End Sub (example from MSDN)
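
    As far as I know there is no built-in helper for exactly this, but the boilerplate can be factored into one small method that owns the connection, command and reader and hands each row to a delegate. A hedged C# sketch (the same shape translates directly to VB.NET; type names assume ODP.NET and System.Data):

        // Hedged sketch; requires System, System.Data and Oracle.DataAccess.Client.
        public static class Db
        {
            public static void Query(string connectionString, string sql, Action<IDataRecord> onRow)
            {
                using (var connection = new OracleConnection(connectionString))
                using (var command = new OracleCommand(sql, connection))
                {
                    connection.Open();
                    using (IDataReader reader = command.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            onRow(reader);   // caller supplies only the per-row logic
                        }
                    }
                }
            }
        }

        // Usage:
        Db.Query(connectionString, "SELECT EmpNo, EName FROM Emp",
                 row => Console.WriteLine(row.GetInt32(0) + ", " + row.GetString(1)));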

    Read the article

  • ASP.Net HttpModule List<> content

    - by test
    We have created an HTTP module that loads data into a list on Init, and uses that list on every EndRequest event of the application. However, we got a null reference exception in the module at the EndRequest event complaining that the content is null. Yet we are sure that at module initialization we put non-null objects in correctly. How is this possible? Any ideas? Thanks in advance. The place where I fill the data into the list named "Pages": using (SqlDataReader reader = cmd.ExecuteReader()) { while (reader.Read()) { LandingPageDetails details = new LandingPageDetails(); details.DomainId = Convert.ToInt32(reader["DomainId"]); details.PageId = Convert.ToInt32(reader["TrackingPageId"]); details.PageAddress = Convert.ToString(reader["PageAddress"]); if (!string.IsNullOrEmpty(details.PageAddress)) { Pages.Add(details); } } } // Find the related record on the EndRequest event LandingPageDetails result = Pages.Find ( delegate(LandingPageDetails current) { return current.PageAddress.Equals(url, StringComparison.InvariantCultureIgnoreCase); } ); The line where we get the error is return current.PageAddress.Equals(url, StringComparison.InvariantCultureIgnoreCase); and "current" is the null object. P.S.: It works properly, but one day later the contents become null. Possibly there is something wrong in IIS. Our server is Windows 2003, IIS 6.0.
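
    One common cause is that HttpModules are instantiated many times and the application domain can recycle, so state set once in Init can disappear later. A hedged sketch of a static, lock-protected list that is repopulated whenever it is found empty (LandingPageDetails and LoadPagesFromDatabase stand in for your own code):

        // Hedged sketch; requires System, System.Collections.Generic and System.Web.
        public class LandingPageModule : IHttpModule
        {
            private static readonly object SyncRoot = new object();
            private static List<LandingPageDetails> _pages;

            public void Init(HttpApplication context)
            {
                context.EndRequest += (sender, e) =>
                {
                    var app = (HttpApplication)sender;
                    LandingPageDetails match = GetPages().Find(p =>
                        p.PageAddress != null &&
                        p.PageAddress.Equals(app.Request.Url.AbsoluteUri,
                                             StringComparison.InvariantCultureIgnoreCase));
                    // ... use match ...
                };
            }

            private static List<LandingPageDetails> GetPages()
            {
                lock (SyncRoot)
                {
                    if (_pages == null || _pages.Count == 0)
                    {
                        _pages = LoadPagesFromDatabase();   // assumed helper, not from the question
                    }
                    return _pages;
                }
            }

            public void Dispose() { }
        }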

    Read the article

  • How to ignore comments when reading a XML file into a XmlDocument?

    - by tunnuz
    Possible duplicate: How to remove all comment tags from XmlDocument Hello, I am trying to read an XML document with C#. I am doing it this way: XmlDocument myData = new XmlDocument(); myData.Load("datafile.xml"); Anyway, I sometimes get comments when reading XmlNode.ChildNodes. For the benefit of anyone experiencing the same requirement, here's how I did it in the end: /** Validate a file, return an XmlDocument, exclude comments */ private XmlDocument LoadAndValidate( String fileName ) { // Create XML reader settings XmlReaderSettings settings = new XmlReaderSettings(); settings.IgnoreComments = true; // Exclude comments settings.ProhibitDtd = false; settings.ValidationType = ValidationType.DTD; // Validation // Create reader based on settings XmlReader reader = XmlReader.Create(fileName, settings); try { // Will throw exception if document is invalid XmlDocument document = new XmlDocument(); document.Load(reader); return document; } catch (XmlSchemaException) { return null; } } Thank you Tommaso

    Read the article

  • Passing Byte (VB.NET)

    - by yae
    Hi, I need to pass an encoded string to a PHP page. To convert the string to ISO: Dim result As Byte() = Encoding.Convert(Encoding.UTF8, Encoding.GetEncoding("iso-8859-1"), input) I have this code to pass a string, but how must I change it to pass the Byte array (variable result) instead of the string (variable MyVarString)? Dim client As WebClient Dim data As Stream Dim reader As StreamReader Dim baseurl As String baseurl = "http://example.com/api/mypage2.php" client = New WebClient client.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)") client.QueryString.Add("mensaje", MyVarString) data = client.OpenRead(baseurl) reader = New StreamReader(data) s = reader.ReadToEnd() data.Close() reader.Close() Etc.
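
    A hedged C# sketch of two ways to send the bytes (straightforward to translate to VB.NET); how the PHP side reads the parameter is an assumption:

        // Hedged sketch; requires System.Net, System.Text and System.Web (for HttpUtility).
        byte[] result = Encoding.Convert(Encoding.UTF8, Encoding.GetEncoding("iso-8859-1"),
                                         Encoding.UTF8.GetBytes(myVarString));

        using (var client = new WebClient())
        {
            client.Headers.Add("user-agent", "Mozilla/4.0 (compatible; ...)");

            // Option 1: POST the bytes as the request body.
            byte[] responseBytes = client.UploadData("http://example.com/api/mypage2.php", result);

            // Option 2: keep using the query string, URL-encoding the bytes.
            string query = "mensaje=" + HttpUtility.UrlEncode(result);
            string response = client.DownloadString("http://example.com/api/mypage2.php?" + query);
        }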

    Read the article

  • Multithreading in PHP

    - by Jack
    I am working on Windows. I am building a Twitter application which periodically checks for new tweets as well as allows users to update their status. I have written separate PHP files for reading tweets (reader.php) and writing tweets (writer.php). The only problem is how to periodically read the tweets. There are a few ways I can think of: 1) Use a time-based job scheduler (like cron) to periodically run reader.php. How do I do this? 2) Use multithreading to run both reader.php and writer.php, and use a timer function in reader.php. Suggestions?

    Read the article

  • RDF-raptor-parser

    - by naveen
    Hi All, I am trying to parse an RDF file but I am getting an error while executing the following code on Ubuntu: RDF::Reader.open("http://datagraph.org/jhacker/foaf.rdf") do |reader| reader.each_statement do |statement| puts statement.inspect end end It fails with: LoadError: Could not open library 'libraptor': libraptor: cannot open shared object file: No such file or directory. Could not open library 'libraptor.so': libraptor.so: cannot open shared object file: No such file or directory I installed all the required gems: rdf rdf-raptor ffi rdf-json rdf-trix Please help me rectify this problem. Thanks in advance, Naveenkumr.R

    Read the article

  • Can shared memory be read and validated without mutexes?

    - by Bribles
    On Linux I'm using shmget and shmat to set up a shared memory segment that one process will write to and one or more processes will read from. The data that is being shared is a few megabytes in size and when updated is completely rewritten; it's never partially updated. I have my shared memory segment laid out as follows:

        -------------------------
        | t0 | actual data | t1 |
        -------------------------

    where t0 and t1 are copies of the time when the writer began its update (with enough precision such that successive updates are guaranteed to have differing times). The writer first writes to t1, then copies in the data, then writes to t0. The reader on the other hand reads t0, then the data, then t1. If the reader gets the same value for t0 and t1 then it considers the data consistent and valid; if not, it tries again. Does this procedure ensure that if the reader thinks the data is valid then it actually is? Do I need to worry about out-of-order execution (OOE)? If so, would the reader using memcpy to get the entire shared memory segment overcome the OOE issues on the reader side? (This assumes that memcpy performs its copy linearly and ascending through the address space. Is that assumption valid?)

    Read the article

  • How to find out whether SqlCe query Has rows?

    - by Tomas
    Hi, In my simple DB I use SqlCe and I cannot figure out how to correctly find out whether the query has rows or not. HasRows does not work. So far I have this: _DbCommand.CommandText = "SELECT * FROM X"; SqlCeDataReader reader = _DbCommand.ExecuteReader(); if (reader.FieldCount != 0) // I thought it could work (0 rows - 0 fields?), but it's true even with 0 rows { while (reader.Read()) { // } } Thanks
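
    Since HasRows is not usable here, the simplest check is to let the first Read() tell you whether anything came back. A hedged sketch:

        // Hedged sketch; FieldCount describes columns, not rows, so track the result of Read() instead.
        _DbCommand.CommandText = "SELECT * FROM X";
        using (SqlCeDataReader reader = _DbCommand.ExecuteReader())
        {
            bool hasRows = false;
            while (reader.Read())
            {
                hasRows = true;
                // process the row
            }

            if (!hasRows)
            {
                // handle the empty result set
            }
        }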

    Read the article

  • Using application roles with DataReader

    - by Shahar
    I have an application that should use an application role from the database. I'm trying to make this work with queries that are actually run using SubSonic (2). To do this, I created my own DataProvider, which inherits from SubSonic's SqlDataProvider. It overrides the CreateConnection function and calls sp_setapprole to set the application role after the connection is created. This part works fine, and I'm able to get data using the application role. The problem comes when I try to unset the application role. I couldn't find any place in the code where my provider is called after the query is done, so I tried to add my own hook by changing the SubSonic code. The problem is that SubSonic uses a data reader. It loads data from the data reader, and then closes it. If I unset the application role before the data reader is closed, I get an error saying: There is already an open DataReader associated with this Command which must be closed first. If I unset the application role after the data reader is closed, I get an error saying ExecuteNonQuery requires an open and available Connection. The connection's current state is closed. I can't seem to find a way to close the data reader without closing the connection.
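
    A hedged sketch of the sequence that avoids both errors: dispose only the reader (the connection stays open), then unset the role with the cookie captured when sp_setapprole was called. How that cookie reaches this code is an assumption about your provider:

        // Hedged sketch; requires System.Data and System.Data.SqlClient.
        using (SqlCommand query = new SqlCommand("SELECT ...", connection))
        {
            using (SqlDataReader reader = query.ExecuteReader())   // no CommandBehavior.CloseConnection
            {
                while (reader.Read())
                {
                    // consume the rows
                }
            }   // reader disposed here; the connection stays open
        }

        using (SqlCommand unset = new SqlCommand("sp_unsetapprole", connection))
        {
            unset.CommandType = CommandType.StoredProcedure;
            unset.Parameters.AddWithValue("@cookie", appRoleCookie);   // varbinary returned by sp_setapprole
            unset.ExecuteNonQuery();
        }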

    Read the article

< Previous Page | 19 20 21 22 23 24 25 26 27 28 29 30  | Next Page >