Search Results

Search found 16604 results on 665 pages for 'jd long'.

  • .NET MVC - RESTful URLs - The specified path, file name, or both are too long

    - by Truegilly
    Hello, I'm creating an MVC application that follows a RESTful URL approach. I am experiencing the following error: "The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters." This error occurs when my URL length is 225 characters. Surely I can have much longer URLs without this problem, and doesn't this limit relate to file paths rather than URLs? I'm sure some of you MVC guys have experienced this ;) Is there a way around it? Where am I going wrong? Thank you for your time. Truegilly

    Read the article

  • Correct way to textually report the remaining time on a long running process?

    - by Ryan
    So you have a long-running process, perhaps with a progress bar, and you want a text estimate of the remaining time, e.g. "5 minutes remaining", "30 seconds remaining", etc. If you don't actually want to report clock time (due to accuracy, resolution or update-rate issues) but want to stick to the text summary, what is the correct paradigm? Is "one minute left" displayed from 0 to 60 seconds, or from 1:00 to 1:59? Say there's 1:35 left - is that "2 minutes remaining" or "1 minute remaining"? Do you just pare it down to "a few minutes left" when there are less than 3 minutes? What is the preferred (least user-frustrating) method?
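
    For illustration, here is a minimal C# sketch of one rounding-up convention (the estimate never undershoots, and the noisy one-to-three-minute range collapses into a deliberately vague phrase). The method name, thresholds, and wording are illustrative assumptions, not an established standard:

      using System;

      static class ProgressText
      {
          public static string Remaining(TimeSpan remaining)
          {
              if (remaining <= TimeSpan.Zero)
                  return "almost done";
              if (remaining.TotalSeconds <= 60)
                  // under a minute, report seconds, rounded up
                  return string.Format("{0} seconds remaining", (int)Math.Ceiling(remaining.TotalSeconds));
              if (remaining.TotalMinutes <= 3)
                  // 1:00-3:00 is kept deliberately vague
                  return "a few minutes remaining";
              // beyond that, whole minutes rounded up: 4:10 reads as "5 minutes remaining"
              return string.Format("{0} minutes remaining", (int)Math.Ceiling(remaining.TotalMinutes));
          }
      }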

    Read the article

  • MySQL null/empty string too long for VarChar(5)?

    - by rlb.usa
    I have a stored procedure that takes an input var_zip varchar(5). My webapp calls this procedure and gives it the parameter:

      cmd.Parameters.Add(new MySqlParameter("var_Zip", MySqlDbType.VarChar, 5));
      cmd.Parameters["var_Zip"].Value = txtZip.Text;

    txtZip is empty, so it is passing either the empty string '' or null - it doesn't matter to me which. But it is throwing an error:

      MySql.Data.MySqlClient.MySqlException: Data too long for column 'var_Zip' at row 1

    What's going on and how can I fix it? I don't understand.
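
    For what it's worth, a defensive C# sketch around the same MySqlParameter call, sending SQL NULL instead of an empty string and never passing more than five characters; this is an illustrative workaround under those assumptions, not a diagnosis of why the server reports "Data too long":

      var zip = (txtZip.Text ?? string.Empty).Trim();
      var zipParam = new MySqlParameter("var_Zip", MySqlDbType.VarChar, 5);
      zipParam.Value = zip.Length == 0
          ? (object)DBNull.Value                              // send NULL rather than ''
          : zip.Substring(0, Math.Min(5, zip.Length));        // never longer than varchar(5)
      cmd.Parameters.Add(zipParam);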

    Read the article

  • Catch F5 in Silverlight?

    - by JD
    Hi to all, I have an app that uses the navigation controls. What I would like is for my app to stay on the page it is showing, rather than reloading the whole application, when F5 (refresh) is pressed. Is this at all possible? JD.

    Read the article

  • Microsoft Silverlight cannot be used in browsers running in 64 bit mode.

    - by JD
    Hi, I have taken ownership of an application which is a .NET WinForms application that hosts a System.Windows.Forms.WebBrowser. When the application launches it makes an HTTP request to load a XAP file. I immediately see the "install Silverlight" icon, and on clicking it I get: "Microsoft Silverlight cannot be used in browsers running in 64 bit mode". The app was originally written on a 32-bit machine, though I have a 64-bit machine. Any ideas what I need to tweak to get this running? JD

    Read the article

  • SketchFlow - removing a component

    - by JD
    Hi, I have a screen with a section where a component screen is inserted. I have a cancel button on the component screen, and I was wondering whether it is possible to remove the component from the main screen using that cancel button. Is this at all possible? So, once the component is shown, cancel on the component screen removes it and the first screen is shown again. JD P.S. I am using Blend 3.

    Read the article

  • Why are attached property Set and Get static methods not called from XAML?

    - by JD
    Hi, I have set breakpoints on my attached property's SetXXX and GetXXX static methods. In XAML, I have assigned values to the attached property. However, I was expecting the Set or Get static methods to be called, but they are not. The attached property works as expected, and if I call the SetXXX and GetXXX methods in code, then it works as expected. Why are the methods not called when the value is set from XAML? JD.
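
    For context, the XAML parser sets attached properties by calling SetValue on the DependencyProperty directly, so the static SetXXX/GetXXX wrappers are bypassed; logic that must run when the value changes belongs in a PropertyChangedCallback instead. A minimal C# sketch (the class and property names here are made up for illustration):

      using System.Windows;

      public static class Watermark
      {
          public static readonly DependencyProperty TextProperty =
              DependencyProperty.RegisterAttached(
                  "Text", typeof(string), typeof(Watermark),
                  new PropertyMetadata(null, OnTextChanged)); // fires for XAML and code alike

          public static void SetText(DependencyObject d, string value) { d.SetValue(TextProperty, value); }
          public static string GetText(DependencyObject d) { return (string)d.GetValue(TextProperty); }

          private static void OnTextChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
          {
              // react to the new value here rather than in the static wrappers
          }
      }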

    Read the article

  • DataContext to DB

    - by JD
    Hi all, I have designed my DB using the ORM designer in VS 2008. What is the best way to export this to SQL Server so that it creates the tables and relations there? Thanks, JD
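
    Assuming the designer generated a LINQ to SQL DataContext, one option is its built-in CreateDatabase method, which builds the tables and relations from the mapping; a short C# sketch (the context class name and connection string are placeholders):

      using System.Data.Linq;

      var connection = @"Data Source=.\SQLEXPRESS;Initial Catalog=MyShop;Integrated Security=True";
      using (var db = new MyModelDataContext(connection))   // designer-generated context (hypothetical name)
      {
          if (!db.DatabaseExists())
              db.CreateDatabase();   // creates the catalog, tables, and foreign keys from the mapping
      }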

    Read the article

  • Saving Russian text in SQL Server

    - by JD
    Hi, I have a database (SQL Server Express 2008) which has a column that is defined as text. When we try to store some text which is in Chinese, it is not saved. I read that the field should be ntext, so I will now have to convert the column in my table to ntext. Would I have to do anything with the collation of the database, which is set to Latin? JD
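
    Assuming the column is changed to ntext (or nvarchar(max)), the value also has to reach the server as Unicode; a small ADO.NET sketch in C# (table and column names, the connection, and the text variable are placeholders). The database collation then mainly affects sorting and comparison rather than whether the characters survive:

      using System.Data;
      using System.Data.SqlClient;

      using (var cmd = new SqlCommand("INSERT INTO Articles (Body) VALUES (@body)", connection))
      {
          // NVarChar with size -1 maps to nvarchar(max) and keeps the text as Unicode end to end
          cmd.Parameters.Add("@body", SqlDbType.NVarChar, -1).Value = russianText;
          cmd.ExecuteNonQuery();
      }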

    Read the article

  • When do we have a virtual memory problem using FastMM4?

    - by JD
    Hi, We have an application whose virtual memory rises and keeps rising for over a day. After two days it has climbed to about 500 MB. I have tried profiling the application, which hits a database as well as making lots of HTTP and SOAP requests, but FastMM4 shows there are no leaks. I am not sure how or when memory is reclaimed, and whether there is a problem here with the rising virtual memory. JD

    Read the article

  • How to upload a file with jQuery Mobile on an iPhone/iPad?

    - by JD
    Hi, I have just started looking at jQuery Mobile and want to know if the following is possible. I want to be able to select a file (similar to a file input in HTML) and then show some sort of dialogue that allows me to browse to a "folder" and select the file. It would then be up to me to send some sort of AJAX request to upload it to the server (where I am using ASP.NET MVC 2). Since I do not have access to an iPhone or iPad, I was told that I would only have access to the documents folder. JD
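
    On the ASP.NET MVC 2 side, a minimal C# sketch of an action that could receive such an upload (the controller name, action name, and save location are illustrative assumptions; this says nothing about whether the mobile browser will actually offer a file picker):

      using System.IO;
      using System.Web;
      using System.Web.Mvc;

      public class UploadController : Controller
      {
          [HttpPost]
          public ActionResult Save(HttpPostedFileBase file)   // bound from a form field named "file"
          {
              if (file != null && file.ContentLength > 0)
              {
                  // store under ~/App_Data for illustration, keeping the original file name
                  var path = Path.Combine(Server.MapPath("~/App_Data"), Path.GetFileName(file.FileName));
                  file.SaveAs(path);
              }
              return Json(new { saved = file != null });
          }
      }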

    Read the article

  • How to convert binary read/write to non-binary read/write in C++

    - by Phenom
    I have some C++ code from somewhere that reads and writes data in binary format. I want to see what it's reading and writing in the file, so I want to convert its binary read and write to non-binary read and write. Also, when I convert the binary write to non-binary write, I want it to still be able to read the information back in correctly. How can this be done? The write function:

      int btwrite(short rrn, BTPAGE *page_ptr)
      {
          // long lseek(), addr;
          long addr;
          addr = (long) rrn * (long) PAGESIZE + HEADERSIZE;
          lseek(btfd, addr, 0);
          return (write(btfd, page_ptr, PAGESIZE));
      }

    The read function:

      int btread(short rrn, BTPAGE *page_ptr)
      {
          // long lseek(), addr;
          long addr;
          addr = (long) rrn * (long) PAGESIZE + HEADERSIZE;
          lseek(btfd, addr, 0);
          return (read(btfd, page_ptr, PAGESIZE));
      }

    Read the article

  • Can I spread out a long-running stored proc across multiple CPUs?

    - by Russ
    [Also on SuperUser - http://superuser.com/questions/116600/can-i-spead-out-a-long-running-stored-proc-accross-multiple-cpus] I have a stored procedure in SQL Server that gets and decrypts a block of data (credit cards in this case). Most of the time the performance is tolerable, but there are a couple of customers where the process is painfully slow, taking literally 1 minute to complete. (Well, 59377 ms to return from SQL Server to be exact, but it can vary by a few hundred ms based on load.) When I watch the process, I see that SQL is only using a single proc to perform the whole process, and typically only proc 0. Is there a way I can change my stored proc so that SQL can multi-thread the process? Is it even feasible to cheat and break the calls in half (top 50%, bottom 50%) and spread the load, as a gross hack? (Just spit-balling here.) My stored proc:

      USE [Commerce]
      GO
      /****** Object: StoredProcedure [dbo].[GetAllCreditCardsByCustomerId] Script Date: 03/05/2010 11:50:14 ******/
      SET ANSI_NULLS ON
      GO
      SET QUOTED_IDENTIFIER ON
      GO
      ALTER PROCEDURE [dbo].[GetAllCreditCardsByCustomerId]
          @companyId UNIQUEIDENTIFIER,
          @DecryptionKey NVARCHAR (MAX)
      AS
      SET NoCount ON

      DECLARE @cardId uniqueidentifier
      DECLARE @tmpdecryptedCardData VarChar(MAX);
      DECLARE @decryptedCardData VarChar(MAX);
      DECLARE @tmpTable as Table
      (
          CardId uniqueidentifier,
          DecryptedCard NVarChar(Max)
      )

      DECLARE creditCards CURSOR FAST_FORWARD READ_ONLY FOR
          Select cardId from CreditCards
          where companyId = @companyId and Active=1
          order by addedBy desc --2

      OPEN creditCards --3
      FETCH creditCards INTO @cardId -- prime the cursor

      WHILE @@Fetch_Status = 0
      BEGIN
          --OPEN creditCards
          DECLARE creditCardData CURSOR FAST_FORWARD READ_ONLY FOR
              select convert(nvarchar(max), DecryptByCert(Cert_Id('Oh-Nay-Nay'), EncryptedCard, @DecryptionKey))
              FROM CreditCardData
              where cardid = @cardId
              order by valueOrder

          OPEN creditCardData
          FETCH creditCardData INTO @tmpdecryptedCardData -- prime the cursor

          WHILE @@Fetch_Status = 0
          BEGIN
              print 'CreditCardData'
              print @tmpdecryptedCardData
              set @decryptedCardData = ISNULL(@decryptedCardData, '') + @tmpdecryptedCardData
              print '@decryptedCardData'
              print @decryptedCardData;
              FETCH NEXT FROM creditCardData INTO @tmpdecryptedCardData -- fetch next
          END

          CLOSE creditCardData
          DEALLOCATE creditCardData

          insert into @tmpTable (CardId, DecryptedCard) values ( @cardId, @decryptedCardData )
          set @decryptedCardData = ''

          FETCH NEXT FROM creditCards INTO @cardId -- fetch next
      END

      select CardId, DecryptedCard FROM @tmpTable

      CLOSE creditCards
      DEALLOCATE creditCards

    Read the article

  • Most efficient way to handle coordinate maps in Java

    - by glowcoder
    I have a rectangular tile-based layout. It's your typical Cartesian system. I would like to have a single class that handles two lookup styles: (1) get me the set of players at position X,Y; (2) get me the position of the player with key K. My current implementation is this:

      class CoordinateMap<V> {
          Map<Long,Set<V>> coords2value;
          Map<V,Long> value2coords;

          // convert (int x, int y) to long key - this is tested, works for all values -1bil to +1bil
          // My map will NOT require more than 1 bil tiles from the origin :)
          private Long keyFor(int x, int y) {
              int kx = x + 1000000000;
              int ky = y + 1000000000;
              return (long)kx | (long)ky << 32;
          }

          // extract the x and y from the keys
          private int[] coordsFor(long k) {
              int x = (int)(k & 0xFFFFFFFF) - 1000000000;
              int y = (int)((k >>> 32) & 0xFFFFFFFF) - 1000000000;
              return new int[] { x, y };
          }
      }

    From there, I proceed to have other methods that manipulate or access the two maps accordingly. My question is... is there a better way to do this? Sure, I've tested my class and it works fine. And sure, something inside tells me that if I want to reference the data by two different keys, I need two different maps. But I can also bet I'm not the first to run into this scenario. Thanks!

    Read the article

  • How do I update a progress bar in Cocoa during a long running loop?

    - by Nic
    Hi, I've got a while loop that runs for many seconds, and that's why I want to update a progress bar (NSProgressIndicator) during that process, but it updates only once, after the loop has finished. The same happens if I want to update a label text, by the way. I believe my loop prevents other things in the application from happening. There must be another technique. Does this have to do with threads or something? Am I on the right track? Can someone please give me a simple example of how to "optimize" my application? My application is a Cocoa application (Xcode 3.2.1) with these two methods in my Example_AppDelegate.m:

      // This method runs when a start button is clicked.
      - (IBAction)startIt:(id)sender {
          [progressbar setDoubleValue:0.0];
          [progressbar startAnimation:sender];
          running = YES; // this is an instance variable
          int i = 0;
          while (running) {
              if (i++ >= processAmount) { // processAmount is something like 1000000
                  running = NO;
                  continue;
              }
              // Update progress bar
              double progr = (double)i / (double)processAmount;
              NSLog(@"progr: %f", progr); // Logs values between 0.0 and 1.0
              [progressbar setDoubleValue:progr];
              [progressbar needsDisplay]; // Do I need this?
              // Do some more hard work here...
          }
      }

      // This method runs when a stop button is clicked, but as long
      // as -startIt is busy, a click on the stop button does nothing.
      - (IBAction)stopIt:(id)sender {
          NSLog(@"Stop it!");
          running = NO;
          [progressbar stopAnimation:sender];
      }

    I'm really new to Objective-C, Cocoa and applications with a UI. Thank you very much for any helpful answer.

    Read the article

  • jQuery bug?? Long decimal number after multiplying two numbers...

    - by Jerry
    Hi all, I am working on a shopping site and I am trying to calculate the subtotal of products. I get my price from an array and the quantity from a getJSON response array. Multiplying the two gives my subtotal, and changing the quantity gives a different subtotal. However, when I change the quantity to a certain number, the final subtotal comes out like 259.99999999994 or some other long decimal number. I use console.log to check the $price and $qty; both are in the correct format, e.g. 299.99 and a quantity of 6. I have no idea what happened and would appreciate it if someone could help me with it. Here is my jQuery code:

      $(".price").each(function(index, price){
          $price = $(this);
          // get the product id and the price shown on the page
          var id = $price.closest('tr').attr('id');
          var indiPrice = $($price).html();
          // take off $
          indiPrice = indiPrice.substring(1);
          // make sure it is number format
          var aindiPrice = Number(indiPrice);
          // push into the array
          productIdPrice[id] = aindiPrice;

          var url = "update.php";
          $.getJSON(url,
              { productId: tableId,   // tableId is from the other jQuery code which refers to productId
                qty: qty },
              function(responseProduct){
                  $.each(responseProduct, function(productIndex, Qty){   // loop over the returned data
                      if(productIdPrice[productIndex]){
                          // the price from the array we created above X Qty
                          newSub = productIdPrice[productIndex] * Number(Qty);
                          // productIdPrice[productIndex] is a price like 199.99 or 99.99
                          // Qty is a quantity like 9, 10 or 3
                          sum += newSub;
                          newSub.toFixed(2);   // tried to solve the problem with toFixed, but it didn't work
                          console.log("id: " + productIdPrice[productIndex]);
                          console.log("Qty: " + Qty);
                          console.log(newSub);   // newSub sometimes becomes XXXX.96999999994
                      }
                  });
              });
      });

    Thanks again!

    Read the article

  • Why can't I save a long text in my MySQL database?

    - by DomingoSL
    I'm trying to save to my database a long text (about 2500 chars) entered by my users in a web form and passed to the server using PHP. When I look in phpMyAdmin, the text gets cropped. How can I configure my table in order to keep the complete text? This is my table config:

      CREATE TABLE `extra_879` (
          `id` bigint(20) NOT NULL auto_increment,
          `id_user` bigint(20) NOT NULL,
          `title` varchar(300) NOT NULL,
          `content` varchar(3000) NOT NULL,
          PRIMARY KEY (`id`),
          UNIQUE KEY `id_user` (`id_user`)
      ) ENGINE=MyISAM DEFAULT CHARSET=latin1 AUTO_INCREMENT=4 ;

    Take a look at the field content, which has a limit of 3000 chars, yet the text always gets cropped at 690 chars. Thanks for any help!

    EDIT: I found the problem but I don't know how to solve it. The query always gets cropped at the same character, a special char: ù

    EDIT 2: This is the cropped query:

      INSERT INTO extra_879 (id,id_user,title,content) VALUES (NULL,'1','Informazione Extra',' Riconoscimenti Laurea di ingegneria presa a le 22 anni e in il terso posto della promozione Diploma analista di sistemi ottenuto il rating massimo 20/20, primo posto della promozione. Borsa di Studio (offerta dal Ministero Esteri Italiano) vinta nel 2010 (Valutazione del territorio attraverso le nueve tecnologie) Pubblicazione di paper; Stima del RCS della nave CCGS radar sulla base dei risultati di H. Leong e H. Wilson. http://www.ing.uc.edu.vek-azozayalarchivospdf/PAPER-Sarmiento.pdf Tesi di laurea: PROGETTAZIONE E REALIZZAZIONE DI UN SIS-TEMA DI TELEMETRIA GSM PER IL CONTROLLO DELLO STATO DI TRANSITO VEICOLARE E CLIMA (ottenuto il punteggio pi')

    It gets cropped right at the "(ottenuto il punteggio più alto)" phrase, just when ù appears...

    EDIT 3: I am using jQuery + AJAX to send the query:

      $.ajax({type: "POST",
          url: "handler.php",
          data: "e_text=" + $('#e_text').val() + "&e_title=" + $('#extra_title').val(),

    Read the article

  • PHP: What is an efficient way to parse a text file containing very long lines?

    - by Shaun
    I'm working on a parser in php which is designed to extract MySQL records out of a text file. A particular line might begin with a string corresponding to which table the records (rows) need to be inserted into, followed by the records themselves. The records are delimited by a backslash and the fields (columns) are separated by commas. For the sake of simplicity, let's assume that we have a table representing people in our database, with fields being First Name, Last Name, and Occupation. Thus, one line of the file might be as follows [People] = "\Han,Solo,Smuggler\Luke,Skywalker,Jedi..." Where the ellipses (...) could be additional people. One straightforward approach might be to use fgets() to extract a line from the file, and use preg_match() to extract the table name, records, and fields from that line. However, let's suppose that we have an awful lot of Star Wars characters to track. So many, in fact, that this line ends up being 200,000+ characters/bytes long. In such a case, taking the above approach to extract the database information seems a bit inefficient. You have to first read hundreds of thousands of characters into memory, then read back over those same characters to find regex matches. Is there a way, similar to the Java String next(String pattern) method of the Scanner class constructed using a file, that allows you to match patterns in-line while scanning through the file? The idea is that you don't have to scan through the same text twice (to read it from the file into a string, and then to match patterns) or store the text redundantly in memory (in both the file line string and the matched patterns). Would this even yield a significant increase in performance? It's hard to tell exactly what PHP or Java are doing behind the scenes.

    Read the article

  • BCrypt says long, similar passwords are equivalent - problem with me, the gem, or the field of cryptography?

    - by PreciousBodilyFluids
    I've been experimenting with BCrypt, and found the following. If it matters, I'm running ruby 1.9.2dev (2010-04-30 trunk 27557) [i686-linux].

      require 'bcrypt' # bcrypt-ruby gem, version 2.1.2

      @long_string_1 = 'f287ed6548e91475d06688b481ae8612fa060b2d402fdde8f79b7d0181d6a27d8feede46b833ecd9633b10824259ebac13b077efb7c24563fce0000670834215'
      @long_string_2 = 'f6ebeea9b99bcae4340670360674482773a12fd5ef5e94c7db0a42800813d2587063b70660294736fded10217d80ce7d3b27c568a1237e2ca1fecbf40be5eab8'

      def salted(string)
        @long_string_1 + string + @long_string_2
      end

      encrypted_password = BCrypt::Password.create(salted('password'), :cost => 10)
      puts encrypted_password
      #=> $2a$10$kNMF/ku6VEAfLFEZKJ.ZC.zcMYUzvOQ6Dzi6ZX1UIVPUh5zr53yEu

      password = BCrypt::Password.new(encrypted_password)
      puts password.is_password?(salted('password')) #=> true
      puts password.is_password?(salted('passward')) #=> true
      puts password.is_password?(salted('75747373')) #=> true
      puts password.is_password?(salted('passwor'))  #=> false

    At first I thought that once the passwords got to a certain length, the dissimilarities would just be lost in all the hashing, and only if they were very dissimilar (i.e. a different length) would they be recognized as different. That didn't seem very plausible to me, from what I know of hash functions, but I didn't see a better explanation. Then I tried shortening each of the long strings to see where BCrypt would start being able to tell them apart, and I found that if I shortened each of the long strings to 100 characters or so, the final attempt ('passwor') would start returning true as well. So now I don't know what to think. What's the explanation for this?

    Read the article

  • In JavaScript, is it true that function aliasing works as long as the function being aliased doesn't use this?

    - by Jian Lin
    In JavaScript, if we are aliasing a function, such as in:

      f = g;
      f = obj.display;
      obj.f = foo;

    will all 3 lines above work as long as the function / method on the right-hand side doesn't touch this? Since we are passing in all the arguments, the only way it can mess up is when the function / method on the right uses this? Actually, line 1 is probably OK if g is also a property of window; if g is referencing obj.display, then the same problem is there. In line 2, when obj.display touches this, it is meant to be obj, but when f() is invoked, the this is window, so they are different. In line 3, it is the same: when f() is invoked inside obj's code, then the this is obj, while foo might be using this to refer to window if it were a property of window (a global function). So line 2 can be written as:

      f = function() { obj.display.apply(obj, arguments) }

    and line 3:

      obj.f = function() { foo.apply(window, arguments) }

    Is this the correct method, and are there other methods besides this?

    Read the article
