Search Results

Search found 8367 results on 335 pages for 'temporal difference'.

Page 10/335

  • Thread vs async execution. What's different?

    - by Eonil
    I believed that any kind of asynchronous execution creates a thread invisibly behind the scenes. But if that were so, async code would offer no performance gain over threaded code. Yet I can't understand why so many developers are making so many features asynchronous. Could you explain the difference between the two approaches and their costs?
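
    A minimal Python sketch of the distinction (illustrative only; the question itself is language-agnostic): an event loop multiplexes many pending I/O waits on a single thread, while a thread-per-task design pays for a stack and scheduler context switches per task. Async tends to win when tasks spend most of their time waiting, not computing.

    import asyncio
    import threading
    import time

    def blocking_task(delay: float) -> None:
        time.sleep(delay)           # each waiting task still occupies a whole OS thread

    async def async_task(delay: float) -> None:
        await asyncio.sleep(delay)  # yields to the event loop; no thread is blocked

    def run_threaded(n: int) -> None:
        threads = [threading.Thread(target=blocking_task, args=(1.0,)) for _ in range(n)]
        for th in threads:
            th.start()
        for th in threads:
            th.join()

    async def run_async(n: int) -> None:
        await asyncio.gather(*(async_task(1.0) for _ in range(n)))

    if __name__ == "__main__":
        run_threaded(100)            # 100 OS threads: 100 stacks plus scheduler churn
        asyncio.run(run_async(100))  # one thread, one event loop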

    Read the article

  • C# compare algorithms

    - by public static
    Hi, are there any open-source algorithms in C# that solve the problem of computing a difference between two text files? It would be super cool if it also had some way of highlighting exactly which areas changed in the text document.
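
    Open-source C# options worth evaluating include DiffPlex and the C# port of google-diff-match-patch. The shape of the problem can be illustrated with Python's standard difflib, standing in here for whichever C# library is chosen (file contents are made up):

    import difflib

    old = "the quick brown fox\njumps over the dog".splitlines()
    new = "the quick brown fox\njumps over the lazy dog".splitlines()

    # A unified diff marks inserted/removed lines with +/-
    for line in difflib.unified_diff(old, new, fromfile="a.txt", tofile="b.txt", lineterm=""):
        print(line)

    # SequenceMatcher reports exactly which character ranges changed within a line,
    # which is what the "highlight the exact changed areas" part needs
    sm = difflib.SequenceMatcher(a="jumps over the dog", b="jumps over the lazy dog")
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag != "equal":
            print(tag, (i1, i2), (j1, j2))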

    Read the article

  • C Differences on windows and Unix OS

    - by zapping
    Is there any difference between C written on Windows and C written on Unix? I teach C as well as C++, but some of my students have come back saying that some of the sample programs do not run for them on Unix. Unix is alien to me; unfortunately I have no experience with it whatsoever. All I know is how to spell it. If there are any differences, then I should be advising our department to invest in Unix systems, as currently there are none in our lab. I do not want my students to feel that they have been denied something or kept away from it.

    Read the article

  • Delphi: EInvalidOp in neural network class (TD-lambda)

    - by user89818
    I have the following draft for a neural network class. This neural network should learn with TD-lambda. It is started by calling the getRating() function. But unfortunately, there is an EInvalidOp (invalid floating-point operation) error after about 1000 iterations, in the following lines:

    neuronsHidden[j] := neuronsHidden[j]+neuronsInput[t][i]*weightsInput[i][j]; // input -> hidden
    weightsHidden[j][k] := weightsHidden[j][k]+LEARNING_RATE_HIDDEN*tdError[k]*eligibilityTraceOutput[j][k]; // adjust hidden->output weights according to TD-lambda

    Why does this error occur? I can't find the mistake in my code :( Can you help me? Thank you very much in advance!

    unit uNeuronalesNetz;

    interface

    uses
      Windows, Messages, SysUtils, Variants, Classes, Graphics, Controls, Forms,
      Dialogs, ExtCtrls, StdCtrls, Grids, Menus, Math;

    const
      NEURONS_INPUT = 43;          // number of neurons in the input layer
      NEURONS_HIDDEN = 60;         // number of neurons in the hidden layer
      NEURONS_OUTPUT = 1;          // number of neurons in the output layer
      NEURONS_TOTAL = NEURONS_INPUT+NEURONS_HIDDEN+NEURONS_OUTPUT; // total number of neurons in the network
      MAX_TIMESTEPS = 42;          // maximum number of timesteps possible (after 42 moves: board is full)
      LEARNING_RATE_INPUT = 0.25;  // in ideal case: decrease gradually in course of training
      LEARNING_RATE_HIDDEN = 0.15; // in ideal case: decrease gradually in course of training
      GAMMA = 0.9;
      LAMBDA = 0.7;                // decay parameter for eligibility traces

    type
      TFeatureVector = Array[1..43] of SmallInt; // definition of the array type TFeatureVector

      TArtificialNeuralNetwork = class // definition of the class TArtificialNeuralNetwork
      private
        // GENERAL SETTINGS START
        learningMode: Boolean; // does the network learn and change its weights?
        // GENERAL SETTINGS END
        // NETWORK CONFIGURATION START
        neuronsInput: Array[1..MAX_TIMESTEPS] of Array[1..NEURONS_INPUT] of Extended; // array of all input neurons (their values) for every timestep
        neuronsHidden: Array[1..NEURONS_HIDDEN] of Extended; // array of all hidden neurons (their values)
        neuronsOutput: Array[1..NEURONS_OUTPUT] of Extended; // array of output neurons (their values)
        weightsInput: Array[1..NEURONS_INPUT] of Array[1..NEURONS_HIDDEN] of Extended; // array of weights: input->hidden
        weightsHidden: Array[1..NEURONS_HIDDEN] of Array[1..NEURONS_OUTPUT] of Extended; // array of weights: hidden->output
        // NETWORK CONFIGURATION END
        // LEARNING SETTINGS START
        outputBefore: Array[1..NEURONS_OUTPUT] of Extended; // the network's output value in the last timestep (the one before)
        eligibilityTraceHidden: Array[1..NEURONS_INPUT] of Array[1..NEURONS_HIDDEN] of Array[1..NEURONS_OUTPUT] of Extended; // array of eligibility traces: hidden layer
        eligibilityTraceOutput: Array[1..NEURONS_TOTAL] of Array[1..NEURONS_TOTAL] of Extended; // array of eligibility traces: output layer
        reward: Array[1..MAX_TIMESTEPS] of Array[1..NEURONS_OUTPUT] of Extended; // the reward value for all output neurons in every timestep
        tdError: Array[1..NEURONS_OUTPUT] of Extended; // the network's error value for every single output neuron
        t: Byte; // current timestep
        cyclesTrained: Integer; // number of cycles trained so far (learning rates could be decreased accordingly)
        last50errors: Array[1..50] of Extended;
        // LEARNING SETTINGS END
      public
        constructor Create; // create the network object and do the initialization
        procedure UpdateEligibilityTraces; // update the eligibility traces for the hidden and output layer
        procedure tdLearning; // learning algorithm: adjust the network's weights
        procedure ForwardPropagation; // propagate the input values through the network to the output layer
        function getRating(state: TFeatureVector; explorative: Boolean): Extended; // get the rating for a given state (feature vector)
        function HyperbolicTangent(x: Extended): Extended; // calculate the hyperbolic tangent [-1;1]
        procedure StartNewCycle; // start a new cycle with everything set to default except for the weights
        procedure setLearningMode(activated: Boolean=TRUE); // switch the learning mode on/off
        procedure setInputs(state: TFeatureVector); // transfer the given feature vector to the input layer (set input neurons' values)
        procedure setReward(currentReward: SmallInt); // set the reward for the current timestep (with learning then or without)
        procedure nextTimeStep; // increase timestep t
        function getCyclesTrained(): Integer; // get the number of cycles trained so far
        procedure Visualize(imgHidden: Pointer); // visualize the neural network's hidden layer
      end;

    implementation

    procedure TArtificialNeuralNetwork.UpdateEligibilityTraces;
    var
      i, j, k: Integer;
    begin
      // how worthy is a weight to be adjusted?
      for j := 1 to NEURONS_HIDDEN do begin
        for k := 1 to NEURONS_OUTPUT do begin
          eligibilityTraceOutput[j][k] := LAMBDA*eligibilityTraceOutput[j][k]+(neuronsOutput[k]*(1-neuronsOutput[k]))*neuronsHidden[j];
          for i := 1 to NEURONS_INPUT do begin
            eligibilityTraceHidden[i][j][k] := LAMBDA*eligibilityTraceHidden[i][j][k]+(neuronsOutput[k]*(1-neuronsOutput[k]))*weightsHidden[j][k]*neuronsHidden[j]*(1-neuronsHidden[j])*neuronsInput[t][i];
          end;
        end;
      end;
    end;

    procedure TArtificialNeuralNetwork.setReward;
    var
      i: Integer;
    begin
      for i := 1 to NEURONS_OUTPUT do begin
        // +1 = player A wins
        //  0 = draw
        // -1 = player B wins
        reward[t][i] := currentReward;
      end;
    end;

    procedure TArtificialNeuralNetwork.tdLearning;
    var
      i, j, k: Integer;
    begin
      if learningMode then begin
        for k := 1 to NEURONS_OUTPUT do begin
          if reward[t][k] = 0 then begin
            tdError[k] := GAMMA*neuronsOutput[k]-outputBefore[k]; // network's error value when reward is 0
          end
          else begin
            tdError[k] := reward[t][k]-outputBefore[k]; // network's error value in the final state (reward received)
          end;
          for j := 1 to NEURONS_HIDDEN do begin
            weightsHidden[j][k] := weightsHidden[j][k]+LEARNING_RATE_HIDDEN*tdError[k]*eligibilityTraceOutput[j][k]; // adjust hidden->output weights according to TD-lambda
            for i := 1 to NEURONS_INPUT do begin
              weightsInput[i][j] := weightsInput[i][j]+LEARNING_RATE_INPUT*tdError[k]*eligibilityTraceHidden[i][j][k]; // adjust input->hidden weights according to TD-lambda
            end;
          end;
        end;
      end;
    end;

    procedure TArtificialNeuralNetwork.ForwardPropagation;
    var
      i, j, k: Integer;
    begin
      for j := 1 to NEURONS_HIDDEN do begin
        neuronsHidden[j] := 0;
        for i := 1 to NEURONS_INPUT do begin
          neuronsHidden[j] := neuronsHidden[j]+neuronsInput[t][i]*weightsInput[i][j]; // input -> hidden
        end;
        neuronsHidden[j] := HyperbolicTangent(neuronsHidden[j]); // activation of hidden neuron j
      end;
      for k := 1 to NEURONS_OUTPUT do begin
        neuronsOutput[k] := 0;
        for j := 1 to NEURONS_HIDDEN do begin
          neuronsOutput[k] := neuronsOutput[k]+neuronsHidden[j]*weightsHidden[j][k]; // hidden -> output
        end;
        neuronsOutput[k] := HyperbolicTangent(neuronsOutput[k]); // activation of output neuron k
      end;
    end;

    procedure TArtificialNeuralNetwork.setLearningMode;
    begin
      learningMode := activated;
    end;

    constructor TArtificialNeuralNetwork.Create;
    var
      i, j, k: Integer;
    begin
      inherited Create;
      Randomize; // initialize random numbers generator
      learningMode := TRUE;
      cyclesTrained := -2; // only set to -2 because it will be increased twice in the beginning
      StartNewCycle;
      for j := 1 to NEURONS_HIDDEN do begin
        for k := 1 to NEURONS_OUTPUT do begin
          weightsHidden[j][k] := abs(Random-0.5); // initialize weights: 0 <= random < 0.5
        end;
        for i := 1 to NEURONS_INPUT do begin
          weightsInput[i][j] := abs(Random-0.5); // initialize weights: 0 <= random < 0.5
        end;
      end;
      for i := 1 to 50 do begin
        last50errors[i] := 0;
      end;
    end;

    procedure TArtificialNeuralNetwork.nextTimeStep;
    begin
      t := t+1;
    end;

    procedure TArtificialNeuralNetwork.StartNewCycle;
    var
      i, j, k, m: Integer;
    begin
      t := 1; // start in timestep 1
      cyclesTrained := cyclesTrained+1; // increase the number of cycles trained so far
      for j := 1 to NEURONS_HIDDEN do begin
        neuronsHidden[j] := 0;
        for k := 1 to NEURONS_OUTPUT do begin
          eligibilityTraceOutput[j][k] := 0;
          outputBefore[k] := 0;
          neuronsOutput[k] := 0;
          for m := 1 to MAX_TIMESTEPS do begin
            reward[m][k] := 0;
          end;
        end;
        for i := 1 to NEURONS_INPUT do begin
          for k := 1 to NEURONS_OUTPUT do begin
            eligibilityTraceHidden[i][j][k] := 0;
          end;
        end;
      end;
    end;

    function TArtificialNeuralNetwork.getCyclesTrained;
    begin
      result := cyclesTrained;
    end;

    procedure TArtificialNeuralNetwork.setInputs;
    var
      k: Integer;
    begin
      for k := 1 to NEURONS_INPUT do begin
        neuronsInput[t][k] := state[k];
      end;
    end;

    function TArtificialNeuralNetwork.getRating;
    begin
      setInputs(state);
      ForwardPropagation;
      result := neuronsOutput[1];
      if not explorative then begin
        tdLearning; // adjust the weights according to TD-lambda
        ForwardPropagation; // calculate the network's output again
        outputBefore[1] := neuronsOutput[1]; // set outputBefore which will then be used in the next timestep
        UpdateEligibilityTraces; // update the eligibility traces for the next timestep
        nextTimeStep; // go to the next timestep
      end;
    end;

    function TArtificialNeuralNetwork.HyperbolicTangent;
    begin
      if x > 5500 then // prevent overflow
        result := 1
      else
        result := (Exp(2*x)-1)/(Exp(2*x)+1);
    end;

    end.
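
    For reference, here is a condensed Python sketch of the TD(lambda) update the unit implements (single tanh output, one hidden layer; all names are mine, not the poster's). Two things stand out when writing it down: the Delphi code pairs tanh activations with the logistic derivative y*(1-y) rather than the tanh derivative 1-y*y, and with fixed learning rates of 0.25/0.15 the weights can diverge until an intermediate product overflows or goes NaN, which is one common way such training ends in an invalid floating-point operation. The sketch below uses the tanh derivative.

    import numpy as np

    N_IN, N_HID = 43, 60
    GAMMA, LAMBDA = 0.9, 0.7
    LR_IN, LR_HID = 0.25, 0.15

    rng = np.random.default_rng(0)
    w_in = rng.uniform(0, 0.5, size=(N_IN, N_HID))  # input -> hidden weights
    w_hid = rng.uniform(0, 0.5, size=N_HID)         # hidden -> output weights
    e_hid = np.zeros((N_IN, N_HID))                 # eligibility traces: input layer
    e_out = np.zeros(N_HID)                         # eligibility traces: hidden layer

    def forward(x):
        h = np.tanh(x @ w_in)   # hidden activations
        y = np.tanh(h @ w_hid)  # scalar output
        return h, y

    def td_step(x, reward, y_prev):
        global e_hid, e_out
        h, y = forward(x)
        # TD error: bootstrapped target (or final reward) minus previous prediction
        delta = (reward if reward != 0 else GAMMA * y) - y_prev
        w_hid += LR_HID * delta * e_out
        w_in  += LR_IN  * delta * e_hid
        # decay the traces and add the current gradient contribution
        dy = 1 - y * y  # tanh'(output); the original uses y*(1-y) here
        e_out = LAMBDA * e_out + dy * h
        e_hid = LAMBDA * e_hid + dy * np.outer(x, w_hid * (1 - h * h))
        return y

    # toy driver: one fixed feature vector, 42 timesteps, no terminal reward
    x = rng.integers(-1, 2, size=N_IN).astype(float)
    y_prev = 0.0
    for step in range(42):
        y_prev = td_step(x, reward=0.0, y_prev=y_prev)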

    Read the article

  • How do I create a list of timedeltas in python?

    - by eunhealee
    I've been searching through this website and have seen multiple references to time deltas, but haven't quite found what I'm looking for. Basically, I have a list of messages that are received by a comms server, and I want to calculate the latency between each message out and message in. It looks like this:

    161336.934072 - TMsg out: [O] enter order. RefID [123] OrdID [4568]
    161336.934159 - TMsg in: [A] accepted. ordID [456] RefNumber [123]

    Mixed in with these messages are other messages as well; however, I only want to capture the difference between the out messages and the in messages with the same RefID. So far, to sort out from the main log which messages are T messages, I've been doing this, but it's really inefficient; I don't need to be making new files every time:

    big_file = open('C:/Users/kdalton/Documents/Minicomm.txt', 'r')
    small_file1 = open('small_file1.txt', 'w')
    for line in big_file:
        if 'T' in line:
            small_file1.write(line)
    big_file.close()
    small_file1.close()

    How do I calculate the time deltas between the two messages and sort these messages out of the main log?
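
    One possible approach, sketched from the two sample lines above: parse each timestamp as HHMMSS.ffffff into a timedelta, remember each out message by its RefID, and emit the latency when the matching in message arrives. The regexes and the single-day assumption are guesses based on the samples, not the real log format.

    import re
    from datetime import datetime, timedelta

    OUT_RE = re.compile(r"^(\d{6}\.\d+) - TMsg out: .*RefID \[(\d+)\]")
    IN_RE  = re.compile(r"^(\d{6}\.\d+) - TMsg in: .*RefNumber \[(\d+)\]")

    def to_delta(stamp: str) -> timedelta:
        whole, frac = stamp.split(".")
        t = datetime.strptime(whole, "%H%M%S")
        return timedelta(hours=t.hour, minutes=t.minute,
                         seconds=t.second + float("0." + frac))

    pending = {}    # RefID -> time the out message was seen
    latencies = []  # list of (RefID, timedelta)

    with open("Minicomm.txt") as log:  # stream the log; no temporary files needed
        for line in log:
            if (m := OUT_RE.match(line)):
                pending[m.group(2)] = to_delta(m.group(1))
            elif (m := IN_RE.match(line)) and m.group(2) in pending:
                latencies.append((m.group(2),
                                  to_delta(m.group(1)) - pending.pop(m.group(2))))

    for ref, lat in latencies:
        print(ref, lat.total_seconds())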

    Read the article

  • RSpec: in-depth differences between before(:all) and before(:each)

    - by gmile
    OK, so I've run into a very strange issue, directly connected with before blocks. I'm doing integration testing via Watir and RSpec. For a simple test that checks whether a user can perform a login, I'm creating a 'user' record in the DB by means of factory_girl. So I put the following code:

    before(:each) do
      @user = Factory(:user)
    end

    it "should perform a login" do
      # do stuff
    end

    In "do stuff" I launch a browser and watch the user try to log in. Unfortunately, somehow he cannot do that: "Username isn't valid". After some investigation I discovered that if I put the code for creating the user in a before(:all) block, everything magically works. How's that? What's the difference between :all and :each in this context? Also, if I put the code for creating the user directly in the test body, it still doesn't work (i.e. the user somehow isn't added to the DB, or something).
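
    A commonly cited explanation for exactly this symptom in Rails test suites: before(:each) runs inside the per-example database transaction, which an external browser process never sees, while before(:all) runs once, outside that transaction, so its rows are actually committed. As a language-neutral illustration of the two lifetimes, Python's unittest draws the same distinction (a hypothetical sketch, not the poster's stack):

    import unittest

    class LoginTest(unittest.TestCase):
        @classmethod
        def setUpClass(cls):
            # like before(:all): runs once for the whole group;
            # state persists across every test in the class
            cls.shared_user = {"name": "alice"}

        def setUp(self):
            # like before(:each): runs again before every single test;
            # anything created here is rebuilt per test (and, in Rails,
            # lives inside the per-example transaction)
            self.fresh_user = {"name": "bob"}

        def test_login(self):
            self.assertEqual(self.shared_user["name"], "alice")
            self.assertEqual(self.fresh_user["name"], "bob")

    if __name__ == "__main__":
        unittest.main()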

    Read the article

  • Comparison tool with easy line insertion

    - by Miro Kropacek
    Back in the good old days I used to use a file-comparison tool with one incredible feature: you open file1 and file2 and see a diff, no magic here. But then you could insert empty lines into file1 with one keyboard combo and into file2 with another. So you could easily adjust how C / asm functions are aligned in case the diff engine failed to recognize similar stuff. Of course, after the adjustment (insertion / removal of one or more lines in either file) the whole diff was recalculated. I have failed to find a similar feature in diff, KDiff, ... I'd prefer a Linux app, but I'm OK with a Windows app as a last resort... Thanks for any hint!

    Read the article

  • Getting number of days between [NSDate date] and string @"2010-11-12"

    - by grobald
    Hello, I have a question. In short: I need a function which receives [NSDate date] and the string @"2010-11-12" and returns the number of days between those two dates. More explanation: I need to store a date from a server, in the format @"2010-11-12", in my NSUserDefaults. This date is the expiry date of a feature in an iPhone app. Every time I press the button for this feature, I need to check whether the difference in days between the current time, [NSDate date], and @"2010-11-12" is greater than 0; if it is, the feature is disabled. It's making me crazy; maybe it's dead simple.
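
    The underlying arithmetic, sketched in Python purely to show the logic (in Cocoa the equivalent pieces would be NSDateFormatter to parse the string and NSCalendar to count the days): parse the expiry string, truncate both values to calendar dates, and subtract.

    from datetime import date, datetime

    def days_until_expiry(expire_string: str, today: date | None = None) -> int:
        expiry = datetime.strptime(expire_string, "%Y-%m-%d").date()
        today = today or date.today()
        return (expiry - today).days  # negative once the feature has expired

    print(days_until_expiry("2010-11-12"))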

    Read the article

  • Difference between httpd and httpd-devel package on CentOS 5

    - by superbarney
    I'm trying to update PHP 5.1 to 5.3 on CentOS 5.10. On the server, httpd-devel is installed, but when I try to install php53 on CentOS 5, it wants to install httpd as well. This is a production server, so I need to know whether it's safe to install httpd when httpd-devel is already installed. The php package (5.1.6) on CentOS 5 does not have the httpd dependency. What is the difference between httpd-devel and httpd?

    Read the article

  • What is the difference between safety and security?

    - by Lernkurve
    Question: What is the difference between safety and security in the context of information management or computer science? Elaboration: this could be the canonical answer for people searching for it. Let me know if superuser.com is the wrong site for this question. I have, of course, googled it and haven't found an answer that seemed short and to the point. Wikipedia wasn't very helpful either: safety, information security.

    Read the article

  • Difference between Xen PV, Xen KVM and HVM?

    - by JP19
    Hi, I know that Xen is usually considered better than OpenVZ, as the provider cannot oversell with Xen. However, what is the difference between Xen PV, Xen KVM and HVM (I was going through this provider's specs)? Which one is better for which purposes, and why? Edit: for an end user who will just be hosting websites, which is better? From an efficiency or other point of view, is there any advantage of one over the other?

    Read the article

  • Difference between ProxyPass and RewriteRule

    - by Wesho
    I just came across a case where ProxyPass (ProxyPassMatch, to be exact) is being used in an Apache configuration file. This mod_proxy rule is used to proxy from a whole cluster to one specific server whenever a certain file is requested that resides only on that server. Now I'm a bit confused, since I can't grasp why something like this could not be achieved using a RewriteRule. So in essence I want to ask: what is the difference between ProxyPassMatch and a RewriteRule in this case?
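
    As a rough illustration (hypothetical paths and hostnames, not the poster's actual config), the two directives can express the same proxying:

    # mod_proxy:
    ProxyPassMatch "^/reports/(.*)$" "http://report-server.internal/reports/$1"

    # mod_rewrite can do the same with the [P] (proxy) flag, which hands
    # the rewritten URL to mod_proxy anyway:
    RewriteEngine On
    RewriteRule "^/reports/(.*)$" "http://report-server.internal/reports/$1" [P]

    Roughly speaking, the practical difference is that ProxyPass/ProxyPassMatch is handled by mod_proxy directly and pairs naturally with ProxyPassReverse for fixing redirect headers in responses, while RewriteRule offers richer matching logic (e.g. RewriteCond) and, with [P], still delegates the actual proxying to mod_proxy.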

    Read the article

  • What's the difference between RDP and VNC?

    - by Jonathan
    Okay, I was playing around with the iPhone Jaadu app and I realized I had downloaded the wrong desktop client. So what's really the difference between RDP and VNC? (There are Jaadu RDP and Jaadu VNC, two different apps.) Do they both provide the same functions and features?

    Read the article

  • Difference between resin and resin pro

    - by riteshmnayak
    I am planning to deploy Resin for a project that I am working on, but I cannot figure out which version of Resin I should use. The downloads page lists two products, Resin and Resin Pro, each with dev and stable snapshots. What is the difference between the Pro version and the plain version? Is Pro a paid version or something?

    Read the article

  • Difference between Software Services & IT Consulting

    - by Rohit
    I have been looking at the sites of IT companies, but I am confused by the terms they use for their offerings. Some write "Software Services, IT Consulting", some write "Technology, Consulting", and some write "Product Engineering, Application Development". Can someone clarify the difference between: (1) Software Services and IT Consulting; (2) Technology and Consulting?

    Read the article

  • Is there a difference between the actual chip intended for Dual Channel vs Triple Channel

    - by JimDel
    Is there a difference between the actual chips intended for dual-channel vs triple-channel operation? I bought a set of triple-channel memory, but I'm only using 2 of the 3 modules. I'm getting the error "Display driver NVIDIA Windows Kernel Mode Driver, Version 197.45 stopped responding and has successfully recovered". There seems to be a TON of discussion on the web about this, and some say it might be RAM related. This is the RAM I'm using. Thanks

    Read the article

  • What exactly is the difference between update-manager -d and the process

    - by jldugger
    The official recommendation from Ubuntu is to use sudo do-release-upgrade to do an online upgrade from one version to the next. Historically, many of my Debianite friends and I have simply altered apt's sources.list and run apt-get dist-upgrade. I follow Ubuntu's recommendation, but I've always wondered what the magic difference between these two processes is. What, exactly, does do-release-upgrade do on, say, an upgrade from 9.04 to 9.10? (Examples from other releases welcome.)

    Read the article
