Search Results

Search found 129 results on 6 pages for 'mathematica 5'.

Page 6/6 | < Previous Page | 2 3 4 5 6 

  • Algorithm for finding symmetries of a tree

    - by Paxinum
    I have n sectors, enumerated 0 to n-1 counterclockwise. The boundaries between these sectors are infinite branches (n of them). The sectors live in the complex plane, and for n even, sectors 0 and n/2 are bisected by the real axis, and the sectors are evenly spaced. These branches meet at certain points, called junctions. Each junction is adjacent to a subset of the sectors (at least 3 of them). Specifying the junctions (in prefix order, let's say, starting from the junction adjacent to sectors 0 and 1) and the distances between the junctions uniquely describes the tree. Now, given such a representation, how can I see if it is symmetric wrt the real axis? For example, with n=6, the tree (0,1,5)(1,2,4,5)(2,3,4) has three junctions on the real line, so it is symmetric wrt the real axis. If the distance between (015) and (1245) is equal to the distance from (1245) to (234), it is also symmetric wrt the imaginary axis. The tree (0,1,5)(1,2,5)(2,4,5)(2,3,4) has 4 junctions, and it is never symmetric wrt either the real or the imaginary axis, but it has 180-degree rotation symmetry if the distances between the first two and the last two junctions in the representation are equal. Edit: This is actually for my research. I have posted the question at MathOverflow as well, but my days in competition programming tell me that this is more like an IOI task. Code in Mathematica would be excellent, but Java, Python, or any other language readable by a human suffices. Here are some examples (pretend the double edges are single and we have a tree): http://www2.math.su.se/~per/files.php?file=contr_ex_1.pdf http://www2.math.su.se/~per/files.php?file=contr_ex_2.pdf http://www2.math.su.se/~per/files.php?file=contr_ex_5.pdf Example 1 is described as (0,1,4)(1,2,4)(2,3,4)(0,4,5) with distances (2,1,3). Example 2 is described as (0,1,4)(1,2,4)(2,3,4)(0,4,5) with distances (2,1,1). Example 5 is described as (0,1,4,5)(1,2,3,4) with distances (2). So, given the description/representation, I want to find some algorithm to decide if it is symmetric wrt the real axis, the imaginary axis, and rotation by 180 degrees. The last example has 180-degree symmetry. (These symmetries correspond to special kinds of potentials in the Schrödinger equation, which have nice properties in quantum mechanics.)
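    One possible line of attack, sketched below in Python rather than the requested Mathematica: each reflection or rotation is just a permutation of the sector labels, so the tree has that symmetry exactly when relabelling every junction's sector set gives back the same set of junctions with the same edge lengths. The sketch assumes that two junctions are joined by a branch segment precisely when their sector sets share at least two sectors, and that the distances are listed in the order the non-root junctions appear in the representation; both are assumptions about the encoding, not something stated in the post.

        def build_tree(junctions, distances):
            # junctions: list of tuples of adjacent sector indices, in the given order
            # distances: edge lengths, one per junction after the first (assumed ordering)
            nodes = [frozenset(j) for j in junctions]
            edges = {}
            k = 0
            for i in range(1, len(nodes)):
                for p in range(i):
                    # assumption: a branch segment joins two junctions sharing >= 2 sectors
                    if len(nodes[i] & nodes[p]) >= 2:
                        edges[frozenset((nodes[i], nodes[p]))] = distances[k]
                        k += 1
                        break
            return nodes, edges

        def symmetric_under(sigma, nodes, edges):
            # sigma: permutation of sector labels induced by the reflection/rotation
            image = {v: frozenset(sigma[s] for s in v) for v in nodes}
            if set(image.values()) != set(nodes):
                return False
            for e, d in edges.items():
                a, b = tuple(e)
                if edges.get(frozenset((image[a], image[b]))) != d:
                    return False
            return True

        def symmetries(n, junctions, distances):
            nodes, edges = build_tree(junctions, distances)
            real = {i: (-i) % n for i in range(n)}           # reflection in the real axis
            imag = {i: (n // 2 - i) % n for i in range(n)}   # reflection in the imaginary axis (n even)
            rot = {i: (i + n // 2) % n for i in range(n)}    # rotation by 180 degrees (n even)
            return {"real": symmetric_under(real, nodes, edges),
                    "imaginary": symmetric_under(imag, nodes, edges),
                    "rotation180": symmetric_under(rot, nodes, edges)}

        # Example 5 from the post: two junctions joined by one edge of length 2
        print(symmetries(6, [(0, 1, 4, 5), (1, 2, 3, 4)], [2]))
        # expected: real False, imaginary False, rotation180 True

    On the n=6 three-junction example above, the same check reports real-axis symmetry for any distances and imaginary-axis symmetry exactly when the two distances are equal, matching the description in the question.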

    Read the article

  • A Guided Tour of Complexity

    - by JoshReuben
    I just re-read Complexity – A Guided Tour by Melanie Mitchell, protégée of Douglas Hofstadter (author of “Gödel, Escher, Bach”) http://www.amazon.com/Complexity-Guided-Tour-Melanie-Mitchell/dp/0199798109/ref=sr_1_1?ie=UTF8&qid=1339744329&sr=8-1 Here are some notes and links: Evolved from Cybernetics, General Systems Theory, Synergetics. Some interesting transdisciplinary fields to investigate: Chaos Theory - http://en.wikipedia.org/wiki/Chaos_theory – small differences in initial conditions (such as those due to rounding errors in numerical computation) yield widely diverging outcomes for chaotic systems, rendering long-term prediction impossible. System Dynamics / Cybernetics - http://en.wikipedia.org/wiki/System_Dynamics – the study of how feedback changes system behavior. Network Theory - http://en.wikipedia.org/wiki/Network_theory – leverages Graph Theory to analyze symmetric / asymmetric relations between discrete objects. Algebraic Topology - http://en.wikipedia.org/wiki/Algebraic_topology – leverages abstract algebra to analyze topological spaces. There are limits to deterministic systems & to computation. Chaos Theory definitely applies to training an ANN (artificial neural network) – different weights will emerge depending upon the random selection of the training set. In recursive Non-Linear systems http://en.wikipedia.org/wiki/Nonlinear_system – output is not directly inferable from input. E.g. a Logistic map: X_{t+1} = R·X_t·(1 − X_t). Different types of bifurcations, attractor states and oscillations may occur – e.g. a Lorenz Attractor http://en.wikipedia.org/wiki/Lorenz_system. Feigenbaum Constants http://en.wikipedia.org/wiki/Feigenbaum_constants express ratios in a bifurcation diagram for a non-linear map – the ratio of successive period-doubling intervals converges to ≈ 4.6692016. Maxwell’s Demon - http://en.wikipedia.org/wiki/Maxwell%27s_demon - the Second Law of Thermodynamics has only a statistical certainty – the universe (and thus information) tends towards entropy. While any computation can theoretically be done without expending energy, with finite memory the act of erasing memory is irreversible and increases entropy. Life & thought are a counter-example to the universe’s tendency towards entropy. Leo Szilard and later Claude Shannon came up with the Information Theory of Entropy - http://en.wikipedia.org/wiki/Entropy_(information_theory) whereby Shannon entropy quantifies the expected value of a message’s information in bits, in order to determine channel capacity and leverage Coding Theory (compression analysis). Ludwig Boltzmann came up with Statistical Mechanics - http://en.wikipedia.org/wiki/Statistical_mechanics – whereby our Newtonian perception of continuous reality is a probabilistic and statistical aggregate of many discrete quantum microstates. This is relevant for Quantum Information Theory http://en.wikipedia.org/wiki/Quantum_information and the Physics of Information - http://en.wikipedia.org/wiki/Physical_information. Hilbert’s Problems http://en.wikipedia.org/wiki/Hilbert's_problems pondered whether mathematics is complete, consistent, and decidable (the Decision Problem – http://en.wikipedia.org/wiki/Entscheidungsproblem – is there always an algorithm that can determine whether a statement is true). Gödel’s Incompleteness Theorems http://en.wikipedia.org/wiki/G%C3%B6del's_incompleteness_theorems proved that mathematics cannot be both complete and consistent (e.g. “This statement is not provable”).
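    The logistic map mentioned above is easy to play with. Here is a small Python sketch (my own illustration, not from the book) that iterates X_{t+1} = R·X_t·(1 − X_t) for a few values of R to show the fixed-point, period-doubling and chaotic regimes:

        def logistic_orbit(r, x0=0.2, warmup=500, keep=8):
            # iterate long enough to pass the transient, then record a few values
            x = x0
            for _ in range(warmup):
                x = r * x * (1 - x)
            orbit = []
            for _ in range(keep):
                x = r * x * (1 - x)
                orbit.append(round(x, 4))
            return orbit

        for r in (2.8, 3.2, 3.5, 3.9):
            # 2.8: single fixed point; 3.2: period 2; 3.5: period 4; 3.9: chaotic
            print(r, logistic_orbit(r))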
Turing, through the use of Turing Machines (http://en.wikipedia.org/wiki/Turing_machine – symbol processors that can prove mathematical statements) and Universal Turing Machines (http://en.wikipedia.org/wiki/Universal_Turing_machine – Turing Machines that can emulate any other Turing Machine by accepting programs as well as data as input symbols), showed that computation is limited by demonstrating the Halting Problem http://en.wikipedia.org/wiki/Halting_problem (it is not possible in general to know whether a program will complete – you cannot build an infinite loop detector). You may be used to thinking of 1 / 2 / 3 dimensional systems, but Fractal http://en.wikipedia.org/wiki/Fractal systems are defined by self-similarity & have non-integer Hausdorff Dimensions!!! http://en.wikipedia.org/wiki/List_of_fractals_by_Hausdorff_dimension – the fractal dimension quantifies the number of copies of a self-similar object at each level of detail – e.g. the Koch Snowflake - http://en.wikipedia.org/wiki/Koch_snowflake. Definitions of complexity: size, Shannon entropy, Algorithmic Information Content (http://en.wikipedia.org/wiki/Algorithmic_information_theory - the size of the shortest program that can generate a description of an object), logical depth (amount of info processed), thermodynamic depth (resources required). Complexity is statistical and fractal. John Von Neumann’s other machine was the Self-Reproducing Automaton http://en.wikipedia.org/wiki/Self-replicating_machine. Cellular Automata http://en.wikipedia.org/wiki/Cellular_automaton are an alternative form of Universal Turing machine to traditional Von Neumann machines, where grid cells are locally synchronized with their neighbors according to a rule. Conway’s Game of Life http://en.wikipedia.org/wiki/Conway's_Game_of_Life demonstrates various emergent constructs such as “Glider Guns” and “Spaceships”. Cellular Automata are not practical because logical ops require a large number of cells – wasteful & inefficient – and there are no compilers or general-purpose programming languages available for Cellular Automata (as far as I am aware). Random Boolean Networks http://en.wikipedia.org/wiki/Boolean_network are extensions of cellular automata where nodes are connected at random (not to spatial neighbors) and each node has its own rule – they demonstrate the emergence of complex & self-organized behavior. Stephen Wolfram’s (creator of Mathematica, so give him the benefit of the doubt) New Kind of Science http://en.wikipedia.org/wiki/A_New_Kind_of_Science proposes the universe may be a discrete Finite State Automaton http://en.wikipedia.org/wiki/Finite-state_machine whereby reality emerges from simple rules. I am 2/3 through this book. It is feasible that the universe is quantum discrete at the Planck scale and that it computes itself – Digital Physics: http://en.wikipedia.org/wiki/Digital_physics – a simulated reality? Anyway, all behavior is supposedly derived from simple algorithmic rules & falls into 4 patterns: uniform, nested / cyclical, random (Rule 30 http://en.wikipedia.org/wiki/Rule_30) & mixed (Rule 110 - http://en.wikipedia.org/wiki/Rule_110 – localized structures; it is this that is interesting). Interaction between colliding propagating signal inputs is then information processing. Wolfram proposes the Principle of Computational Equivalence - http://mathworld.wolfram.com/PrincipleofComputationalEquivalence.html - all processes that are not obviously simple can be viewed as computations of equivalent sophistication.
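    To make the Rule 30 / Rule 110 remark concrete, here is a short Python sketch (my own, not from the book) of a one-dimensional elementary cellular automaton: each cell's next state is looked up from the rule number using the states of its three-cell neighbourhood, with the row wrapping around at the edges.

        def elementary_ca(rule, width=79, steps=30):
            # the 8 bits of `rule` give the next state for each 3-cell neighbourhood
            table = {tuple(int(b) for b in f"{i:03b}"): (rule >> i) & 1 for i in range(8)}
            cells = [0] * width
            cells[width // 2] = 1            # single live cell in the middle
            for _ in range(steps):
                print("".join("#" if c else " " for c in cells))
                cells = [table[(cells[(i - 1) % width], cells[i], cells[(i + 1) % width])]
                         for i in range(width)]

        elementary_ca(30)    # random-looking triangle (Rule 30)
        elementary_ca(110)   # localized, interacting structures (Rule 110)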
Meaning in information may emerge from analogy & conceptual slippages – see the CopyCat program: http://cognitrn.psych.indiana.edu/rgoldsto/courses/concepts/copycat.pdf Scale Free Networks http://en.wikipedia.org/wiki/Scale-free_network have a distribution governed by a Power Law (http://en.wikipedia.org/wiki/Power_law - much more common than the Normal Distribution). They are characterized by hubs (resilience to random deletion of nodes), heterogeneity of degree values, self-similarity, & small-world structure. They grow via preferential attachment http://en.wikipedia.org/wiki/Preferential_attachment – tipping points triggered by positive feedback loops. 2 theories of cascading system failures in complex systems are Self-Organized Criticality http://en.wikipedia.org/wiki/Self-organized_criticality and Highly Optimized Tolerance http://en.wikipedia.org/wiki/Highly_optimized_tolerance. Computational Mechanics http://en.wikipedia.org/wiki/Computational_mechanics – the use of computational methods to study phenomena governed by the principles of mechanics. This book is a great intuition pump, but does not cover the more mathematical subject of Computational Complexity Theory – http://en.wikipedia.org/wiki/Computational_complexity_theory. I am currently reading this book on that subject: http://www.amazon.com/Computational-Complexity-Christos-H-Papadimitriou/dp/0201530821/ref=pd_sim_b_1 – stay tuned for that review!
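    The preferential-attachment growth rule is simple enough to sketch in a few lines of Python (an illustration under the usual Barabási–Albert-style assumptions, not code from the book): each new node links to existing nodes with probability proportional to their current degree, and the resulting degree distribution is heavy-tailed, with a few hubs and many low-degree nodes.

        import random
        from collections import Counter

        def preferential_attachment(n, m=2):
            # every node appears in `stubs` once per unit of degree,
            # so sampling from `stubs` is degree-biased
            edges, stubs = [], []
            for new in range(m, n):
                # (approximate: may occasionally pick the same target twice and add fewer than m links)
                targets = set(random.sample(stubs, m)) if stubs else set(range(m))
                for t in targets:
                    edges.append((new, t))
                    stubs.extend((new, t))
            return edges

        degree = Counter(v for e in preferential_attachment(5000) for v in e)
        print(degree.most_common(5))                               # a handful of highly connected hubs
        print(sum(1 for d in degree.values() if d <= 3), "nodes of degree <= 3")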

    Read the article

  • What is the coolest thing you can do in <10 lines of simple code? Help me inspire beginners!

    - by Tom Ritter
    I'm looking for the coolest thing you can do in a few lines of simple code. I'm sure you can write a Mandelbrot set in Haskell in 15 lines but it's difficult to follow. My goal is to inspire students that programming is cool. We know that programming is cool because you can create anything you imagine - it's the ultimate creative outlet. I want to inspire these beginners and get them over as many early-learning humps as I can. Now, my reasons are selfish. I'm teaching an Intro to Computing course to a group of 60 half-engineering, half business majors; all freshmen. They are the students who came from underprivileged high schools. From my past experience, the group is generally split as follows: a few rock-stars, some who try very hard and kind of get it, the few who try very hard and barely get it, and the few who don't care. I want to reach as many of these groups as effectively as I can. Here's an example of how I'd use a computer program to teach, and of what I'm looking for: a 1-line VBS script to get your computer to talk to you: CreateObject("sapi.spvoice").Speak InputBox("Enter your text","Talk it") I could use this to demonstrate order of operations. I'd show the code, let them play with it, then explain that there's a lot going on in that line, but the computer can make sense of it, because it knows the rules. Then I'd show them something like this: 4(5*5) / 10 + 9(.25 + .75) And you can see that the first thing I need to do is (5*5). Then I can multiply by 4. And now I've created the Object. Dividing by 10 is the same as calling Speak - I can't Speak before I have an object, and I can't divide before I have 100. Then on the other side I first create an InputBox with some instructions for how to display it. When I hit enter on the input box it evaluates or "returns" whatever I entered. (Hint: 'oooooo' makes a funny sound) So when I say Speak, the right side is what to Speak. And I get that from the InputBox. So when you do several things on a line, like: x = 14 + y; You need to be aware of the order of things. First we add 14 and y. Then we put the result (what it evaluates to, or returns) into x. That's my goal, to have a bunch of these cool examples to demonstrate and teach the class while they have fun. I tried this example on my roommate and while I may not use this as the first lesson, she liked it and learned something. Some cool Mathematica programs that make beautiful graphs or shapes that are easy to understand would be good ideas and I'm going to look into those. Here are some complicated ActionScript examples but that's a bit too advanced and I can't teach Flash. What other ideas do you have?
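    A rough Python counterpart to that VBS one-liner, for classrooms without Windows/SAPI (this is an illustration, not part of the original question, and assumes the third-party pyttsx3 package is installed):

        import pyttsx3                            # pip install pyttsx3

        engine = pyttsx3.init()                   # create the speech object first...
        engine.say(input("Enter your text: "))    # ...then hand it what to speak
        engine.runAndWait()

    The same order-of-operations point carries over: input() has to return its value before say() can use it, and the engine has to exist before either can happen.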

    Read the article

  • Code golf: Word frequency chart

    - by ChristopheD
    The challenge: Build an ASCII chart of the most commonly used words in a given text. The rules: Only accept a-z and A-Z (alphabetic characters) as part of a word. Ignore casing (She == she for our purpose). Ignore the following words (quite arbitrary, I know): the, and, of, to, a, i, it, in, or, is Clarification: considering don't: this would be taken as 2 different 'words' in the ranges a-z and A-Z: (don and t). Optionally (it's too late to be formally changing the specifications now) you may choose to drop all single-letter 'words' (this could potentially make for a shortening of the ignore list too). Parse a given text (read a file specified via command line arguments or piped in; presume us-ascii) and build us a word frequency chart with the following characteristics: Display the chart (also see the example below) for the 22 most common words (ordered by descending frequency). The bar width represents the number of occurrences (frequency) of the word (proportionally). Append one space and print the word. Make sure these bars (plus space-word-space) always fit: bar + [space] + word + [space] should always be <= 80 characters (make sure you account for possible differing bar and word lengths: e.g. the second most common word could be a lot longer than the first while not differing so much in frequency). Maximize bar width within these constraints and scale the bars appropriately (according to the frequencies they represent). An example: The text for the example can be found here (Alice's Adventures in Wonderland, by Lewis Carroll). This specific text would yield the following chart:

     _________________________________________________________________________
    |_________________________________________________________________________| she
    |_______________________________________________________________| you
    |____________________________________________________________| said
    |_____________________________________________________| alice
    |_______________________________________________| was
    |___________________________________________| that
    |____________________________________| as
    |________________________________| her
    |_____________________________| with
    |_____________________________| at
    |____________________________| s
    |____________________________| t
    |__________________________| on
    |__________________________| all
    |_______________________| this
    |_______________________| for
    |_______________________| had
    |_______________________| but
    |______________________| be
    |_____________________| not
    |____________________| they
    |____________________| so

    For your information: these are the frequencies the above chart is built upon: [('she', 553), ('you', 481), ('said', 462), ('alice', 403), ('was', 358), ('that', 330), ('as', 274), ('her', 248), ('with', 227), ('at', 227), ('s', 219), ('t', 218), ('on', 204), ('all', 200), ('this', 181), ('for', 179), ('had', 178), ('but', 175), ('be', 167), ('not', 166), ('they', 155), ('so', 152)]

    A second example (to check if you implemented the complete spec): Replace every occurrence of you in the linked Alice in Wonderland file with superlongstringstring:

     _______________________________________________________________
    |_______________________________________________________________| she
    |_______________________________________________| superlongstringstring
    |____________________________________________________| said
    |______________________________________________| alice
    |________________________________________| was
    |_____________________________________| that
    |_______________________________| as
    |____________________________| her
    |_________________________| with
    |_________________________| at
    |_________________________| s
    |________________________| t
    |_______________________| on
    |______________________| all
    |____________________| this
    |____________________| for
    |____________________| had
    |____________________| but
    |___________________| be
    |__________________| not
    |_________________| they
    |_________________| so

    The winner: Shortest solution (by character count, per language). Have fun! Edit: Table summarizing the results so far (2012-02-15) (originally added by user Nas Banov):

    Language Relaxed Strict
    ========= ======= ======
    GolfScript 130 143
    Perl 185
    Windows PowerShell 148 199
    Mathematica 199
    Ruby 185 205
    Unix Toolchain 194 228
    Python 183 243
    Clojure 282
    Scala 311
    Haskell 333
    Awk 336
    R 298
    Javascript 304 354
    Groovy 321
    Matlab 404
    C# 422
    Smalltalk 386
    PHP 450
    F# 452
    TSQL 483 507

    The numbers represent the length of the shortest solution in a specific language. "Strict" refers to a solution that implements the spec completely (draws |____| bars, closes the first bar on top with a ____ line, accounts for the possibility of long words with high frequency etc). "Relaxed" means some liberties were taken to shorten the solution. Only solutions shorter than 500 characters are included. The list of languages is sorted by the length of the 'strict' solution. 'Unix Toolchain' is used to signify various solutions that use traditional *nix shell plus a mix of tools (like grep, tr, sort, uniq, head, perl, awk).
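    For readers who want to follow the spec rather than golf it, here is an ungolfed Python reference sketch (my own, not one of the submitted solutions); it reads text from stdin and applies the rules above, including the per-word width constraint:

        import re
        import sys
        from collections import Counter

        IGNORE = {"the", "and", "of", "to", "a", "i", "it", "in", "or", "is"}

        def chart(text, top=22, width=80):
            words = re.findall(r"[a-z]+", text.lower())          # alphabetic runs only
            freq = Counter(w for w in words if w not in IGNORE).most_common(top)
            # largest scale so that |bar| + space + word + space fits in `width` columns
            scale = min((width - 4 - len(w)) / c for w, c in freq)
            lines = [" " + "_" * int(freq[0][1] * scale)]        # closing line above the first bar
            for word, count in freq:
                lines.append("|" + "_" * int(count * scale) + "| " + word)
            return "\n".join(lines)

        if __name__ == "__main__":
            print(chart(sys.stdin.read()))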

    Read the article
