Search Results

Search found 18914 results on 757 pages for 'turing machine'.

Page 2/757 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • Practical non-Turing-complete languages?

    - by Kyle Cronin
    Nearly all programming languages in use are Turing complete, and while this lets a language represent any computable algorithm, it also comes with its own set of problems. Seeing as all the algorithms I write are intended to halt, I would like to be able to represent them in a language that guarantees they will halt. Regular expressions are used for matching strings, and finite state machines are used for lexing, but I'm wondering if there's a more general, broadly useful language that's not Turing complete? edit: I should clarify, by 'general purpose' I don't necessarily want to be able to write all halting algorithms in the language (I don't think such a language can exist), but I suspect that there are common threads in halting proofs that can be generalized to produce a language in which all algorithms are guaranteed to halt. There's also another way to tackle this problem - eliminate the need for theoretically infinite memory. Once you limit the amount of memory the machine is allowed, the number of states the machine can be in is finite and countable, so you can determine whether the algorithm will halt (by not allowing the machine to move into a state it has been in before).
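    A minimal sketch of the bounded-memory idea in the last paragraph, in Python: once the machine's state space is finite, halting is decidable by detecting a revisited state. The halts/step names and the two example machines are invented for illustration.

        def halts(initial_state, step):
            """Run `step` from `initial_state`; return True iff the run halts.

            `step` maps a (hashable) state to the next state, or None to halt.
            Because the states come from a finite set, any run longer than the
            number of states must revisit one, i.e. it loops forever.
            """
            seen = set()
            state = initial_state
            while state is not None:
                if state in seen:
                    return False      # revisited a state: guaranteed infinite loop
                seen.add(state)
                state = step(state)
            return True               # step returned None: the machine halted

        # A 3-bit counter that halts when it wraps to 0: prints True
        print(halts(1, lambda s: None if (s + 1) % 8 == 0 else (s + 1) % 8))
        # A machine that ping-pongs between two states forever: prints False
        print(halts(0, lambda s: 1 - s))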

    Read the article

  • JFLAP Turing Machine shortcut problem

    - by Robert Lamb
    In JFLAP (http://jflap.org), there are some shortcuts for Turing machine transitions. One of these shortcuts allows you to transition as long as the current tape symbol isn't the indicated symbol. For example, the transition !g,x;R basically says "Take this transition if the current tape symbol is not g". So far, so good. But the transition I want is !?,~;R which basically says "Move right as long as the current symbol is not the end-of-string (empty cell) symbol". The problem is I cannot figure out how to type in "!?". The JFLAP online documentation (http://www.jflap.org/tutorial/turing/one/index.html#syntax) has this to say: The first shortcut is that there exists the option of using the “!” character to convey the meaning of “any character but this character.” For example, concerning the transition (!a; x, R), if the head encounters any character but an “a”, it will replace the character with an “x” and move right. To write the expression “!?”, just type a “1” in when inputting a command. My question is...how do I actually do what that last sentence is trying to explain to me? Thanks for your help! Robert

    Read the article

  • Make Your Own Paper-Craft Enigma Machine [DIY Project]

    - by Asian Angel
    If you love tinkering around with ciphers and want a fun DIY project for the upcoming weekend, then we have just the thing for you. Using common household items you can construct your own personal Enigma machine that will be completely compatible with all the settings of a real Enigma machine (models I, M1, M2 and M3). Visit the second link below for the step-by-step instructions and enjoy putting together this awesome DIY project! PDF Templates for the Enigma Machine Note: This is a direct link for the PDF file itself and the templates are sized for printing on 2 A4 sheets of paper. Enigma/Paper Enigma Instruction Homepage [via BoingBoing]

    Read the article

  • Resources on learning to program in machine code?

    - by AceofSpades
    I'm a student, fresh into programming and loving it, having moved from Java to C++ and down to C. I went backwards to the bare bones and thought about going further down to assembly. But, to my surprise, a lot of people said it's not as fast as C and that there is no use for it. They suggested learning either how to program a kernel or how to write a C compiler. My dream is to learn to program in binary (machine code), or maybe program bare metal (program a micro-controller physically), or write a BIOS or boot loaders or something of that nature. The only thing I heard after much research is that a hex editor is the closest thing to machine language I could find in this day and age. Are there other things I'm unaware of? Are there any resources to learn to program in machine code? Preferably on an 8-bit micro-controller/microprocessor. This question is similar to mine, but I'm interested in practical learning first and then understanding the theory.
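    Hand-assembled machine code can in fact be run from a high-level host, which makes a nice first practical experiment. A sketch, assuming a typical x86-64 Linux box where anonymous writable+executable mappings are permitted; the six bytes are "mov eax, 42" followed by "ret":

        import ctypes, mmap

        # mov eax, 42 ; ret  -- hand-assembled x86-64 machine code
        code = b"\xb8\x2a\x00\x00\x00\xc3"

        # anonymous read/write/execute mapping to hold the bytes
        buf = mmap.mmap(-1, mmap.PAGESIZE,
                        prot=mmap.PROT_READ | mmap.PROT_WRITE | mmap.PROT_EXEC)
        buf.write(code)

        # point a C function pointer (no args, returns int) at the buffer
        addr = ctypes.addressof(ctypes.c_char.from_buffer(buf))
        func = ctypes.CFUNCTYPE(ctypes.c_int)(addr)
        print(func())   # prints 42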

    Read the article

  • Resources for Virtual Machine programming

    - by good_computer
    I am a C programmer, a beginner (well, a little more than that). I am really interested in the field of virtual machines. When I read about the Python VM, the PyPy project, the advancements in JVM technology, Google V8, the Erlang VM, I really get excited about these amazing pieces of technology, and really want to get my hands dirty building one or contributing to one of these projects. I need to know: (1) what things (languages, concepts, algorithms, math, etc.) I need to learn to be able to build a virtual machine; (2) any books or other resources that will be helpful; (3) career prospects for a virtual machine engineer (but this is least important for me for now). (One more side question: somewhere I'd read that the JVM is on the cutting edge of virtual machine technology - that it is the most advanced VM so far - is that true?) Please give me a LONG answer detailing all that you know.
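    For a sense of where all these projects start, here is a minimal sketch (in Python, with an invented opcode set) of the core of every bytecode VM: a dispatch loop over a stack. Real VMs add a compiler, call frames, a garbage collector, and - in V8 or HotSpot - a JIT, but they all grow out of this shape.

        PUSH, ADD, MUL, PRINT, HALT = range(5)

        def run(program):
            stack, pc = [], 0
            while True:
                op = program[pc]; pc += 1
                if op == PUSH:                # next word is an immediate operand
                    stack.append(program[pc]); pc += 1
                elif op == ADD:
                    b, a = stack.pop(), stack.pop(); stack.append(a + b)
                elif op == MUL:
                    b, a = stack.pop(), stack.pop(); stack.append(a * b)
                elif op == PRINT:
                    print(stack[-1])
                elif op == HALT:
                    return

        # (2 + 3) * 4
        run([PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT, HALT])   # prints 20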

    Read the article

  • Turing-Complete language possibilities?

    - by I can't tell you my name.
    In every Turing-complete language, is it possible to create a working compiler for the language itself, one which first runs on an interpreter written in some other language and then compiles its own source code (bootstrapping)? A standards-compliant C++ compiler which outputs binaries for, e.g., Windows? A regex parser and evaluator? A World of Warcraft clone (assuming the language gets the necessary API bindings, for example OpenGL, and the WoW source code is available)? (Everything here is theoretical.) Let's take Brainf*ck as an example language.
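    The interpreter half of the bootstrapping question is easy to make concrete. A sketch of a complete Brainf*ck interpreter written in another language (Python here), assuming the usual eight-instruction semantics and a 30,000-cell byte tape:

        def brainfuck(src, stdin=""):
            """Interpret Brainf*ck: eight ops over a tape of byte cells."""
            tape, ptr, out, inp = [0] * 30000, 0, [], iter(stdin)
            jumps, stack = {}, []             # precompute matching brackets
            for i, c in enumerate(src):
                if c == '[':
                    stack.append(i)
                elif c == ']':
                    j = stack.pop()
                    jumps[i], jumps[j] = j, i
            pc = 0
            while pc < len(src):
                c = src[pc]
                if c == '>':   ptr += 1
                elif c == '<': ptr -= 1
                elif c == '+': tape[ptr] = (tape[ptr] + 1) % 256
                elif c == '-': tape[ptr] = (tape[ptr] - 1) % 256
                elif c == '.': out.append(chr(tape[ptr]))
                elif c == ',': tape[ptr] = ord(next(inp, '\0'))
                elif c == '[' and tape[ptr] == 0: pc = jumps[pc]
                elif c == ']' and tape[ptr] != 0: pc = jumps[pc]
                pc += 1
            return ''.join(out)

        # 8 * 8 + 1 = 65 = 'A'
        print(brainfuck("++++++++[>++++++++<-]>+."))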

    Read the article

  • A Turing Machine Question

    - by Hellnar
    Greetings, I have been struggling to find an answer to this theoretical question; even though it is not directly a programming question, I believe it is closely related. Assume a type of Turing machine whose tape cannot have more than 1000 squares. What would be the relationship between the set of languages recognizable by such machines and the set of languages recognizable by ordinary Turing machines?

    Read the article

  • Constructs for wrapping a hardware state machine

    - by Henry Gomersall
    I am using a piece of hardware with a well defined C API. The hardware is stateful, with the relevant API calls needing to be in the correct order for the hardware to work properly. The API calls themselves will always return, passing back a flag that advises whether the call was successful, or if not, why not. The hardware will not be left in some ill-defined state. In effect, the API calls advise indirectly of the current state of the hardware if the state is not correct to perform a given operation. It seems to be a pretty common hardware API style. My question is this: is there a well established design pattern for wrapping such a hardware state machine in a high level language, such that consistency is maintained? My development is in Python. I would ideally like the hardware state machine to be abstracted to a much simpler state machine and wrapped in an object that represents the hardware. I'm not sure what should happen if an attempt is made to create multiple objects representing the same piece of hardware. I apologise for the slight vagueness; I'm not very knowledgeable in this area and so am fishing for help with the description as well!
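    One common shape for such a wrapper, as a sketch rather than a canonical pattern: mirror the hardware's states in an explicit transition table, check every call against it, and hand out at most one wrapper object per physical device. The hw_open/hw_start/hw_stop/hw_close names are hypothetical stand-ins for the real C API (which would be bound via ctypes); here they are stubbed out so the sketch runs.

        import threading

        # Stubs standing in for the real ctypes-bound C API (hypothetical names);
        # each returns a success flag, like the real API's status code.
        def hw_open(dev_id):  return True
        def hw_start(dev_id): return True
        def hw_stop(dev_id):  return True
        def hw_close(dev_id): return True

        class HardwareError(RuntimeError):
            pass

        class Device:
            """Wraps the stateful C API in a simpler explicit state machine."""

            _instances, _lock = {}, threading.Lock()

            # the abstracted state machine: state -> {operation: next_state}
            _table = {
                "closed":  {"open": "idle"},
                "idle":    {"start": "running", "close": "closed"},
                "running": {"stop": "idle"},
            }

            def __new__(cls, device_id):
                # hand out at most one wrapper per physical device
                with cls._lock:
                    if device_id not in cls._instances:
                        obj = super().__new__(cls)
                        obj.device_id, obj.state = device_id, "closed"
                        cls._instances[device_id] = obj
                    return cls._instances[device_id]

            def _do(self, op, c_call):
                nxt = self._table[self.state].get(op)
                if nxt is None:                       # refuse out-of-order calls
                    raise HardwareError(f"cannot {op} while {self.state}")
                if not c_call():                      # the API says why it failed
                    raise HardwareError(f"hardware rejected {op} in {self.state}")
                self.state = nxt                      # mirror the hardware's state

            def open(self):  self._do("open",  lambda: hw_open(self.device_id))
            def start(self): self._do("start", lambda: hw_start(self.device_id))
            def stop(self):  self._do("stop",  lambda: hw_stop(self.device_id))
            def close(self): self._do("close", lambda: hw_close(self.device_id))

        dev = Device(0)
        dev.open(); dev.start(); dev.stop()
        print(dev.state, Device(0) is dev)   # idle True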

    Read the article

  • Database Machine, 11gR2 and Tivoli Data Protection work together

    - by Fekete Zoltán
    The question came up whether backups in a Database Machine environment can be performed with the Tivoli Data Protection for Oracle software. The answer: YES. IBM Tivoli Data Protection for Oracle V5.5.2 on Linux x86_64 has now been certified with Oracle 11gR2 RAC. The current support information can be found here; clicking on V5.5.2 you will find the following details: Oracle Enterprise Linux 5 with any of the following Oracle releases: * 64-bit Oracle Standard or Enterprise Server 10gR2, 11g, or 11gR2 * 64-bit Real Application Clusters (RAC) 10gR2, 11g, or 11gR2. This information is sufficient and excellent :), since the Database Machine components run on Oracle Enterprise Linux, in a 64-bit architecture, and TDP can use Oracle RMAN (Recovery Manager).

    Read the article

  • Hands-on experience with the Database Machine!

    - by Fekete Zoltán
    Two weeks ago we gained hands-on experience with the data warehouse of a company operating in Hungary, in a real Database Machine / Exadata environment, on a Sun Oracle Database Machine Half Rack configuration. The results are truly impressive. The exciting and unique features of Exadata Storage - Smart Scan, Smart Flash Cache, Storage Index, compression (Exadata Hybrid Columnar Compression), and the incredible degree of parallelism - give the user a performance advantage that produces wide eyes, cheers, and a long-lasting inner smile. Is this beneficial for the health of you and your database environment? :) Ask your doctor, your pharmacist, the white papers, and the published customer stories. For the technical and commercial details, ask me. :) At tomorrow's session (Wednesday, 24 March 2010) at the HOUG Conference you can hear these hands-on experiences in person, from an authentic source. I wish everyone similarly nice days!

    Read the article

  • Machine learning challenge: diagnosing program in Java/Groovy (data mining, machine learning)

    - by Registered User
    Hi all! I'm planning to develop a program in Java which will provide a diagnosis. The data set is divided into two parts, one for training and the other for testing. My program should learn to classify from the training data (each record holds the answers to 30 questions, one per column, with each record on a new line; the last column is the diagnosis, 0 or 1; in the testing part of the data the diagnosis column is empty; the data set contains about 1000 records) and then make predictions on the testing part of the data. I've never done anything similar, so I'll appreciate any advice or information about solutions to similar problems. I was thinking about the Java Machine Learning Library or the Java Data Mining Package, but I'm not sure if that's the right direction, and I'm still not sure how to tackle this challenge... Please advise. All the best!
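    The train-then-predict shape of the task looks like the sketch below (shown in Python with scikit-learn purely for brevity; Weka or Java-ML in Java follow the same fit/predict pattern). The file names and the 30-features-plus-label column layout are assumptions taken from the question, with the test file assumed to hold just the 30 answer columns.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        train = np.loadtxt("train.csv", delimiter=",")   # 30 answers + diagnosis
        X_train, y_train = train[:, :30], train[:, 30]

        model = LogisticRegression(max_iter=1000)
        model.fit(X_train, y_train)                      # learn from training part

        X_test = np.loadtxt("test.csv", delimiter=",")   # 30 answer columns only
        print(model.predict(X_test))                     # predicted 0/1 diagnoses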

    Read the article

  • Is it possible to predict the future using machine learning and/or AI?

    - by Shekhar
    Recently I have started reading about machine learning. From a 3000-foot view, machine learning seems like a really great thing, but so far I have found that machine learning is limited to only 3 types of algorithms, namely classification, clustering and recommendations. I would like to know if my assumption about the types of machine learning algorithms is correct or not, and what is the most extreme thing we can do using machine learning and/or AI? Is it possible to predict the future (the same way we predict weather) using AI and/or machine learning?
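    In ML terms, "predicting the future" is usually just supervised regression on lagged features - statistical weather forecasting is framed the same way. A toy sketch, with an invented series, predicting the next value from the previous three:

        import numpy as np
        from sklearn.linear_model import LinearRegression

        series = np.sin(np.arange(100) * 0.3)        # stand-in for real history

        lag = 3                                      # use the 3 previous values
        X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
        y = series[lag:]

        model = LinearRegression().fit(X, y)
        next_value = model.predict(series[-lag:].reshape(1, -1))
        print(next_value)                            # forecast for the next step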

    Read the article

  • When cooling in the server room is not enough: the Sun Cooling Door for the Database Machine

    - by Fekete Zoltán
    Thanks to the Database Machine's enormous performance, it generally needs far less cooling than a separate high-end server plus a separate high-end storage system would! If, however, the remaining cooling capacity of our server room is not sufficient and ordinary measures will not do, a further cooling trick is needed. The Sun Cooling Door models offer a solution for this, for example the 5200 and 5600 models.

    Read the article

  • Datasheet for Exadata and Database Machine support

    - by Fekete Zoltán
    The document COMPLETE SUPPORT SERVICES FOR ORACLE EXADATA can be downloaded, viewed, and even read :). It describes in detail which product support options are available for a Database Machine solution. The members of the Exadata product family provide an optimal, high-performance platform for running the databases of both transactional systems and data warehouses, and for consolidating databases on a single shared server.

    Read the article

  • Basics of Machine Learning

    - by user1263514
    I have been going through machine learning algorithms for a week, and there are some doubts in my mind regarding ML. Here are some of the basic questions that I need answers for: What are the basic criteria for selecting a clustering algorithm? What are the factors affecting the performance of an algorithm, and are there any ways to improve them? Please give me some idea of how to approach these basic questions.
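    On the first question, one concrete and widely used selection criterion is an internal validity score such as the silhouette score. A sketch comparing k-means runs on made-up data (the data shape and the range of k are assumptions):

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score

        # three well-separated 2-D blobs of 50 points each
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in (0, 3, 6)])

        for k in range(2, 6):
            labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
            print(k, silhouette_score(X, labels))   # higher is better; peaks at k=3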

    Read the article

  • Alan Turing Needs Your Help

    - by Chris Massey
    Well... sort of. Clearly, you are using a computer. If you are on this site, you are probably quite familiar with computers as artifacts of our modern society. Hopefully, you are also familiar with the fact that Alan Turing, logician and mathematician extraordinaire, was instrumental in laying down the foundations of modern computer science, and did a little work to help turn the tide of WWII in the Allies' favor. Hold that thought. A phenomenal collection of Turing's papers (including his first ever...(read more)

    Read the article

  • The Turing Award goes to a legendary researcher; Google and Intel applaud this "Nobel Prize of computing"

    A legendary researcher receives the Turing Award; Google and Intel applaud this "Nobel Prize of computing" given to a Microsoft employee. Who invented the "desktop" metaphor of the PC? Who first introduced the notion of a graphical interface for that desktop? Who created the first taskbar? Who, in short, invented the "prototype of the networked personal computer"? One and the same man: Chuck Thacker. For all his pioneering work that shaped what today's computers are, the ACM (Association for Computing Machinery) has just awarded him the prestigious Turing Award, the equivalent of the Nobel Prize in...

    Read the article

  • Project Turing: Beginning RIA Services

    Turing Project Page: [Novice: 9 | Advanced: 6] | FAQ | Table of Contents | Definitions | What is this and where do I start? Reposted with VB.Net code. From Database to DataGrid: the next step in Project Turing is to create a first iteration of the Silverlight application that will retrieve data from our database. Using our technology of [...]

    Read the article

  • Adaptive interface with OpenGL and machine learning in C#

    - by Afnan
    For my semester project I have to build some kind of adaptive interface design. My language is C# and I have to use OpenTK (a wrapper for OpenGL). My idea is to show two points and some obstacles, and my subject (the user) would drag an object from the starting place to the final place while avoiding the obstacles. The user can also place obstacles randomly. My software should be able to learn some paths by doing test runs, and after learning it should be able to predict the shortest path. I do not know how stupid this idea sounds, but it is just an idea. I need help with ideas for possible small adaptive-interface projects, or, if my idea is OK, can you tell me what should be used to implement it? That is, along with OpenGL for the graphics, what can I use for the machine learning?
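    One way to get the "learn from test runs, then predict a short path" behaviour is tabular Q-learning over a grid; OpenTK/OpenGL would only do the drawing. A sketch (in Python for brevity - it ports directly to C#), where the grid size, obstacle positions, rewards and hyperparameters are all illustrative assumptions:

        import random

        W, H = 8, 6                                    # grid size (assumed)
        obstacles = {(3, 1), (3, 2), (3, 3), (5, 4)}   # user-placed blocks
        start, goal = (0, 0), (7, 5)
        moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]

        Q = {}                                         # (state, move) -> learned value
        def q(s, m):
            return Q.get((s, m), 0.0)

        def valid_moves(s):
            return [m for m in moves
                    if 0 <= s[0] + m[0] < W and 0 <= s[1] + m[1] < H
                    and (s[0] + m[0], s[1] + m[1]) not in obstacles]

        alpha, gamma, eps = 0.5, 0.9, 0.2
        for episode in range(2000):                    # the "test runs"
            s = start
            for _ in range(200):                       # cap episode length
                if s == goal:
                    break
                if random.random() < eps:
                    m = random.choice(moves)           # explore
                else:
                    m = max(moves, key=lambda a: q(s, a))
                nxt = (s[0] + m[0], s[1] + m[1])
                if not (0 <= nxt[0] < W and 0 <= nxt[1] < H) or nxt in obstacles:
                    nxt, r = s, -5.0                   # bumped a wall or obstacle
                else:
                    r = 10.0 if nxt == goal else -1.0  # step cost favours short paths
                Q[(s, m)] = q(s, m) + alpha * (
                    r + gamma * max(q(nxt, a) for a in moves) - q(s, m))
                s = nxt

        # After training, the greedy policy traces the learned (short) path.
        s, path = start, [start]
        for _ in range(W * H):                         # safety bound
            if s == goal:
                break
            m = max(valid_moves(s), key=lambda a: q(s, a))
            s = (s[0] + m[0], s[1] + m[1])
            path.append(s)
        print(path)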

    Read the article

  • Computer Science or Computer Engineering for Data Science and Machine Learning

    - by ATMathew
    I'm a 25 year old data consultant who is considering returning to school to get a second bachelor's degree in computer science or engineering. My interest is data science and machine learning. I use programming as a means to an end, and use languages like Python, R, C, Java, and Hadoop to find meaning in large data sets. Would a computer science or a computer engineering degree be better for this? I realize that a statistics degree may be even more beneficial, but I'll be at a school which doesn't have a stats department or a computational math department.

    Read the article

  • Collaborate10 – THEconference

    - by jean-pierre.dijcks
    After spending a few days in Mandalay Bay's THEHotel, I guess I now call everything THE... Seriously, they even tag their toilet paper with THEtp... I guess the brand builders in Vegas thought that once you are on to something you keep on doing it, and granted, it is a nice hotel with nice rooms.

    THEanalytics. Most of my collab10 experience was in a room called Reef C, where the BIWA bootcamp was held: two solid days of BI, warehousing and analytics organized by the BIWA SIG at IOUG. I didn't get to see all sessions, but what struck me was the high interest in analytics. Marty Gubar's OLAP session was full, and he did some very nice things with the OLAP option. The cool bit was that he actually gets all the advanced calculations in OLAP to show up in OBI EE without any effort. It was nice to see that the idea from OWB where you generate an RPD is now also in AWM. I think it makes life so much simpler to generate these RPDs from your data model. Even if the end RPD needs some tweaking, it is all a lot less effort to get something going. You can see this stuff for yourself in this demo (click here). OBI EE uses just SQL to get to the calculations, and so, if you prefer APEX, you can build your application there and get the same nice calculations in an APEX application. Marty also showed the Simba MDX driver used with Excel. I guess we should call that THEcoolone... and it is very slick and wonderfully useful for all of you who actually know Excel. The nice thing is that you leverage pure Excel for all operations (no plug-ins). That means no new tools to learn, no new controls, all just pure Excel.

    THEdatabasemachine. I got some very good questions in my "what makes Exadata fast" session, and overall the interest in Exadata is overwhelming. One of the things that I did try to do in my session is to get people to think in new patterns rather than in patterns based on Oracle 9i running on some random hardware configuration. We talked a little bit about the common habit of over-indexing and how everyone has to unlearn all of that on Exadata. The main thing, however, is that everyone needs to get used to the sheer size of some of the components in a Database Machine V2: 5TB of flash cache is a lot of very fast data storage, and half a TB of memory gets quite interesting as well. So what I did there was really focus on some of the content in these earlier posts on upward ILM and in-memory processing. In short, I do believe these newer media point to a trend: in-memory and other fast media will get cheaper and will see more use. Some of that we do automatically by adding new functionality, but in some cases I think the end user of the system needs to start thinking about how to leverage all this new hardware. I think most people got very excited about these new capabilities and opportunities.

    THEcoolkids. One of the cool things about the BIWA track was the hands-on track. Very cool to see big crowds for both the OLAP and OWB hands-on sessions. Also quite nice to see that the folks at RittmanMead spent so much time preparing for that session. While all of them put down cool stuff, none was cooler than seeing data mining on an Apple iPad... it all just looks great on an iPad! Very disappointing to see that Mark Rittman still wasn't showing OWB on his iPad ;-)

    THEend. All in all this was a great set of sessions in the BIWA track. Lots of value to our guests (we hope) and we hope they all come again next year!

    Read the article

  • What are practical guidelines for evaluating a language's "Turing Completeness"?

    - by AShelly
    I've read "what-is-turing-complete" and the Wikipedia page, but I'm less interested in a formal proof than in the practical implications of being Turing complete. What I'm actually trying to decide is whether the toy language I've just designed could be used as a general-purpose language. I know I can prove it can be if I can write a Turing machine in it, but I don't want to go through that exercise until I'm fairly certain of success. Is there a minimum set of features without which Turing completeness is impossible? Is there a set of features which virtually guarantees completeness? (My guess is that conditional branching and a readable/writeable memory store will get me most of the way there.) EDIT: I think I've gone off on a tangent by saying "Turing complete". I'm trying to guess, with reasonable confidence, that a newly invented language with a certain feature set (or alternately, a VM with a certain instruction set) would be able to compute anything worth computing. I know that proving you can build a Turing machine with it is one way, but not the only way. What I was hoping for was a set of guidelines like: "if it can do X, Y, and Z, it can probably do anything".
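    The guess about conditional branching plus a readable/writeable store is essentially right: a machine with just "increment" and "decrement-or-branch-if-zero" over unbounded counters (a Minsky machine) is already Turing complete. A sketch of such a machine (the instruction encoding is invented here) computing addition:

        def run(program, counters):
            pc = 0
            while pc < len(program):
                op, reg, *target = program[pc]
                if op == "inc":
                    counters[reg] += 1
                    pc += 1
                elif op == "decjz":              # decrement, or jump if zero
                    if counters[reg] == 0:
                        pc = target[0]
                    else:
                        counters[reg] -= 1
                        pc += 1
            return counters

        # add: move everything from counter b into counter a
        add = [
            ("decjz", "b", 3),    # 0: if b == 0 jump to 3 (halt), else b -= 1
            ("inc", "a"),         # 1: a += 1
            ("decjz", "z", 0),    # 2: z is always 0, so this is "jump to 0"
        ]

        print(run(add, {"a": 2, "b": 3, "z": 0}))   # {'a': 5, 'b': 0, 'z': 0}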

    Read the article

  • How to recover from finite-state-machine breakdown?

    - by Earl Grey
    My question may seem very scientific, but I think it's a common problem, and seasoned developers and programmers will hopefully have some advice for avoiding the problem I mention in the title. By the way, what I describe below is a real problem I am trying to proactively solve in my iOS project; I want to avoid it at all costs.

    By finite state machine I mean this: I have a UI with a few buttons, several session states relevant to that UI and to what it represents, some data whose values are partly displayed in the UI, and some external triggers that I receive and handle (represented by callbacks from sensors). I made state diagrams to better map the relevant scenarios that are desirable and allowable in that UI and application. As I slowly implement the code, the app starts to behave more and more like it should. However, I am not very confident that it is robust enough. My doubts come from watching my own thinking and implementation process as it goes. I was confident that I had everything covered, but a few brute-force tests in the UI were enough to make me realize that there were still gaps in the behavior... I patched them. However, as each component depends on and behaves based on input from some other component, a certain input from the user or from some external source triggers a chain of events and state changes. I have several components, and each behaves like this: a trigger is received on its input, the trigger and its sender are analyzed, and something is output (a message, a state change) based on the analysis.

    The problem is, this is not completely self-contained, and my components (a database item, a session state, some button's state) COULD be changed, influenced, deleted, or otherwise modified outside the scope of the event chain or the desirable scenario (the phone crashes, the battery runs out, the phone turns off suddenly). This introduces an invalid situation into the system, from which the system potentially CANNOT recover. I see this (although people do not realize this is the problem) in many of my competitors' apps on the App Store. Customers write things like "I added three documents, and after going here and there, I cannot open them, even though I see them", or "I recorded videos every day, but after recording a too-long video, I cannot turn off captions on them... and the button for captions doesn't work". These are just shortened examples; customers often describe it in more detail. From the descriptions and the behavior described in them, I assume that the particular app has an FSM breakdown.

    So the ultimate question is: how can I avoid this, and how can I protect the system from blocking itself?

    EDIT: I am talking in the context of one view controller's view on the phone, i.e. one part of the application. I understand the MVC pattern, and I have separate modules for distinct functionality... everything I describe is relevant to one canvas in the UI.
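    One defensive pattern against exactly this kind of wedged state - a sketch, not a canonical recipe, with invented class and file names, and shown in Python for brevity although the question is about iOS: validate every transition against an explicit table, persist the state atomically after each change, and on startup fall back to a safe state whenever the persisted state fails an invariant check. An interruption mid-chain then leaves the machine recoverable instead of stuck.

        import json, os

        class RecoverableFSM:
            TRANSITIONS = {
                "idle":      {"record": "recording"},
                "recording": {"stop": "idle", "fail": "error"},
                "error":     {"reset": "idle"},
            }
            SAFE_STATE = "idle"

            def __init__(self, path="fsm_state.json"):
                self.path = path
                self.state = self._load()

            def _load(self):
                try:
                    with open(self.path) as f:
                        state = json.load(f)["state"]
                    if state in self.TRANSITIONS:      # invariant: a known state
                        return state
                except (OSError, ValueError, KeyError):
                    pass                               # crash/corruption mid-write
                return self.SAFE_STATE                 # recover, don't wedge

            def _save(self):
                tmp = self.path + ".tmp"               # atomic write: tmp + rename
                with open(tmp, "w") as f:
                    json.dump({"state": self.state}, f)
                os.replace(tmp, self.path)

            def fire(self, event):
                nxt = self.TRANSITIONS[self.state].get(event)
                if nxt is None:                        # reject, don't corrupt
                    raise ValueError(f"{event!r} not valid in {self.state!r}")
                self.state = nxt
                self._save()

        m = RecoverableFSM()
        m.fire("record"); m.fire("stop")
        print(m.state)   # 'idle' - and a crash at any point leaves a valid file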

    Read the article

< Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >