Search Results

Search found 12798 results on 512 pages for 'language agnostic'.


  • Any good C interpreters?

    - by NoMoreZealots
    I was looking at Ch from SoftIntegration and it looks pretty interesting as a possible teaching tool. It would let someone learning to program "play" while preparing them to write full-fledged C programs. I was wondering if anybody has had good experiences using a C interpreter, or whether it would be better to start with a language that is typically interpreted?

    Read the article

  • A regular expression question

    - by Hellnar
    Hello, I am in dire need of such a regular expression: my alphabet is made up of 0s and 1s, and I need a pattern that accepts all words that contain three 0s. E.g.: 000, 10001, 0001, 1000, 10000101
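
    A sketch of what I am after, in Python, assuming "has three 0s" means "at least three 0s" (which the example 10000101 suggests):

        import re

        # 1* before/between/after the three required 0s; any 0/1 suffix after that.
        at_least_three_zeros = re.compile(r"^1*01*01*0[01]*$")

        for word in ["000", "10001", "0001", "1000", "10000101", "11", "0101"]:
            print(word, bool(at_least_three_zeros.match(word)))  # last two: False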

    Read the article

  • Why is Swift 100 times slower than C in this image processing test?

    - by xiaobai
    Like many other developers, I have been very excited about the new Swift language from Apple. Apple has boasted that it is faster than Objective-C and can be used to write operating systems. From what I have learned so far, it's a very type-safe language that gives precise control over exact data types (like integer length). So it does look like it has good potential for handling performance-critical tasks, like image processing, right? That's what I thought before I carried out a quick test. The result really surprised me.

    Here is a much simplified image alpha-blending code snippet in C (test.c):

        #include <stdio.h>
        #include <stdint.h>
        #include <string.h>

        uint8_t pixels[640*480];
        uint8_t alpha[640*480];
        uint8_t blended[640*480];

        void blend(uint8_t* px, uint8_t* al, uint8_t* result, int size)
        {
            for (int i = 0; i < size; i++) {
                result[i] = (uint8_t)(((uint16_t)px[i]) * al[i] / 255);
            }
        }

        int main(void)
        {
            memset(pixels, 128, 640*480);
            memset(alpha, 128, 640*480);
            memset(blended, 255, 640*480);
            // Test 10 frames
            for (int i = 0; i < 10; i++) {
                blend(pixels, alpha, blended, 640*480);
            }
            return 0;
        }

    I compiled it on my MacBook Air (2011) with the following command:

        gcc -O3 test.c -o test

    The 10-frame processing time is about 0.01 s; in other words, it takes the C code 1 ms to process one frame:

        $ time ./test
        real    0m0.010s
        user    0m0.006s
        sys     0m0.003s

    Then I wrote a Swift version of the same code (test.swift):

        let pixels = UInt8[](count: 640*480, repeatedValue: 128)
        let alpha = UInt8[](count: 640*480, repeatedValue: 128)
        let blended = UInt8[](count: 640*480, repeatedValue: 255)

        func blend(px: UInt8[], al: UInt8[], result: UInt8[], size: Int) {
            for (var i = 0; i < size; i++) {
                var b = (UInt16)(px[i]) * (UInt16)(al[i])
                result[i] = (UInt8)(b / 255)
            }
        }

        for i in 0..10 {
            blend(pixels, alpha, blended, 640*480)
        }

    The build command line is:

        xcrun swift -O3 test.swift -o test

    Here I use the same -O3 optimization level to make the comparison hopefully fair. However, the result is 100 times slower:

        $ time ./test
        real    0m1.172s
        user    0m1.146s
        sys     0m0.006s

    In other words, it takes Swift ~120 ms to process one frame that takes C just 1 ms. I also verified that the memory initialization time in both test programs is very small compared to the time spent in the blend function. What happened?

    Read the article

  • Why are there no semicolons and {} blocks in some languages?

    - by Incognito
    I know the question has no practical value, but it is interesting why some languages drop semicolons and {} blocks although their predecessors have them. It actually makes me nervous to write code in Python, as there are no ";" and {}. Also, in Google's new language Go, semicolons are missing, although the spec says the lexer uses a rule to insert semicolons automatically as it scans. So is there any secret :) reason for this? A concrete example of what I mean is sketched below.
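
    Here is the kind of thing I mean, sketched in Python, where the newline ends a statement and indentation forms the block, so ; and {} would be redundant (a semicolon is still legal as an optional statement separator):

        x, y = 2, 3
        a = x + y; b = x * y   # the ; exists in Python, but only as a separator
        if a < b:              # no braces: the indented suite is the block
            print("block by indentation")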

    Read the article

  • What language/framework (technology) to use for a website (flash games portal)

    - by cripox
    Hello, I know there are a lot of similar questions on the net, but because I am a newbie in web development I didn't find the solution to my specific problem. I am planning to create a flash games portal from scratch. There is a big chance of heavy traffic from the beginning (millions of pageviews). I want to reduce server costs as much as possible, but at the same time not be tied to an expensive contract, since there is a chance the project will not be as successful as I hope, in which case the money would be very little.

    The question is: what technology should I use? I don't know any web dev technology yet, so it doesn't matter what I will learn. My web dev experience is a little PHP 8 years ago; since then I have programmed in C++/Java, game and mobile development. I like Java and C syntax very much and tend to dislike dynamic typing or non-robust scripting (like PHP), but I can get along if those are the best choices. The candidates for now are: Grails (my favorite so far), Ruby on Rails, CakePHP, and other technologies (Google App Engine, Python/Django, etc.).

    I was considering at first using pure C and compiling the web app on the server, just to squeeze more out of the hardware, but I soon understood that this is overkill. Next my eyes fell on Ruby, as there is a lot of buzz about its ease of use. Then I discovered Grails and looked at Java, because it is said to be "faster". But I don't know what "faster" really means for my needs, so here comes the first question:

    1) What will be my biggest consumption on the server, other than bandwidth, for a lot of flash content requests? Is it memory? I heard that Java needs a lot of memory but is faster. Is it CPU? I am planning to take some daily VPS.NET nodes at first, to see if there is demand; if the "spike" is permanent, I would move to a dedicated server (serverloft.com has some good offers), and otherwise stay with fewer nodes. I was also considering developing on Google App Engine: cheap or free hosting to use at first (so I can test my assumptions) and very easy to use (no need for sys administration), but the costs become high with heavier use (3 million games played per month times x MB each). And the issue with Google is that it locks me into this technology.

    My other concern is scalability (not only traffic/users, but adding functionality). My plan is to release a functional site in just 4 weeks (just the basic frontend and a quick basic backend, so I am able to modify some things and add games manually), and then to grow it and add more things to it. I am planning a somewhat different approach than other portals, so I need to write it from scratch (an off-the-shelf script will not do).

    2) Will Grails take many more resources than RoR or PHP, server-wise? I heard that building on the Java stack is hardware-expensive and overkill unless you are making a banking application. My application will not be very complex (I hope, and I will try to keep it that way), but it will have a lot of traffic. I also took into account using a CDN for files, but the cheapest CDN I found was 5 c/GB (vps.net), while the cost per GB on serverloft (http://www.serverloft.com/dedizierte-server/server-details.php?products=4) is only 1.79 cents/GB and comes with the other resources as well.

    I am new to this domain (web). I have been learning the ropes and searching the web for about half a year, but I don't have any real practical experience, so I know I must have some naive assumptions and other issues I don't yet know about. Please give me any advice you want regarding anything, not just the specific questions asked. And thank you so much for such a great community!

    Read the article

  • Ruby on Rails encoding problem: "invalid byte sequence in GBK"

    - by user357203
    This is definitely an encoding issue; both our code and our database contain Chinese characters.

    This is my environment (from "About your application's environment"):

        Ruby version              1.9.1 (i386-mingw32)
        RubyGems version          1.3.5
        Rack version              1.0
        Rails version             2.3.5
        Active Record version     2.3.5
        Active Resource version   2.3.5
        Action Mailer version     2.3.5
        Active Support version    2.3.5
        Application root          C:/path_to_my_root
        Environment               development
        Database adapter          mysql
        Database schema version   20100327010640

    This is what localhost:3000 shows after starting the server:

        ArgumentError in HomeController#construction
        invalid byte sequence in GBK
        RAILS_ROOT: C:/path_to_my_root

        Application Trace | Framework Trace | Full Trace
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template_error.rb:43:in `split'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template_error.rb:43:in `source_extract'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template_error.rb:86:in `compute_backtrace'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template_error.rb:11:in `initialize'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template.rb:212:in `new'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template.rb:212:in `rescue in render_template'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template.rb:205:in `render_template'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/base.rb:265:in `render'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/base.rb:352:in `_render_with_layout'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/base.rb:262:in `render'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/base.rb:1250:in `render_for_file'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/base.rb:951:in `render'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/benchmarking.rb:51:in `block in render_with_benchmark'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/activesupport-2.3.5/lib/active_support/core_ext/benchmark.rb:17:in `block in ms'
        C:/Ruby19/lib/ruby/1.9.1/benchmark.rb:309:in `realtime'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/activesupport-2.3.5/lib/active_support/core_ext/benchmark.rb:17:in `ms'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/benchmarking.rb:51:in `render_with_benchmark'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/mime_responds.rb:135:in `block in custom'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/mime_responds.rb:179:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/mime_responds.rb:179:in `block in respond'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/mime_responds.rb:173:in `each'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/mime_responds.rb:173:in `respond'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/mime_responds.rb:107:in `respond_to'
        C:/Users/Howard/Documents/local/vjoin/app/controllers/home_controller.rb:53:in `construction'
        .....
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/methodoverride.rb:24:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/params_parser.rb:15:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/session/cookie_store.rb:93:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/failsafe.rb:26:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/lock.rb:11:in `block in call'
        :8:in `synchronize'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/lock.rb:11:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/dispatcher.rb:114:in `block in call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/reloader.rb:34:in `run'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/dispatcher.rb:108:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rails-2.3.5/lib/rails/rack/static.rb:31:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/urlmap.rb:46:in `block in call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/urlmap.rb:40:in `each'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/urlmap.rb:40:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rails-2.3.5/lib/rails/rack/log_tailer.rb:17:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/content_length.rb:13:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/handler/webrick.rb:50:in `service'
        C:/Ruby19/lib/ruby/1.9.1/webrick/httpserver.rb:111:in `service'
        C:/Ruby19/lib/ruby/1.9.1/webrick/httpserver.rb:70:in `run'
        C:/Ruby19/lib/ruby/1.9.1/webrick/server.rb:183:in `block in start_thread'

        Request Parameters: None

        Response Headers: {"Cache-Control"=>"no-cache", "Content-Type"=>"text/html"}

    What should I do? I tried to search online but didn't find much. The only thing I found was a suggestion to put the following into application_controller:

        before_filter :set_charset, :set_locale

        def set_charset
          response.headers["Content-Type"] = "text/html; charset=utf-8"
          WIN32OLE.codepage = WIN32OLE::CP_UTF8
        end

    but this still doesn't work. I am new to Ruby on Rails, so I don't know much about it. Thanks for your help.

    Read the article

  • Handling extremely large numbers in a language which can't?

    - by Mallow
    I'm trying to think about how I would do calculations on extremely large numbers (ad infinitum; integers, not floats) if the language construct is incapable of handling numbers larger than a certain value. I am sure I am not the first nor the last to ask this question, but the search terms I am using aren't giving me an algorithm to handle those situations; most suggestions offer a language change or variable change, or talk about things that seem irrelevant to my search. So I need a little guidance. I would sketch out an algorithm like this: determine the max length of the integer variable for the language; if a number is more than half the max length of the variable, split it into an array (to give a little play room). Array order: [0] = the digits furthest to the right, [n-max] = the digits furthest to the left. E.g., num: 29392023 becomes array[0]: 23, array[1]: 20, array[2]: 39, array[3]: 29. Since I established half the length of the variable as the cutoff point, I can then track the ones, tens, hundreds, etc. places via the halfway mark, so that if a variable's max length was 10 digits (0 to 9999999999), then halving that to five digits gives me some play room. So if I add or multiply, I can have a checker function that sees that the sixth digit (from the right) of array[0] is in the same place as the first digit (from the right) of array[1]. Division and subtraction have their own issues which I haven't thought about yet. I would like to know about the best implementations for supporting larger numbers than the language natively can. A sketch of the addition case follows.
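
    To pin the idea down, here is a minimal sketch of the chunked addition in Python (Python already has big integers, so this is only pseudocode for the technique; I assume 4-digit chunks, i.e. BASE = 10000, rather than the 2-digit chunks in my example above):

        BASE = 10000  # each array slot holds 4 decimal digits

        def to_chunks(s):
            # Split a decimal string into chunks, least significant first.
            chunks = []
            while s:
                chunks.append(int(s[-4:]))
                s = s[:-4]
            return chunks

        def add(a, b):
            # Add two chunk arrays, propagating the carry slot by slot.
            result, carry = [], 0
            for i in range(max(len(a), len(b))):
                total = (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0) + carry
                result.append(total % BASE)
                carry = total // BASE
            if carry:
                result.append(carry)
            return result

        # 29392023 + 29392023 = 58784046, i.e. chunks [4046, 5878]
        print(add(to_chunks("29392023"), to_chunks("29392023")))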

    Read the article

  • What language and topics should be covered when teaching non-CS college students how to program?

    - by michaelcarrano
    I have been asked by many of my non-computer-science friends to teach them how to program. I have agreed to hold a seminar for them that will last approximately 1 to 2 hours. My thought is to use Python as the language to teach them basic programming skills. From what I have researched, Python is relatively easy to learn. It is also a language I want to learn, which will make holding this seminar all the more enjoyable. The topics I plan to cover are as follows: variables/arrays; logic (if-else statements, switch-case, nested statements); loops (for, while, do-while, and nested loops); functions (pass by value, pass by reference: are these the correct terms for Python? I am mostly a C/C++ person); and object-oriented programming. Of course, I plan to have code examples for all topics, and I will try to have each example flow into the next, so that at the end of the seminar everyone will have a complete working program (see the sketch below). I suppose my question is: if you were given 1 to 2 hours to teach a group of college students how to program, what language would you choose and what topics would you cover? Update: Thank you for the great feedback. I should have mentioned above that a majority of the students attending the seminar have some programming experience, whether with Java or with Matlab. Most of these students are 3rd/4th-year engineering students who want a refresher on programming before they graduate.
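
    To illustrate the "flow into each other" idea, here is a sketch of the sort of running example I have in mind, in Python (the names and numbers are just illustrative):

        grades = [88, 92, 79, 93, 85]      # variables and a list ("array")

        def average(numbers):              # a function taking a parameter
            total = 0
            for n in numbers:              # a loop
                total += n
            return total / len(numbers)

        avg = average(grades)
        if avg >= 90:                      # if/else logic
            print("Honor roll:", avg)
        else:
            print("Average grade:", avg)   # prints: Average grade: 87.4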

    Read the article

  • When can you call yourself good at language X?

    - by SoulBeaver
    This goes back to a conversation I've had with my girlfriend. I tried to tell her that I simply don't feel adequate enough in my programming language (C++) to call myself good. She then asked me, "Well, when do you consider yourself good enough?" That's an interesting question. I didn't know what to tell her. So I'm asking you. For any programming language, framework or the like, when do you reach a point where you sit back, look at what you've done and say, "Hey, I'm actually pretty good at this."? How do you define "good" so that you can tell others, honestly, "Yeah, I'm good at X"? Additionally, do you reach these conclusions by comparing what others can do? Additional info: I have read the canonical paper on how it takes ten thousand hours before you are an expert in the field (props to anybody who knows what this paper is called). I have also read various articles from Coding Horror about interviewing people. Some people, it was said, "cannot function outside of a framework." So they may be "good" within that framework, but not otherwise in the language. Is this true?

    Read the article

  • What is an appropriate language for expressing initial stages of algorithm refinement?

    - by hydroparadise
    First, this is not a homework assignment, but you can treat it as such ;). I found the following question in the published paper The Camel Has Two Humps. I was not a CS major in college (I majored in MIS/Management), but I have a job where I find myself coding quite often. For a non-trivial programming problem, which one of the following is an appropriate language for expressing the initial stages of algorithm refinement? (a) A high-level programming language. (b) English. (c) Byte code. (d) The native machine code for the processor on which the program will run. (e) Structured English (pseudocode). What I do know is that you usually want to start your design by writing down pseudocode and then moving to the desired technology (because we all do that, right?), but I never thought about it in terms of refinement. I mean, if you were the original designer, then you might have access to the original pseudocode. But realistically, when I have to maintain/refactor/refine somebody else's code, I just keep trucking with the language it currently resides in. Does anybody have a definitive answer to this? As a side note, I did a quick scan of the paper, as I haven't read every single detail. It presents various score statistics, but I can't find where the answers are within the paper.
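
    To make the "refinement" idea concrete, here is roughly what I understand option (e) to mean, assuming structured English is indeed the expected answer: the pseudocode becomes comments that the eventual code fills in (Python here, but any language would do):

        # Structured English:
        #   for each number in the list
        #       if it is larger than the largest seen so far, remember it
        #   report the largest

        def largest(numbers):
            best = numbers[0]        # largest seen so far
            for n in numbers[1:]:    # for each remaining number
                if n > best:         # larger than the largest so far?
                    best = n         # remember it
            return best              # report the largest

        print(largest([3, 41, 7, 29]))  # 41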

    Read the article

  • As a programmer, what's the most valuable non-English (human) language to learn?

    - by Andrew M
    I was thinking that with my developer skills, learning new languages like French, German, etc. might be easier for me now. I could set up the verbs as objects in Python and use dir(verb) to find their methods, tenses and stuff ;-) But seriously, if you're a professional developer, in my case in the UK, what's the best foreign language to learn from an employment perspective? I'm thinking along these lines. Hindi: if all our programming jobs are getting outsourced to India, you might as well position yourself to be the on-site go-between guy. Mandarin: if China becomes the pre-eminent economy, the new USA, in ten or twenty years, then speaking its language would open up a huge market to you. Russian: maybe another major up-and-comer, but already closer to Western standards; more IT-sector growth there than anywhere else in the coming years? Japanese: drivers of global technology; being able to speak their language could give you a big competitive advantage over other Westerners. But I'm just guessing/musing with all these points. If you have an opinion, or even better, some evidence, I'd like to hear it. If the programming thing falls through, then at least it'll make for more interesting holidays.

    Read the article

  • What is the best objective way to measure language popularity trends? (What's better than TIOBE?)

    - by Eric Wilson
    The best source of data on computer language popularity that I know of is the TIOBE index. But everyone knows that TIOBE is hopelessly flawed. (If someone provides a link to support this, I'll add it here.) So is there any data on programming language popularity that is generally considered meaningful? The only other option I know of is to look at the trends at indeed.com, which are inherently flawed, being based on job postings. It isn't as if I would make a future language decision based solely on an index, but it might provide a useful balance to the skewed perspective one obtains by talking to one's friends and colleagues. To illustrate that bias, I'll point out that, based on the experience of those I personally know, the only languages used professionally today (in order of popularity) are Java, C#, Groovy, JavaScript, Ruby, Objective-C, and Perl. (Though it is evident that C, C++ and PHP were used in the past.) So my question is: everyone bashes TIOBE, but is there anything else? If so, can anyone explain how we know the alternative has better methodology? Thanks.

    Read the article

  • How to use Google Translate to translate a website automatically using GeoIP

    - by AK
    I have been looking around the internet for a script which would use the Google Translate API to translate a website automatically via a GeoIP lookup, without the need to click a translate button. Google does provide a small div snippet which you can add to your website; through a drop-down menu you can choose the language, click translate, and it translates the whole website. The snippet is here: http://translate.google.com/translate_tools?hl=en&layout=1&eotf=1&sl=ru&tl=en. How can I integrate a GeoIP script with the above snippet? There are also a couple of Google Translate scripts available on the internet.
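
    For what it's worth, the closest I have come is a server-side sketch like the following, in Python. The country_of() lookup and the country-to-language table are hypothetical placeholders (a real GeoIP database would supply them), and I am assuming Google's translate proxy URL (translate.google.com/translate?sl=auto&tl=...&u=...) works for this:

        from urllib.parse import quote

        COUNTRY_TO_LANG = {"FR": "fr", "DE": "de", "RU": "ru", "CN": "zh-CN"}

        def country_of(ip):
            # Hypothetical: replace with a real GeoIP database lookup.
            return "FR"

        def translated_url(page_url, visitor_ip):
            lang = COUNTRY_TO_LANG.get(country_of(visitor_ip))
            if lang is None:
                return page_url  # no mapping: serve the original page
            return ("http://translate.google.com/translate?sl=auto&tl="
                    + lang + "&u=" + quote(page_url, safe=""))

        print(translated_url("http://example.com/", "203.0.113.7"))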

    Read the article

  • Which programming languages have helped you to understand programming better?

    - by Xaisoft
    Which programming languages not only make you more proficient in the particular language you are learning, but also have a direct impact on the way you think about and understand programming in general, therefore making you a better programmer in other languages? Basically, which languages have the biggest impact on understanding the how and why of different programming concepts? What about Scheme? I have heard good things about that. I thought about taking the simplest of problems and implementing them in various languages. Has anyone done this?

    Read the article

  • Method signature vs. function prototype

    - by Maroloccio
    Is there a formal definition of the two? Current Wikipedia articles describe their different contexts and applications, such as internal type signature "strings" in Java VMs (1) and C/C++ function prototypes informing compilers of upcoming function definitions (2), but... 1) http://en.wikipedia.org/wiki/Type_signature 2) http://en.wikipedia.org/wiki/Function_prototype ... where should I look for a definition which clearly and formally distinguishes one from the other? There is literature using the words prototype and signature almost interchangeably, yet other uses appear strict and consistent, if language-specific. Background: I am writing documentation for a sample compiler written for a university project.

    Read the article

  • Embedded Prolog Interpreter/Compiler for Java

    - by Sami
    I'm working on an application in Java that needs to do some complex logic-rule deductions as part of its functionality. I'd like to code my deductions in Prolog or some other logic/constraint programming language instead of Java, as I believe the resulting code will be significantly simpler and more maintainable. I Googled for embedded Java implementations of Prolog and found a number of them, each with very little documentation. My (modest) selection criteria are: it should be embeddable in Java (e.g. it can be bundled with my Java package instead of requiring any native installations or external programs); a simple interface to use from Java (for initiating deductions, inspecting results, and adding rules); and it should come with at least a few examples of how to use it. It doesn't necessarily have to be Prolog; other logic/constraint programming languages meeting the above criteria would suit my needs too. What choices do I have, and what are their advantages and disadvantages?

    Read the article

  • Naive Bayesian for Topic detection using "Bag of Words" approach

    - by AlgoMan
    I am trying to implement a naive Bayesian approach to find the topic of a given document or stream of words. Is there a naive Bayesian approach that I might be able to look up for this? I am also trying to improve my dictionary as I go along. Initially, I have a bunch of words that map to topics (hard-coded). Depending on the occurrences of words other than the ones already mapped, I want to add them to the mappings, hence improving and learning about new words that map to topics, and also changing the probabilities of words. How should I go about doing this? Is my approach the right one? Which programming language would be best suited for the implementation?
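
    For concreteness, here is the minimal bag-of-words naive Bayes I have in mind, sketched in Python with Laplace smoothing (the whitespace tokenization and the training examples are placeholders for real data):

        import math
        from collections import Counter, defaultdict

        class NaiveBayes:
            def __init__(self):
                self.word_counts = defaultdict(Counter)  # topic -> word frequencies
                self.doc_counts = Counter()              # topic -> number of docs
                self.vocab = set()

            def train(self, topic, text):
                words = text.lower().split()
                self.word_counts[topic].update(words)
                self.doc_counts[topic] += 1
                self.vocab.update(words)

            def classify(self, text):
                words = text.lower().split()
                best, best_score = None, float("-inf")
                total_docs = sum(self.doc_counts.values())
                for topic, counts in self.word_counts.items():
                    # log P(topic) + sum of log P(word | topic), Laplace-smoothed
                    score = math.log(self.doc_counts[topic] / total_docs)
                    denom = sum(counts.values()) + len(self.vocab)
                    for w in words:
                        score += math.log((counts[w] + 1) / denom)
                    if score > best_score:
                        best, best_score = topic, score
                return best

        nb = NaiveBayes()
        nb.train("sports", "goal match team score")
        nb.train("politics", "vote election party law")
        print(nb.classify("the team scored a goal"))  # sports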

    Read the article

  • Shorthand for nested null checking in C#

    - by Myster
    As far as I know there is not a significantly more elegant way to write the following:

        string src;
        if ((ParentContent != null)
            && (ParentContent.Image("thumbnail") != null)
            && (ParentContent.Image("thumbnail").Property("src") != null))
        {
            src = ParentContent.Image("thumbnail").Property("src").Value;
        }

    Do you think there should be a C# language feature to make this shorter? And if so, what should it look like? For example, something like extending the ?? operator:

        string src = ParentContent??.Image("thumbnail")??.Property("src")??.Value;

    Apologies for the rather contrived example, and my over-simplified solution.

    Read the article

  • How much of the "Objective-C" I'm learning is universal Objective-C, and not Apple's frameworks?

    - by Chris Cooper
    This question is related to one of my others about C: "What can you do in C without 'std' includes? Are they part of 'C', or just libraries?" I've become curious lately as to what is really contained in the core Objective-C language, and which parts of the Objective-C I've done for iPhone/OS X development are specific to Apple platforms. I know that things like syntax are the same, but for instance, is NSObject and its torrent of NS-subclasses actually part of "standard" Objective-C? Could I use them in, say, Windows? Which parts are universal for the most part, and which parts would I only find on an Apple platform? If you want, giving an example of Objective-C used elsewhere as an example of what is more "universal" would help me as well. Thanks! =)

    Read the article

  • Shortest way of determining a name ends with an `s`, `x` or `z`, and then use the `I18n.t` method wi

    - by Koning Baard XIV
    I'm creating a Rails application where users can have a first and last name. Since I'm a perfectionist, the application may not show something like "Dennis's profile" or "Xianx's profile", but rather "Dennis' profile" and "Xianx' profile". I use I18n, so I wanted to ask: what is the shortest way of implementing this? This grammar rule is the same for both English and Dutch, which the application will be translated into. Oh, some important things: I am not afraid of using helpers and the application controller, and my language files are in Ruby, not YAML. Thanks!
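
    The rule itself seems tiny; here it is sketched in Python just to pin down what the helper should do (my app is Rails, so this is only pseudocode for the logic):

        def possessive(name):
            # Names ending in s, x or z get a bare apostrophe; others get 's.
            suffix = "'" if name[-1].lower() in "sxz" else "'s"
            return name + suffix

        for n in ["Dennis", "Xianx", "Chaz", "Alice"]:
            print(possessive(n) + " profile")
        # Dennis' profile, Xianx' profile, Chaz' profile, Alice's profile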

    Read the article

  • Detecting syllables in a word

    - by user50705
    I need to find a fairly efficient way to detect syllables in a word. E.g., invisible: in-vi-sib-le. There are some syllabification rules that could be used: V, CV, VC, CVC, CCV, CCCV, CVCC (where V is a vowel and C is a consonant); e.g., pronunciation (5 syllables: pro-nun-ci-a-tion; CCV-CVC-CV-V-CVVC). I've tried a few methods, among which were using regex (which helps only if you want to count syllables), hard-coded rule definitions (a brute-force approach which proved to be very inefficient), and finally a finite state automaton (which did not result in anything useful). The purpose of my application is to create a dictionary of all syllables in a given language. This dictionary will later be used for spell-checking applications (using Bayesian classifiers) and text-to-speech synthesis. I would appreciate tips on an alternate way to solve this problem besides my previous approaches. I work in Java, but any tip in C/C++, C#, Python, Perl... would work for me.
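
    To show where I am stuck, here is a greedy longest-match over the CV pattern, sketched in Python (assuming the plain aeiou vowel set, so no handling of y or silent e). It runs, but it splits pronunciation as pro-nunc-i-at-i-on rather than pro-nun-ci-a-tion, which is why I am looking for a better approach:

        VOWELS = set("aeiou")
        # Longest patterns first, so the greedy match prefers them.
        PATTERNS = ["CCCV", "CVCC", "CCV", "CVC", "CV", "VC", "V", "C"]

        def cv_pattern(word):
            return "".join("V" if ch in VOWELS else "C" for ch in word.lower())

        def syllabify(word):
            pattern, syllables, i = cv_pattern(word), [], 0
            while i < len(pattern):
                for p in PATTERNS:
                    if pattern.startswith(p, i):
                        syllables.append(word[i:i + len(p)])
                        i += len(p)
                        break
            return syllables

        print(syllabify("pronunciation"))  # ['pro', 'nunc', 'i', 'at', 'i', 'on']
        print(syllabify("invisible"))      # ['in', 'vis', 'ib', 'le']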

    Read the article

  • What languages have a while-else type control structure, and how does it work?

    - by Dan
    A long time ago, I thought I saw a proposal to add an else clause to for or while loops in C or C++... or something like that. I don't remember how it was supposed to work -- did the else clause run if the loop exited normally but not via a break statement? Anyway, this is tough to search for, so I thought maybe I could get some CW answers here for various languages. What languages support adding an else clause to something other than an if statement? What is the meaning of that clause? One language per answer please.
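
    The one concrete case I already know of is Python, where a for or while loop can take an else clause that runs only when the loop finishes without hitting break. A quick illustration:

        def find_divisor(n):
            for d in range(2, n):
                if n % d == 0:
                    print(n, "is divisible by", d)
                    break
            else:                      # no break: the loop ran to completion
                print(n, "is prime")

        find_divisor(9)   # 9 is divisible by 3
        find_divisor(7)   # 7 is prime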

    Read the article

  • Why do C compilers prepend underscores to external names?

    - by Michael Burr
    I've been working in C for so long that the fact that compilers typically add an underscore to the start of an extern is just understood... However, another SO question today got me wondering about the real reason why the underscore is added. A Wikipedia article claims that one reason is: "It was common practice for C compilers to prepend a leading underscore to all external scope program identifiers to avert clashes with contributions from runtime language support." I think there's at least a kernel of truth to this, but it also seems not to really answer the question, since if the underscore is added to all externs it won't help much with preventing clashes. Does anyone have good information on the rationale for the leading underscore? Is the added underscore part of the reason that the Unix creat() system call doesn't end with an 'e'? I've heard that early linkers on some platforms had a limit of 6 characters for names. If that's the case, then prepending an underscore to external names would seem to be a downright crazy idea (now I only have 5 characters to play with...).

    Read the article

  • Beginner question: What is binding?

    - by JDelage
    Hi, I was trying to understand the difference between early and late binding, and in the process realized that the concept of binding is nebulous to me. I think I understand that it relates to the way data-as-a-word-of-memory is linked to type-as-a-set-of-language-features, but I am not sure those are the right concepts. Also, how does understanding this deeply help people become better programmers? Please note: this question is not "what is late vs. early binding" or "what are the trade-offs between the two". Those already exist here. Thanks, JDelage

    Read the article
