Search Results

Search found 14590 results on 584 pages for 'language discussion'.


  • Programming Concepts That Were "Automated" By Modern Languages

    - by Ygam
    Weird question, but here it is. What are the programming concepts that were "automated" by modern languages? What I mean are the concepts you used to have to handle manually. Here is an example: I have just read that in C you have to manage memory yourself (there is no garbage collector); in "modern" languages, however, the language itself takes care of it. Do you know of any others, or aren't there any more?
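
    As a concrete taste of the "automated" side, here is a minimal Python sketch (the class name Frame is made up for illustration): memory is reclaimed as soon as nothing references an object, with no explicit free call.

        import weakref

        class Frame:
            """A throwaway object, just to have something to allocate."""
            pass

        f = Frame()
        r = weakref.ref(f)   # a weak reference does not keep the object alive
        print(r() is f)      # True: the object still exists
        del f                # drop the only strong reference
        print(r())           # None: the runtime reclaimed it, no free() needed
                             # (CPython frees it immediately; other implementations may delay collection)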

    Read the article

  • [Raise|Trigger|Fire|...] an event?!?

    - by winSharp93
    Hello, in a German programming forum we are currently having a discussion about events and what you (grammatically) do with them. MSDN talks about "event raising" and "to raise an event", so that seems to be one possibility. Are there any other synonyms? What about "to trigger an event" and "to fire an event"? A Google search brings up results for all three, but that does not necessarily mean they are all correct. Are they? Are there any (stylistic, ...) differences, or are they used in different contexts? Many thanks in advance for ending a heated debate :-)

    Read the article

  • When did you feel comfortable with programming?

    - by misterwebz
    I've been programming for about 6 months, but I still feel like I haven't learned anything. Or maybe I'm overwhelmed by the things I still don't know. Am I doing something wrong here, or is this normal? I want to be able to work as a freelancer in a year or two, but I'm not sure if that goal is achievable. So when did you feel comfortable using your programming language of choice, and how did you learn it?

    Read the article

  • A regular expression question

    - by Hellnar
    Hello, I am in dire need of such a regular expression, where my alphabet is made up of 0s and 1s. I need a language that accepts all words as long as they contain at least three 0s. E.g.: 000, 10001, 0001, 1000, 10000101
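
    One plausible reading ("at least three 0s", which fits all the examples above) is the regular expression 1*01*01*0(0|1)*; a quick Python check of that guess:

        import re

        # "at least three 0s" over the alphabet {0, 1}: skip 1s, see a 0, three times,
        # then allow anything.
        pattern = re.compile(r"^1*01*01*0[01]*$")

        for word in ["000", "10001", "0001", "1000", "10000101", "0011", "111"]:
            print(word, bool(pattern.match(word)))   # the last two should be False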

    Read the article

  • What is going to be "the next Hype"? [closed]

    - by kabado
    I know it's a very vague question, but maybe the answers could act like a poll about what could possibly be the next big event in computer science. A new language? A new technology? A new architecture? A new business model? Please provide details: what is it, and why do you think it's going to rule the world?

    Read the article

  • Why is Swift 100 times slower than C in this image processing test?

    - by xiaobai
    Like many other developers I have been very excited about the new Swift language from Apple. Apple has boasted that it is faster than Objective-C and can be used to write an operating system. From what I have learned so far, it is a very type-safe language that gives you precise control over exact data types (like integer length). So it does look like it has good potential for performance-critical tasks such as image processing, right? That is what I thought before I carried out a quick test, and the result really surprised me. Here is a much simplified image alpha-blending code snippet in C (test.c):

        #include <stdio.h>
        #include <stdint.h>
        #include <string.h>

        uint8_t pixels[640*480];
        uint8_t alpha[640*480];
        uint8_t blended[640*480];

        void blend(uint8_t* px, uint8_t* al, uint8_t* result, int size)
        {
            for (int i = 0; i < size; i++) {
                result[i] = (uint8_t)(((uint16_t)px[i]) * al[i] / 255);
            }
        }

        int main(void)
        {
            memset(pixels, 128, 640*480);
            memset(alpha, 128, 640*480);
            memset(blended, 255, 640*480);

            // Test 10 frames
            for (int i = 0; i < 10; i++) {
                blend(pixels, alpha, blended, 640*480);
            }
            return 0;
        }

    I compiled it on my MacBook Air (2011) with the following command:

        gcc -O3 test.c -o test

    The 10-frame processing time is about 0.01 s; in other words, the C code takes about 1 ms to process one frame:

        $ time ./test
        real  0m0.010s
        user  0m0.006s
        sys   0m0.003s

    Then I wrote a Swift version of the same code (test.swift):

        let pixels = UInt8[](count: 640*480, repeatedValue: 128)
        let alpha = UInt8[](count: 640*480, repeatedValue: 128)
        let blended = UInt8[](count: 640*480, repeatedValue: 255)

        func blend(px: UInt8[], al: UInt8[], result: UInt8[], size: Int) {
            for (var i = 0; i < size; i++) {
                var b = (UInt16)(px[i]) * (UInt16)(al[i])
                result[i] = (UInt8)(b / 255)
            }
        }

        for i in 0..10 {
            blend(pixels, alpha, blended, 640*480)
        }

    The build command line is:

        xcrun swift -O3 test.swift -o test

    I used the same -O3 optimization level to make the comparison as fair as possible. However, the result is about 100 times slower:

        $ time ./test
        real  0m1.172s
        user  0m1.146s
        sys   0m0.006s

    In other words, Swift takes ~120 ms to process one frame that C handles in 1 ms. I also verified that the memory-initialization time in both test programs is very small compared to the time spent in the blend function. What happened?

    Read the article

  • Closure vs Anonymous function (difference?)

    - by Maxim Gershkovich
    Hi, I have been unable to find a definition that clearly explains the differences between a closure and an anonymous function. Most references I have seen clearly specify that they are distinct "things" yet I can't seem to get my head around why. Could someone please simplify it for me? What are the specific differences between these two language features? Which one is more appropriate in what scenarios?
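
    A small illustration (in Python, chosen only for brevity) of the usual distinction: "anonymous" is about the function having no name, while "closure" is about the function capturing variables from an enclosing scope, and the two properties are independent.

        square = lambda x: x * x      # anonymous, but not a closure: nothing captured

        def make_adder(n):
            def add(x):               # named, yet a closure: it captures n
                return x + n
            return add

        add5 = make_adder(5)
        print(square(4))                      # 16
        print(add5(10))                       # 15
        print(square.__closure__ is None)     # True  - no captured environment
        print(add5.__closure__ is not None)   # True  - captured environment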

    Read the article

  • Why are there no semicolons and {} blocks in some languages?

    - by Incognito
    I know the question has no practical value, but it is interesting why some languages drop semicolons and {} blocks even though their predecessors have them. Actually it makes me nervous to write code in Python, as there are no ";" and {}. In Google's new language Go, semicolons are also missing from the source, although the spec says the lexer uses a rule to insert them automatically as it scans. So is there any secret :) reason for this?
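
    For a side-by-side flavor of what actually changes, here is the same tiny loop sketched both ways (the brace-less side assumes Python; the C-style side is shown in comments):

        # C-style: braces delimit the block and semicolons end each statement.
        #
        #     for (int i = 0; i < 3; i++) {
        #         printf("%d\n", i);
        #     }
        #
        # Python: the indentation *is* the block, and the end of line ends the statement.
        for i in range(3):
            print(i)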

    Read the article

  • What language/framework (technology) to use for a website (Flash games portal)

    - by cripox
    Hello, I know there are a lot of similar questions on the net, but because I am a newbie in web development I didn't find the solution to my specific problem. I am planning to create a Flash games portal from scratch. There is a big chance that there will be heavy traffic from the beginning (millions of pageviews). I want to reduce the server costs as much as possible, but at the same time not be tied to an expensive contract, because there is a chance that the project will not be as successful as I hope, in which case the money would be very little. The question is: what technology should I use? I don't know any web-dev technology yet, so it doesn't matter what I will learn. My web-dev experience is a little PHP 8 years ago; since then I have programmed in C++ / Java (game and mobile development). I like Java and C syntax very much and I tend to dislike dynamic typing or non-robust scripting (like PHP), but I can get along if those are the best choices. The candidates are now:
    - Grails (my favorite so far)
    - Ruby on Rails
    - CakePHP
    - Other technologies (Google App Engine, Python/Django, etc.)
    At first I considered using pure C and compiling the web app on the server, just to squeeze more out of the hardware, but I soon understood that this was overkill. Next my eyes fell on Ruby, as there is a lot of buzz about its ease of use. Then I discovered Grails and looked at Java because it is said to be "faster". But I don't know what "faster" really means for my needs, so here comes the first question: 1) What will be my biggest resource consumption on the server, other than bandwidth, for a lot of Flash content requests? Is it memory? I heard that Java needs a lot of memory but is faster. Is it CPU? I am planning to rent some daily VPS.NET nodes at first, to see if there is demand, and if the "spike" is permanent, to move to a dedicated server (serverloft.com has some good offers); otherwise to remain with fewer nodes. I was also considering developing on Google App Engine - cheap or free hosting to start with, so I can test my assumptions, and also very easy to use (no need for sysadmin work) - but the costs become high with heavier use (3 million games played / month .. x MB each). And the issue with Google is that it locks me into that technology. My other concern is scalability (not only traffic/users, but also adding functionality). My plan is to release a functional site in just 4 weeks (just the basic frontend and a quick basic backend, so I can modify some things and add games manually), and then to grow it and add more things to it. I am planning to take a somewhat different approach than other portals, so I need to write it from scratch (an off-the-shelf script will not do). 2) Will Grails take much more resources than RoR or PHP, server-wise? I heard that building on the Java stack is expensive hardware-wise and is overkill unless you are building a banking application. My application will not be very complex (I hope, and I will try to keep it that way), but it will have a lot of traffic. I also took into account using a CDN for files, but the cheapest CDN I found was 5c/GB (vps.net), while the cost per GB on serverloft (http://www.serverloft.com/dedizierte-server/server-details.php?products=4) is only 1.79 cents/GB and comes with the other resources as well. I am new to this domain (web).
    I have been learning the ropes and searching the web for about half a year, but I don't have any real practical experience, so I know I must have some naive assumptions and other issues I am not yet aware of. Please give me any advice you want regarding anything, not just the specific questions asked. And thank you so much for such a great community!

    Read the article

  • Ruby on Rails language problem: "invalid byte sequence in GBK"

    - by user357203
    This is definitely a language issue; both our code and our database contain Chinese characters.

    This is my environment ("About your application's environment"):

        Ruby version              1.9.1 (i386-mingw32)
        RubyGems version          1.3.5
        Rack version              1.0
        Rails version             2.3.5
        Active Record version     2.3.5
        Active Resource version   2.3.5
        Action Mailer version     2.3.5
        Active Support version    2.3.5
        Application root          C:/path_to_my_root
        Environment               development
        Database adapter          mysql
        Database schema version   20100327010640

    This is what I get at localhost:3000 after starting my Ruby server:

        ArgumentError in HomeController#construction
        invalid byte sequence in GBK
        RAILS_ROOT: C:/path_to_my_root

        Application Trace | Framework Trace | Full Trace
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template_error.rb:43:in `split'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template_error.rb:43:in `source_extract'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template_error.rb:86:in `compute_backtrace'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template_error.rb:11:in `initialize'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template.rb:212:in `new'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template.rb:212:in `rescue in render_template'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/template.rb:205:in `render_template'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/base.rb:265:in `render'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/base.rb:352:in `_render_with_layout'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_view/base.rb:262:in `render'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/base.rb:1250:in `render_for_file'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/base.rb:951:in `render'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/benchmarking.rb:51:in `block in render_with_benchmark'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/activesupport-2.3.5/lib/active_support/core_ext/benchmark.rb:17:in `block in ms'
        C:/Ruby19/lib/ruby/1.9.1/benchmark.rb:309:in `realtime'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/activesupport-2.3.5/lib/active_support/core_ext/benchmark.rb:17:in `ms'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/benchmarking.rb:51:in `render_with_benchmark'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/mime_responds.rb:135:in `block in custom'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/mime_responds.rb:179:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/mime_responds.rb:179:in `block in respond'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/mime_responds.rb:173:in `each'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/mime_responds.rb:173:in `respond'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/mime_responds.rb:107:in `respond_to'
        C:/Users/Howard/Documents/local/vjoin/app/controllers/home_controller.rb:53:in `construction'
        .....
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/methodoverride.rb:24:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/params_parser.rb:15:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/session/cookie_store.rb:93:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/failsafe.rb:26:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/lock.rb:11:in `block in call'
        :8:in `synchronize'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/lock.rb:11:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/dispatcher.rb:114:in `block in call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/reloader.rb:34:in `run'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/actionpack-2.3.5/lib/action_controller/dispatcher.rb:108:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rails-2.3.5/lib/rails/rack/static.rb:31:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/urlmap.rb:46:in `block in call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/urlmap.rb:40:in `each'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/urlmap.rb:40:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rails-2.3.5/lib/rails/rack/log_tailer.rb:17:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/content_length.rb:13:in `call'
        C:/Ruby19/lib/ruby/gems/1.9.1/gems/rack-1.0.1/lib/rack/handler/webrick.rb:50:in `service'
        C:/Ruby19/lib/ruby/1.9.1/webrick/httpserver.rb:111:in `service'
        C:/Ruby19/lib/ruby/1.9.1/webrick/httpserver.rb:70:in `run'
        C:/Ruby19/lib/ruby/1.9.1/webrick/server.rb:183:in `block in start_thread'

        Request Parameters: None
        Response Headers: {"Cache-Control"=>"no-cache", "Content-Type"=>"text/html"}

    What should I do? I tried to search online but didn't find much. The only thing I found was something like putting the following into application_controller:

        before_filter :set_charset, :set_locale

        def set_charset
          response.headers["Content-Type"] = "text/html; charset=utf-8"
          WIN32OLE.codepage = WIN32OLE::CP_UTF8
        end

    but this still doesn't work. I am new to Ruby on Rails, so I don't know much about it. Thanks for your help.

    Read the article

  • Deploying software on compromised machines

    - by Martin
    I've been involved in a discussion about how to build internet voting software for a general election. We've reached a general consensus that there exist plenty of secure methods for two-way authentication and communication. However, someone came along and pointed out that in a general election some of the machines being used are almost certainly going to be compromised. To quote: Let me be an evil electoral fraudster. I want to sample people's votes as they vote and hope I get something scandalous. I hire a botnet from some really shady dudes who control 1000 compromised machines in the UK just for election day. I capture the voting habits of 1000 voters on election day. I notice 5 of them have voted BNP. I look these users up and check out their machines; I look through the documents on their machines and find out their names and addresses. I find out one of them is the wife of a Tory MP. I leak 'wife of tory mp is a fascist!' to some blogger I know. It hits the internet, goes viral, and swings the election. That's a serious problem! So, what are the best techniques for running software where user interactions with the software must be kept secret, on a machine which is possibly compromised?

    Read the article

  • What's your most controversial programming opinion?

    - by Jon Skeet
    This is definitely subjective, but I'd like to try to avoid it becoming argumentative. I think it could be an interesting question if people treat it appropriately. The idea for this question came from the comment thread from my answer to the "What are five things you hate about your favorite language?" question. I contended that classes in C# should be sealed by default - I won't put my reasoning in the question, but I might write a fuller explanation as an answer to this question. I was surprised at the heat of the discussion in the comments (25 comments currently). So, what contentious opinions do you hold? I'd rather avoid the kind of thing which ends up being pretty religious with relatively little basis (e.g. brace placing) but examples might include things like "unit testing isn't actually terribly helpful" or "public fields are okay really". The important thing (to me, anyway) is that you've got reasons behind your opinions. Please present your opinion and reasoning - I would encourage people to vote for opinions which are well-argued and interesting, whether or not you happen to agree with them.

    Read the article

  • Rationale of C# iterator design (compared to C++)

    - by macias
    I found a similar topic: http://stackoverflow.com/questions/56347/iterators-in-c-stl-vs-java-is-there-a-conceptual-difference which basically deals with Java iterators (similar to C#'s) being unable to go backward. So here I would like to focus on limits: in C++ an iterator does not know its limit; you have to compare the given iterator against the limit yourself. In C# an iterator knows more: you can tell whether it is valid without comparing it to any external reference. I prefer the C++ way, because once you have iterators you can set any iterator as the limit. In other words, if you would like to get only a few elements instead of the entire collection, you don't have to alter the iterator (in C++). For me this is more "pure" (clear). But of course MS knew this, and knew C++, while designing C#. So what are the advantages of the C# way? Which approach is more powerful (i.e., leads to more elegant functions built on iterators)? What do I miss? If you have thoughts on C# vs. C++ iterator design other than their limits (boundaries), please answer as well. Note (just in case): please keep the discussion strictly technical. No C++/C# flame war.
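
    A rough sketch of the two styles in Python (used here only because its iterator protocol resembles C#'s IEnumerator; the variable names are made up): a self-terminating iterator gets wrapped to stop early, whereas the C++ style picks a limit and compares against it.

        from itertools import islice

        data = [10, 20, 30, 40, 50]

        # C#/Python style: the iterator itself knows when it is exhausted;
        # to take only a few elements you wrap it rather than choose an "end".
        for x in islice(iter(data), 3):
            print(x)                      # 10, 20, 30

        # C++ style, emulated with indices: the caller carries both the current
        # position and the limit, and any position can serve as the limit.
        current, limit = 0, 3
        while current != limit:
            print(data[current])          # 10, 20, 30
            current += 1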

    Read the article

  • Handling extremely large numbers in a language which can't?

    - by Mallow
    I'm trying to think about how I would go about doing calculations on extremely large numbers (arbitrarily large integers, no floats) when the language itself is incapable of handling numbers larger than a certain value. I am sure I am not the first nor the last to ask this question, but the search terms I am using aren't giving me an algorithm for those situations; most suggestions offer a language change or a variable-type change, or talk about things that seem irrelevant to my search. So I need a little guidance. I would sketch out an algorithm like this: determine the maximum length of the integer type for the language. If a number is more than half that maximum length, split it into an array (to give a little play room). Array order: [0] = the digits furthest to the right, [n-max] = the digits furthest to the left. E.g., for the number 29392023: Array[0]: 23, Array[1]: 20, Array[2]: 39, Array[3]: 29. Since I established half the length of the variable as the cut-off point, I can then line up the ones, tens, hundreds, etc. places via the halfway mark; so if the variable's maximum length is 10 digits (0 to 9999999999), then halving that to five digits gives me some play room. If I add or multiply, I can have a checker function that treats the sixth digit (from the right) of Array[0] as the same place as the first digit (from the right) of Array[1], i.e., the carry rolls over into the next array element. Dividing and subtracting have their own issues which I haven't thought about yet. I would like to know about the best implementations for supporting larger numbers than the language natively can.
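
    A minimal sketch of that scheme (assuming Python, with a pretend 8-digit machine limit so each chunk holds at most 4 digits, half the limit; the helper names are made up for illustration):

        CHUNK_DIGITS = 4
        BASE = 10 ** CHUNK_DIGITS

        def to_chunks(s):
            """Split a decimal string into little-endian chunks: '29392023' -> [2023, 2939]."""
            chunks = []
            while s:
                chunks.append(int(s[-CHUNK_DIGITS:]))
                s = s[:-CHUNK_DIGITS]
            return chunks

        def add(a, b):
            """Add two chunked numbers, propagating the carry between chunks."""
            result, carry = [], 0
            for i in range(max(len(a), len(b))):
                total = (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0) + carry
                result.append(total % BASE)
                carry = total // BASE
            if carry:
                result.append(carry)
            return result

        def to_string(chunks):
            """Join little-endian chunks back into a decimal string."""
            parts = [str(chunks[-1])] + [str(c).zfill(CHUNK_DIGITS) for c in reversed(chunks[:-1])]
            return "".join(parts)

        print(to_string(add(to_chunks("29392023"), to_chunks("99999999"))))  # 129392022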

    Read the article

  • What language and topics should be covered when teaching non-CS college students how to program?

    - by michaelcarrano
    I have been asked by many of my non-computer-science friends to teach them how to program. I have agreed to hold a seminar for them that will last approximately 1 to 2 hours. My thought is to use Python as the language to teach them basic programming skills; from what I have researched, Python is relatively easy to learn. It is also a language I want to learn, which will make holding this seminar all the more enjoyable. The topics I plan to cover are as follows:
    - Variables / arrays
    - Logic - if/else statements, switch/case, nested statements
    - Loops - for, while, do-while, and nested loops
    - Functions - pass by value, pass by reference (are these the correct terms for Python? I am mostly a C/C++ person)
    - Object-oriented programming
    Of course, I plan to have code examples for all topics (see the sketch below for the sort of thing I have in mind), and I will try to have each example flow into the next, so that at the end of the seminar everyone will have a complete working program. I suppose my question is: if you were given 1 to 2 hours to teach a group of college students how to program, what language would you choose and what topics would you cover? Update: Thank you for the great feedback. I should have mentioned above that a majority of the students attending the seminar have some programming experience, whether with Java or with Matlab. Most of them are 3rd/4th-year engineering students who want a refresher on programming before they graduate.
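
    As an illustration only, a minimal sketch (in Python, with made-up data) of a single running example that touches variables, logic, loops, and functions; classes could be layered on at the end:

        def grade(score):
            """Return a letter grade for a numeric score from 0 to 100."""
            if score >= 90:
                return "A"
            elif score >= 75:
                return "B"
            elif score >= 60:
                return "C"
            return "F"

        scores = [95, 72, 58, 81]        # a variable holding a list ("array") of scores
        for s in scores:                 # a loop over the collection
            print(s, "->", grade(s))     # a function call inside the loop body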

    Read the article

  • When can you call yourself good at language X?

    - by SoulBeaver
    This goes back to a conversation I had with my girlfriend. I tried to tell her that I simply don't feel adequate enough in my programming language (C++) to call myself good. She then asked me, "Well, when do you consider yourself good enough?" That's an interesting question, and I didn't know what to tell her. So I'm asking you. For any programming language, framework, or the like, when do you reach a point where you sit back, look at what you've done, and say, "Hey, I'm actually pretty good at this"? How do you define "good" so that you can tell others, honestly, "Yeah, I'm good at X"? Additionally, do you reach these conclusions by comparing yourself with what others can do? Additional info: I have read the canonical paper on how it takes ten thousand hours before you are an expert in the field (props to anybody who remembers what this paper is called). I have also read various articles on Coding Horror about interviewing people. Some people, it was said, "cannot function outside of a framework." So they may be "good" within that framework, but not otherwise in the language. Is this true?

    Read the article

  • As a programmer, what's the most valuable non-English (human) language to learn?

    - by Andrew M
    I was thinking that with my developer skills, learning new languages like French, German, etc. might be easier for me now. I could set up the verbs as objects in Python and use dir(verb) to find their methods, tenses and stuff ;-) But seriously, if you're a professional developer, in my case in the UK, what's the best foreign language to learn from an employment perspective? I'm thinking along these lines:
    - Hindi - if all our programming jobs are getting outsourced to India, you might as well position yourself to be the on-site go-between guy.
    - Mandarin - if China becomes the pre-eminent economy, the new USA, in ten or twenty years, then speaking the language would open up a huge market to you.
    - Russian - maybe another major up-and-comer, but already closer to Western standards. More IT-sector growth here than anywhere else in the coming years?
    - Japanese - drivers of global technology; being able to speak the language could give you a big competitive advantage over other Westerners.
    But I'm just guessing/musing with all these points. If you have an opinion, or even better, some evidence, I'd like to hear it. If the programming thing falls through, then at least it'll make for more interesting holidays.

    Read the article

  • What is an appropriate language for expressing initial stages of algorithm refinement?

    - by hydroparadise
    First, this is not a homework assignment, but you can treat it as such ;). I found the following question in the published paper The Camel Has Two Humps. I was not a CS major in college (I majored in MIS/Management), but I have a job where I find myself coding quite often. For a non-trivial programming problem, which one of the following is an appropriate language for expressing the initial stages of algorithm refinement?
    (a) A high-level programming language.
    (b) English.
    (c) Byte code.
    (d) The native machine code for the processor on which the program will run.
    (e) Structured English (pseudocode).
    What I do know is that you usually want to start your design by writing down pseudocode and then moving to the desired technology (because we all do that, right?). But I never thought about it in terms of refinement. I mean, if you were the original designer, then you might have access to the original pseudocode. But realistically, when I have to maintain/refactor/refine somebody else's code, I just keep trucking with the language it currently resides in. Does anybody have a definitive answer to this? As a side note, I only did a quick scan of the paper and haven't read every single detail. It presents various score statistics, but I can't find where the answers are given in the paper.
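
    For what "refinement" can look like in practice, a tiny illustration (target language Python, chosen arbitrarily): the structured-English sketch is kept as comments and translated line by line.

        # Structured English (pseudocode):
        #   set best to the first item
        #   for each remaining item:
        #       if it is larger than best, remember it as best
        #   report best
        def largest(items):
            best = items[0]
            for item in items[1:]:
                if item > best:
                    best = item
            return best

        print(largest([3, 17, 5, 11]))   # 17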

    Read the article

  • What is the best objective way to measure language popularity trends? (What's better than TIOBE?)

    - by Eric Wilson
    The best way I know of to get data on computer-language popularity is the TIOBE index. But everyone knows that TIOBE is hopelessly flawed. (If someone provides a link to support this, I'll add it here.) So is there any data on programming-language popularity that is generally considered meaningful? The only other option I know of is to look at the trends at indeed.com, which is inherently flawed, being based on job postings. It isn't as if I would make a future language decision based solely on an index, but it might provide a useful balance to the skewed perspective one obtains by talking to one's friends and colleagues. To illustrate that bias, I'll point out that, based on the experience of those I personally know, the only languages used professionally today (in order of popularity) are Java, C#, Groovy, JavaScript, Ruby, Objective-C, and Perl. (Though it is evident that C, C++, and PHP were used in the past.) So my question is: everyone bashes TIOBE, but is there anything else? If so, can anyone explain how we know the alternative has better methodology? Thanks.

    Read the article

  • Converting EBNF to BNF

    - by Vivin Paliath
    It's been a few years since my computer-language class, and so I've forgotten the finer points of BNFs and EBNFs, and I don't have a textbook next to me. Specifically, I've forgotten how to convert an EBNF into BNF. From what little I remember, I know that one of the main points is to convert { term } into something like <term> | <many-terms>. But I don't remember the other rules. I've tried to look this up online, but I can only find links to either homework questions or a small comment about converting terms with curly braces. I can't find an exhaustive list of rules that defines the translation.
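
    For reference, the usual rewriting introduces a fresh nonterminal for each EBNF repetition { ... } or option [ ... ] and expresses it with recursion and an empty alternative; a small sketch (the nonterminal names here are invented for illustration, and <empty> denotes the empty production, often written ε):

        EBNF:  <args> ::= <arg> { "," <arg> }
        BNF:   <args>       ::= <arg> <more-args>
               <more-args>  ::= <empty> | "," <arg> <more-args>

        EBNF:  <decl> ::= [ "static" ] <type> <name>
        BNF:   <decl>        ::= <opt-static> <type> <name>
               <opt-static>  ::= <empty> | "static"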

    Read the article
