Search Results

Search found 8703 results on 349 pages for 'cache money'.


  • who has the best online prep MCSE study materials?

    - by phill
    I'm studying for my Microsoft Certified Systems Engineer (MCSE) certification on my own and was wondering whose study materials you could recommend. I did shell out a bunch of money for a local class, and when it came to taking the test, there were a slew of topics that weren't even covered. For example, in my 070-293 test, the class didn't even touch topics such as certificates, how to set up clusters, SQL Server clustering, etc. I realize there are a bunch of options out there, such as CBT Nuggets, PrepLogic, etc. Which online prep best prepares you for the tests, before I spend any more money on this stuff? Thanks in advance.

  • NGINX - CORS error affecting only Firefox

    - by wiherek
    This is an issue with Nginx that affects only Firefox. I have this config (also at http://pastebin.com/q6Yeqxv9):

        upstream connect {
            server 127.0.0.1:8080;
        }
        server {
            server_name admin.example.com www.admin.example.com;
            listen 80;
            return 301 https://admin.example.com$request_uri;
        }
        server {
            listen 80;
            server_name ankieta.example.com www.ankieta.example.com;
            add_header Access-Control-Allow-Origin $http_origin;
            add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS, PUT, PATCH, DELETE';
            add_header 'Access-Control-Allow-Credentials' 'true';
            add_header 'Access-Control-Allow-Headers' 'Access-Control-Request-Method,Access-Control-Request-Headers,Cache,Pragma,Authorization,Accept,Accept-Encoding,Accept-Language,Host,Referer,Content-Length,Origin,DNT,X-Mx-ReqToken,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type';
            return 301 https://ankieta.example.com$request_uri;
        }
        server {
            server_name admin.example.com;
            listen 443 ssl;
            ssl_certificate /srv/ssl/14182263.pem;
            ssl_certificate_key /srv/ssl/admin_i_ankieta.example.com.key;
            ssl_protocols SSLv3 TLSv1;
            ssl_ciphers ALL:!aNULL:!ADH:!eNULL:!LOW:!EXP:RC4+RSA:+HIGH:+MEDIUM;
            location / {
                proxy_pass http://connect;
            }
        }
        server {
            server_name ankieta.example.com;
            listen 443 ssl;
            ssl_certificate /srv/ssl/14182263.pem;
            ssl_certificate_key /srv/ssl/admin_i_ankieta.example.com.key;
            ssl_protocols SSLv3 TLSv1;
            ssl_ciphers ALL:!aNULL:!ADH:!eNULL:!LOW:!EXP:RC4+RSA:+HIGH:+MEDIUM;
            root /srv/limesurvey;
            index index.php;
            add_header 'Access-Control-Allow-Origin' $http_origin;
            add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS, PUT, PATCH, DELETE';
            add_header 'Access-Control-Allow-Credentials' 'true';
            add_header 'Access-Control-Allow-Headers' 'Access-Control-Request-Method,Access-Control-Request-Headers,Cache,Pragma,Authorization,Accept,Accept-Encoding,Accept-Language,Host,Referer,Content-Length,Origin,DNT,X-Mx-ReqToken,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type';
            client_max_body_size 4M;
            location / {
                try_files $uri $uri/ /index.php?q=$uri&$args;
            }
            location ~ /*.php$ {
                fastcgi_split_path_info ^(.+\.php)(/.+)$;
                # NOTE: You should have "cgi.fix_pathinfo = 0;" in php.ini
                include fastcgi_params;
                fastcgi_param SCRIPT_FILENAME /srv/limesurvey$fastcgi_script_name;
                # fastcgi_param HTTPS $https;
                fastcgi_intercept_errors on;
                fastcgi_pass 127.0.0.1:9000;
            }
            location ~* \.(js|css|png|jpg|jpeg|gif|ico)$ {
                expires max;
                log_not_found off;
            }
        }

    This is basically an AngularJS app and a PHP app (LimeSurvey), served under two different domains by the same webserver (Nginx). AngularJS is in fact served by ConnectJS, which is proxied to by Nginx (ConnectJS listens only on localhost). In the Firefox console I get this:

        Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://ankieta.example.com/admin/remotecontrol. This can be fixed by moving the resource to the same domain or enabling CORS.

    which of course is annoying. Other browsers (Chrome, IE) work fine. Any suggestions on this?
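
    One thing worth ruling out (a hedged sketch, not a confirmed fix): the CORS algorithm treats a redirect on a preflight as an error, and browsers have differed in how strictly they enforce that, so if the OPTIONS preflight for /admin/remotecontrol ever lands on one of the port-80 blocks that return 301, the whole request fails. Answering the preflight directly inside the HTTPS ankieta.example.com server block rules that out; the shortened header list below is illustrative only:

        location /admin/remotecontrol {
            if ($request_method = OPTIONS) {
                add_header 'Access-Control-Allow-Origin' $http_origin;
                add_header 'Access-Control-Allow-Credentials' 'true';
                add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS, PUT, PATCH, DELETE';
                add_header 'Access-Control-Allow-Headers' 'Authorization,Content-Type,X-Requested-With';
                # A preflight must receive a direct 2xx response, never a redirect.
                return 204;
            }
            try_files $uri $uri/ /index.php?q=$uri&$args;
        }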

  • centos6.3 varnish3.03 get the wrong backend

    - by Sola.Shawn
    I installed Varnish 3.0.3 with yum, and I ran into a problem with it. My Varnish config is below:

        backend weibo {
            .host = "192.168.1.178";
            .port = "8080";
            .connect_timeout = 20s;
            .first_byte_timeout = 20s;
            .between_bytes_timeout = 20s;
        }

        backend smth {
            .host = "192.168.1.115";
            .port = "8080";
            .connect_timeout = 20s;
            .first_byte_timeout = 20s;
            .between_bytes_timeout = 20s;
        }

        sub vcl_recv {
            if (req.restarts == 0) {
                if (req.http.x-forwarded-for) {
                    set req.http.X-Forwarded-For = req.http.X-Forwarded-For + ", " + client.ip;
                } else {
                    set req.http.X-Forwarded-For = client.ip;
                }
            }
            if (req.request != "GET" && req.request != "HEAD" &&
                req.request != "PUT" && req.request != "POST" &&
                req.request != "TRACE" && req.request != "OPTIONS" &&
                req.request != "DELETE") {
                /* Non-RFC2616 or CONNECT which is weird. */
                return (pipe);
            }
            if (req.request != "GET" && req.request != "HEAD") {
                /* We only deal with GET and HEAD by default */
                return (pass);
            }
            if (req.http.Authorization || req.http.Cookie) {
                /* Not cacheable by default */
                return (pass);
            }
            if (req.http.host ~ "^(hk.)?weibo.com") {
                set req.http.host = "hk.weibo.com";
                set req.backend = weibo;
            } elseif (req.http.host ~ "^(www.)?newsmth.net") {
                set req.http.host = "www.newsmth.net";
                set req.backend = smth;
            } else {
                error 404 "Unknown virtual host";
            }
            return (lookup);
        }

        sub vcl_pipe { return (pipe); }
        sub vcl_pass { return (pass); }

        sub vcl_hash {
            hash_data(req.url);
            if (req.http.host) {
                hash_data(req.http.host);
            } else {
                hash_data(server.ip);
            }
            return (hash);
        }

        sub vcl_hit {
            if (req.http.Cache-Control ~ "no-cache" || req.http.Cache-Control ~ "max-age=0" || req.http.Pragma ~ "no-cache") {
                set obj.ttl = 0s;
                return (restart);
            }
            return (deliver);
        }

        sub vcl_miss { return (fetch); }

        sub vcl_fetch {
            if (beresp.ttl <= 120s || beresp.http.Set-Cookie || beresp.http.Vary == "*") {
                /* Mark as "Hit-For-Pass" for the next 2 minutes */
                set beresp.ttl = 10s;
                return (hit_for_pass);
            }
            return (deliver);
        }

        sub vcl_deliver { return (deliver); }
        sub vcl_init { return (ok); }
        sub vcl_fini { return (ok); }

    My Win7 hosts file has these entries added:

        192.168.1.178 www.newsmth.net
        192.168.1.178 hk.weibo.com

    I start Varnish with:

        varnishd -f /etc/varnish/dd.vcl -s malloc,100M -a 0.0.0.0:8000 -T 0.0.0.0:3500

    When I access hk.weibo.com:8000 it works fine, and I get: "Hello, I am hk.weibo.com!" But when I access http://www.newsmth.net:8000/, I also get: "Hello, I am hk.weibo.com!" My question is: why isn't it "Hello, I am www.newsmth.net!"? Varnish fetched the content from the wrong backend. Does anyone know how to fix this?
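
    Not an answer, but a debugging sketch that may help isolate it: since varnishd listens on :8000, the browser actually sends "Host: www.newsmth.net:8000". The prefix regexes above still match that, but normalizing the port away removes one variable, and varnishlog can confirm which backend each fetch really uses. Both commands are standard Varnish 3 tooling:

        # In vcl_recv, before the host matching:
        set req.http.host = regsub(req.http.host, ":[0-9]+$", "");

        # From a shell: BackendOpen records name the backend chosen per fetch,
        # and RxHeader shows the Host header exactly as Varnish received it.
        varnishlog -i BackendOpen
        varnishlog -c -i RxHeader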

  • (Preferably) Encrypted Server Backups

    - by Shoaibi
    I have somehow managed to purchase a VPS after saving money for some time; now the problem is I can't find a way to back up the server. My previous approach was: got a WebDAV account from mydisk.se, mounted it on the VPS, and used duplicity to create encrypted backups. The problem is it was only 2G, and it's running out of space. At my own place I don't have a stable internet connection, otherwise I have a 500G drive that I could surely use for backups. The VPS has a 12G HD, and I would like to back up /home, /root, /etc, and /var (especially log and www). Any ideas are welcome. [EDIT] I am looking more for resources on setting up a backup point or the like (I know how to set up a backup server, but I can't, as I don't have a stable connection or the money to buy another VPS/disk for backup); I already have the tools needed.
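
    For reference, the duplicity approach above carries over to any target duplicity supports; a minimal sketch, assuming an SSH-reachable destination (the host name, GPG key ID, and paths are hypothetical placeholders):

        # GPG-encrypted, incremental backup of just the listed directories.
        duplicity --encrypt-key ABCD1234 \
          --include /etc --include /home --include /root \
          --include /var/log --include /var/www \
          --exclude '**' / scp://user@backup-host//backups/vps

        # Occasionally verify the archive and prune old chains:
        duplicity verify scp://user@backup-host//backups/vps /
        duplicity remove-older-than 30D --force scp://user@backup-host//backups/vps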

  • I hyperlinked a cell in excel 2003, formula issues?

    - by joseinsomniac
    I have a budget spreadsheet in Excel 2003. I have my deposit, then all of my bills, the total, then a cell that holds the difference (between the amount of the deposit and the total of the bills). The difference cell's numbers turn red when I don't have enough money (deposit vs. bill total). I hyperlinked the difference cell to a checkbook register spreadsheet so I can track where all my extra money went (I reconcile receipts daily). When hyperlinked, the numbers are blue. I need the numbers to stay black (when above 0.00) and stay red (when below 0.00), and not change after the link has been clicked. Also, if the link has not been clicked and the numbers are red, the font is smaller, even though the toolbar shows the font size hasn't changed. After I click on it and go back to the budget sheet, it's the size it should be. Any ideas? Thanks!
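
    A guess at what is happening, with a sketch of one workaround: inserting a hyperlink applies the workbook's built-in Hyperlink and Followed Hyperlink cell styles, which carry their own blue/purple color and font settings; in Excel 2003 those styles can be edited via Format > Style. The black/red split itself can be pinned with a custom number format (Format > Cells > Number > Custom):

        0.00;[Red]-0.00

    The section before the semicolon formats positive values (default black); the section after it shows negative values in red.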

  • Backing up a Linux VPS with RSync to Vista

    - by Frank
    I've been working on setting up a Linux VPS to host a couple of WordPress sites and eventually a Mercurial server. I've set up one site and things have gone well. However, before I start moving other things to the VPS, I need to set up a backup solution. My provider, Linode, suggests rsync (among a couple of other options) for backups. I've seen a few posts on this site that suggest other backup solutions, including going to the Amazon cloud, but that costs money, and the VPS is all the money I want to spend on this for the time being. So, to help solve that, I want my backup computer to be my home desktop. Assuming I'm using rsync, is it possible to use my Vista-based home computer as the destination for the backup? And if it is possible, what type of command or connection would I need to configure on the Vista machine? Any insight would be helpful. It's probably obvious, but I've never used rsync.
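
    For what it's worth, one common pattern (a sketch, not Vista-specific advice): install one of the Windows rsync ports (cwRsync or DeltaCopy), run it as an rsync daemon on the desktop, and push from the VPS on a schedule. The module name, paths, and host name below are hypothetical:

        # rsyncd.conf on the Vista machine (served by the cwRsync/DeltaCopy service):
        [vpsbackup]
            path = /cygdrive/c/backups/vps
            read only = false

        # On the VPS (e.g. from cron), push to that module; a dynamic home IP
        # needs a dynamic-DNS name plus a router port-forward for TCP 873:
        rsync -avz --delete /etc /home /var/www user@home-desktop::vpsbackup/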

  • MySQL reserves too much RAM

    - by Buddy
    I have a cheap VPS with 128MB RAM and 256MB burst. MySQL starts and reserves about 110MB, but uses no more than 20MB of it. My VPS control panel shows that I am using 127MB (I am also running nginx and sphinx). I know that it shows reserved RAM, but when I go over 128MB, my VPS reboots automatically every 4 hours. So I want to force MySQL to reserve less RAM. How can I do that? I did some tweaks in my.cnf, but they didn't help much. top output:

        PID   USER   PR  NI  VIRT   RES   SHR  S %CPU %MEM  TIME+    COMMAND
        1     root   15   0   2156   668   572 S  0.0  0.3  0:00.03  init
        11311 root   15   0  11212   356   228 S  0.0  0.1  0:00.00  vzctl
        11312 root   18   0   3712  1484  1248 S  0.0  0.6  0:00.01  bash
        11347 root   18   0   2284   916   732 R  0.0  0.3  0:00.00  top
        13978 root   17  -4   2248   552   344 S  0.0  0.2  0:00.00  udevd
        14262 root   15   0   1812   564   472 S  0.0  0.2  0:00.03  syslogd
        14293 sphinx 15   0  11816  1172   672 S  0.0  0.4  0:00.07  searchd
        14305 root   25   0   7192  1036   636 S  0.0  0.4  0:00.00  sshd
        14321 root   25   0   2832   836   668 S  0.0  0.3  0:00.00  xinetd
        15389 root   18   0   3708  1300  1132 S  0.0  0.5  0:00.00  mysqld_safe
        15441 mysql  15   0   113m   16m  4440 S  0.0  6.4  0:00.15  mysqld
        15489 root   21   0  13056  1456   340 S  0.0  0.6  0:00.00  nginx
        15490 nginx  18   0  13328  2388   992 S  0.0  0.9  0:00.06  nginx
        15507 nginx  25   0  19520  5888  4244 S  0.0  2.2  0:00.00  php-cgi
        15508 nginx  18   0  19636  4876  2748 S  0.0  1.9  0:00.12  php-cgi
        15509 nginx  15   0  19668  4872  2716 S  0.0  1.9  0:00.11  php-cgi
        15518 root   18   0   4492  1116   568 S  0.0  0.4  0:00.01  crond

    MySQL tuner output:

        >> MySQLTuner 1.0.1 - Major Hayden <[email protected]>
        >> Bug reports, feature requests, and downloads at http://mysqltuner.com/
        >> Run with '--help' for additional options and output filtering
        Please enter your MySQL administrative login: root
        Please enter your MySQL administrative password:

        -------- General Statistics --------------------------------------------------
        [--] Skipped version check for MySQLTuner script
        [OK] Currently running supported MySQL version 5.0.77
        [OK] Operating on 32-bit architecture with less than 2GB RAM

        -------- Storage Engine Statistics -------------------------------------------
        [--] Status: -Archive -BDB -Federated +InnoDB -ISAM -NDBCluster
        [--] Data in InnoDB tables: 1M (Tables: 1)
        [OK] Total fragmented tables: 0

        -------- Performance Metrics -------------------------------------------------
        [--] Up for: 38m 43s (37 q [0.016 qps], 20 conn, TX: 4M, RX: 3K)
        [--] Reads / Writes: 100% / 0%
        [--] Total buffers: 28.1M global + 832.0K per thread (100 max threads)
        [OK] Maximum possible memory usage: 109.4M (42% of installed RAM)
        [OK] Slow queries: 0% (0/37)
        [OK] Highest usage of available connections: 1% (1/100)
        [OK] Key buffer size / total MyISAM indexes: 128.0K/64.0K
        [OK] Query cache efficiency: 42.1% (8 cached / 19 selects)
        [OK] Query cache prunes per day: 0
        [!!] Temporary tables created on disk: 27% (3 on disk / 11 total)
        [!!] Thread cache is disabled
        [OK] Table cache hit rate: 57% (8 open / 14 opened)
        [OK] Open file limit used: 1% (12/1K)
        [OK] Table locks acquired immediately: 100% (22 immediate / 22 locks)
        [!!] Connections aborted: 10%
        [OK] InnoDB data size / buffer pool: 1.5M/8.0M

        -------- Recommendations -----------------------------------------------------
        General recommendations:
            MySQL started within last 24 hours - recommendations may be inaccurate
            Enable the slow query log to troubleshoot bad queries
            When making adjustments, make tmp_table_size/max_heap_table_size equal
            Reduce your SELECT DISTINCT queries without LIMIT clauses
            Set thread_cache_size to 4 as a starting value
            Your applications are not closing MySQL connections properly
        Variables to adjust:
            tmp_table_size (> 32M)
            max_heap_table_size (> 16M)
            thread_cache_size (start at 4)

    I think if I do what MySQLTuner says, MySQL will use more RAM.
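
    For reference, the usual lever on a box this small is the global buffers, most of which belong to InnoDB; a minimal low-memory sketch for a MySQL 5.0-era my.cnf (the values are illustrative starting points, not tuned for this workload):

        [mysqld]
        # Global buffers dominate the ~110MB reservation
        innodb_buffer_pool_size = 8M
        innodb_additional_mem_pool_size = 1M
        innodb_log_buffer_size = 1M
        key_buffer_size = 4M
        query_cache_size = 4M
        # Per-thread buffers multiply by max_connections
        max_connections = 20
        thread_stack = 128K
        sort_buffer_size = 256K
        read_buffer_size = 128K
        read_rnd_buffer_size = 256K
        # If the single InnoDB table can be moved to MyISAM,
        # disabling InnoDB frees the most memory on 5.0:
        # skip-innodb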

  • Compute number of occurrences in a column of a spreadsheet

    - by wnstnsmth
    I have a Google Drive spreadsheet with a single column that holds string values (Twitter screen names) such as "user1", "user1", "UserX", and I would like to count those values so that I can easily craft a bar chart out of it. The result should be:

        value    occurrence
        -----------------------
        user1    2
        UserX    1
        ...      ...

    Please note, I only want to match whole values, not partial words. E.g., the words 'on' and 'one' appear inside the word 'money'; I would not count those (only the whole word 'money' is counted). Hope that is clear enough. What formula should I use?
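
    A sketch of one way to do this, assuming the screen names are in column A (the cell references are illustrative). UNIQUE in B1 lists each distinct name, and COUNTIF, filled down column C, counts it; COUNTIF compares whole cell values, so 'on' inside 'money' is never matched:

        =UNIQUE(A:A)         (in B1)
        =COUNTIF(A:A, B1)    (in C1, filled down alongside column B)

    A single-formula alternative is =QUERY(A:A, "select A, count(A) where A is not null group by A label count(A) 'occurrence'").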

  • Generating and collaborating on network map diagrams?

    - by Ian C.
    I have to turn out some network topology maps for very large networks. I'd like the format for the maps to be something other people can also edit and contribute to regardless of what software I'm using on my Mac to build them. I don't mind spending money on my end for software, but I can't require that my clients spend any money. I also can't promise my clients are also using OS X -- they could be running Linux or Windows. Is there a best software application on OS X for producing maps that I can share with other, non-OS X, users? Is there a best format for sharing topology maps that I should use when exporting the maps to disk?

  • need assistance with my.cnf - 1500% CPU usage

    - by Alan Long
    I'm running into a few issues with our new database server. It is an HP G8 with 2 Intel Xeon E5-2650 processors and 32GB of RAM. This server is dedicated as a MySQL server (5.1.69) for our intranet portal. I have been having issues with this server staying alive - I notice high CPU usage during certain times of day (8% ~ 1500%+) and very low memory usage (7 ~ 15%) based on the 'top' command. When the CPU usage passes 1000%, that is usually when the app dies. I'm trying to see what I'm doing wrong with the config file; hopefully one of the experts can chime in and let me know what they think. See below for my my.cnf file:

        [mysqld]
        default-storage-engine=InnoDB
        datadir=/var/lib/mysql
        socket=/var/lib/mysql/mysql.sock
        #user=mysql
        large-pages
        # Disabling symbolic-links is recommended to prevent assorted security risks
        symbolic-links=0
        max_connections=275
        tmp_table_size=1G
        key_buffer_size=384M
        key_buffer=384M
        thread_cache_size=1024
        long_query_time=5
        low_priority_updates=1
        max_heap_table_size=1G
        myisam_sort_buffer_size=8M
        concurrent_insert=2
        table_cache=1024
        sort_buffer_size=8M
        read_buffer_size=5M
        read_rnd_buffer_size=6M
        join_buffer_size=16M
        table_definition_cache=6k
        open_files_limit=8k
        slow_query_log
        #skip-name-resolve

        # Innodb Settings
        innodb_buffer_pool_size=18G
        innodb_thread_concurrency=0
        innodb_log_file_size=1G
        innodb_log_buffer_size=16M
        innodb_flush_log_at_trx_commit=2
        innodb_lock_wait_timeout=50
        innodb_file_per_table
        #innodb_buffer_pool_instances=4
        # eliminating double buffering
        innodb_flush_method = O_DIRECT
        flush_time=86400
        innodb_additional_mem_pool_size=40M
        #innodb_io_capacity = 5000
        #innodb_read_io_threads = 64
        #innodb_write_io_threads = 64

        # increase until threads_created doesnt grow anymore
        thread_cache=1024
        query_cache_type=1
        query_cache_limit=4M
        query_cache_size=256M

        # Try number of CPU's*2 for thread_concurrency
        thread_concurrency = 0
        wait_timeout = 1800
        connect_timeout = 10
        interactive_timeout = 60

        [mysqldump]
        max_allowed_packet=32M

        [mysqld_safe]
        log-error=/var/log/mysqld.log
        pid-file=/var/run/mysqld/mysqld.pid
        log-slow-queries=/var/log/mysql/slow-queries.log
        long_query_time = 1
        log-queries-not-using-indexes

    We connect to one database with 75 tables; the largest table has 1,150,000 rows and the second largest has 128,036. I have also verified that our PHP queries are optimized as well as possible. Reference - MySQLTuner:

        >> MySQLTuner 1.2.0 - Major Hayden <[email protected]>
        >> Bug reports, feature requests, and downloads at http://mysqltuner.com/
        >> Run with '--help' for additional options and output filtering

        -------- General Statistics --------------------------------------------------
        [--] Skipped version check for MySQLTuner script
        [OK] Currently running supported MySQL version 5.1.69-log
        [OK] Operating on 64-bit architecture

        -------- Storage Engine Statistics -------------------------------------------
        [--] Status: -Archive -BDB -Federated +InnoDB -ISAM -NDBCluster
        [--] Data in InnoDB tables: 420M (Tables: 75)
        [!!] Total fragmented tables: 75

        -------- Security Recommendations --------------------------------------------
        [!!] User '[email protected]' has no password set.

        -------- Performance Metrics -------------------------------------------------
        [--] Up for: 1h 14m 50s (8M q [1K qps], 705 conn, TX: 6B, RX: 892M)
        [--] Reads / Writes: 68% / 32%
        [--] Total buffers: 19.7G global + 35.2M per thread (275 max threads)
        [!!] Maximum possible memory usage: 29.1G (93% of installed RAM)
        [OK] Slow queries: 0% (472/8M)
        [OK] Highest usage of available connections: 66% (183/275)
        [OK] Key buffer size / total MyISAM indexes: 384.0M/91.0K
        [OK] Key buffer hit rate: 100.0% (173 cached / 0 reads)
        [OK] Query cache efficiency: 96.2% (7M cached / 7M selects)
        [!!] Query cache prunes per day: 553614
        [OK] Sorts requiring temporary tables: 0% (3 temp sorts / 1K sorts)
        [!!] Temporary tables created on disk: 49% (3K on disk / 7K total)
        [OK] Thread cache hit rate: 74% (183 created / 705 connections)
        [OK] Table cache hit rate: 97% (231 open / 238 opened)
        [OK] Open file limit used: 0% (17/8K)
        [OK] Table locks acquired immediately: 100% (432K immediate / 432K locks)
        [OK] InnoDB data size / buffer pool: 420.9M/18.0G

        -------- Recommendations -----------------------------------------------------
        General recommendations:
            Run OPTIMIZE TABLE to defragment tables for better performance
            MySQL started within last 24 hours - recommendations may be inaccurate
            Reduce your overall MySQL memory footprint for system stability
            Increasing the query_cache size over 128M may reduce performance
            Temporary table size is already large - reduce result set size
            Reduce your SELECT DISTINCT queries without LIMIT clauses
        Variables to adjust:
            *** MySQL's maximum memory usage is dangerously high ***
            *** Add RAM before increasing MySQL buffer variables ***
            query_cache_size (> 256M) [see warning above]

    Thanks in advance for your help!
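
    As a sanity check on the tuner's memory warning, the arithmetic follows from the settings above: the per-thread figure is roughly sort_buffer_size + read_buffer_size + read_rnd_buffer_size + join_buffer_size, and the worst case multiplies that by max_connections:

        per-thread buffers: 8M + 5M + 6M + 16M ≈ 35.2M
        worst case: 19.7G global + (275 connections x 35.2M) ≈ 29.1G (93% of installed RAM)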

  • Get lots of javascript problems when using Opera 11.00 to surf

    - by s hanley
    Sites like eBay and even Super User stop working properly when I use Opera 11.00. Menus stop working everywhere from eBay to GoDaddy: hovering over a menu item doesn't expand it, and no submenu slides out. This makes a large number of very popular websites unusable. Am I right in assuming this is a JavaScript issue? I use Opera for the Turbo feature (I have tested Opera with and without Turbo, so it's not Turbo's fault) because I'm on mobile broadband until I get my phone line sorted out. Turbo helps me save money, as well as allowing me to surf at a sane speed. Is there a Firefox or Chrome equivalent to Opera Turbo that doesn't cost money? I'm using Opera 11.00, build 1156.

  • Domain registration and DNS, what am I actually paying for? [on hold]

    - by jozxyqk
    Long story short, I'm quite confused as to exactly what is offered by domain registration and DNS service sites. When I go to the URL "http://google.com", my PC connects to a name server and gets the IP for "google.com", then connects to the IP and says: give me the page for "http://google.com". AFAIK there are many name servers and they all cache these bits of information in some hierarchical network, but ultimately a DNS record must come from a single source (not sure what this is called). There are different kinds of records; a record might hold not an IP but an alias/redirect to other records, for example. Let's say I want my own domain name for some server. Maybe it even has a static IP, but I want a nicer name for people to remember; or my ISP assigns dynamic IPs and I want a URL that always works; or my website is hosted on a shared machine, so the browser needs to send "http://mydnsname.com" to the webserver to distinguish my requests from requests to the same IP for different sites. Registering a domain costs a small amount of money per year. Where does this money go (not that I'm complaining)? Is that really all it costs to maintain the entire DNS system of name servers? If I just register the domain and nothing else, what do I get? Is that just reserving a name, or hosting WHOIS information, or have I paid for a DNS record to be hosted? Can a domain alone have a record, such as an IP, or be an alias to another? A bunch of sites out there offer other services in addition to domain registration (I'm assuming they register the domain through another party for me). One example is "dynamic DNS" (DDNS), but isn't this just a regular DNS record that's updated regularly? Does it cost extra to update more often? Without DDNS, can a DNS record still point to an IP? I've also seen the term "managed DNS" and have no idea where that fits in.
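
    A sketch of how to watch the moving parts yourself with the standard dig tool (example.com is a placeholder; any domain works):

        # Follow the delegation chain from the root servers down to the
        # "single source" - the authoritative name server for the zone:
        dig +trace example.com

        # Show just the final answer, with its record type and cache TTL:
        dig example.com A +noall +answer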

  • Copyrights concerning code snippets and larger amounts of code

    - by JustcallmeDrago
    I am designing a public code repository. Users will be allowed to post and edit whatever amount of code they want, from code snippets to entire multi-file projects. I have a few major legal concerns about this:

    Not getting sued/shut down - I feel the site would be a much easier target than tracking down an individual user to sue. I have looked around a bit and see that links to legal info in the footer of each page are common. What specific things should I do - and what does a site such as YouTube (where I see copyrighted material all the time) do - for protection?

    Citing sources and editing sourced code - If a user wants to post code that isn't theirs, what concerns/safeguards should I have? Will a link suffice, and what do I need beyond that to allow the code to be edited (to improve it, for example)? What can happen if a user posts copyrighted code without citing it?

    Large chunks of code - What legal differences should I look out for as the amount of code grows?

    Not having a mess of licenses for the site - I would like to have a single license (like Rosetta Code) that keeps interaction on the site simple. I want the code to be postable and editable. I have looked into Stack Overflow's Creative Commons license a little, and it says: "If you alter, transform, or build upon this work, you may distribute the resulting work only under the same or similar license to this one." And on Rosetta Code: "All software found on Rosetta Code should be considered potentially hazardous. Use at your own risk. Be aware that all code on Rosetta Code is under the GNU Free Documentation License, as are any edits made by contributors. See Rosetta Code:Copyrights for details." What other licenses are like this?

    Commercializing the site - In what ways can I and can't I make money off of a site that contains code like this? All code will be publicly visible. Initial thoughts are having ads or charging for advanced features.

  • Understanding LINQ to SQL (11) Performance

    - by Dixin
    [LINQ via C# series] LINQ to SQL has a lot of great features like strong typing, query compilation, deferred execution, declarative paradigm, etc., which are very productive. Of course, these cannot be free, and one price is performance.

    O/R mapping overhead

    Because LINQ to SQL is based on O/R mapping, one obvious overhead is that data changing usually requires data retrieving:

        private static void UpdateProductUnitPrice(int id, decimal unitPrice)
        {
            using (NorthwindDataContext database = new NorthwindDataContext())
            {
                Product product = database.Products.Single(item => item.ProductID == id); // SELECT...
                product.UnitPrice = unitPrice; // UPDATE...
                database.SubmitChanges();
            }
        }

    Before updating an entity, that entity has to be retrieved by an extra SELECT query. This is slower than direct data update via ADO.NET:

        private static void UpdateProductUnitPrice(int id, decimal unitPrice)
        {
            using (SqlConnection connection = new SqlConnection(
                "Data Source=localhost;Initial Catalog=Northwind;Integrated Security=True"))
            using (SqlCommand command = new SqlCommand(
                @"UPDATE [dbo].[Products] SET [UnitPrice] = @UnitPrice WHERE [ProductID] = @ProductID",
                connection))
            {
                command.Parameters.Add("@ProductID", SqlDbType.Int).Value = id;
                command.Parameters.Add("@UnitPrice", SqlDbType.Money).Value = unitPrice;
                connection.Open();
                command.Transaction = connection.BeginTransaction();
                command.ExecuteNonQuery(); // UPDATE...
                command.Transaction.Commit();
            }
        }

    The above imperative code specifies the "how to do" details with better performance. For the same reason, some articles from the Internet insist that, when updating data via LINQ to SQL, the above declarative code should be replaced by:

        private static void UpdateProductUnitPrice(int id, decimal unitPrice)
        {
            using (NorthwindDataContext database = new NorthwindDataContext())
            {
                database.ExecuteCommand(
                    "UPDATE [dbo].[Products] SET [UnitPrice] = {0} WHERE [ProductID] = {1}",
                    unitPrice, id);
            }
        }

    Or just create a stored procedure:

        CREATE PROCEDURE [dbo].[UpdateProductUnitPrice]
        (
            @ProductID INT,
            @UnitPrice MONEY
        )
        AS
        BEGIN
            BEGIN TRANSACTION
            UPDATE [dbo].[Products]
            SET [UnitPrice] = @UnitPrice
            WHERE [ProductID] = @ProductID
            COMMIT TRANSACTION
        END

    and map it as a method of NorthwindDataContext (explained in this post):

        private static void UpdateProductUnitPrice(int id, decimal unitPrice)
        {
            using (NorthwindDataContext database = new NorthwindDataContext())
            {
                database.UpdateProductUnitPrice(id, unitPrice);
            }
        }

    As a normal trade-off for O/R mapping, a decision has to be made between performance overhead and programming productivity according to the case. In a developer's perspective, if O/R mapping is chosen, I consistently choose the declarative LINQ code, unless this kind of overhead is unacceptable.

    Data retrieving overhead

    After talking about the O/R mapping specific issue, now look into the LINQ to SQL specific issues, for example, performance in the data retrieving process. The previous post has explained that the SQL translating and executing is complex. Actually, the LINQ to SQL pipeline is similar to a compiler pipeline. It consists of about 15 steps to translate a C# expression tree to a SQL statement, which can be categorized as:

    Convert: Invoke SqlProvider.BuildQuery() to convert the tree of Expression nodes into a tree of SqlNode nodes;
    Bind: Use the visitor pattern to figure out the meanings of names according to the mapping info, like a property for a column, etc.;
    Flatten: Figure out the hierarchy of the query;
    Rewrite: For SQL Server 2000, if needed;
    Reduce: Remove the unnecessary information from the tree;
    Format: Generate the SQL statement string;
    Parameterize: Figure out the parameters; for example, a reference to a local variable should be a parameter in SQL;
    Materialize: Execute the reader and convert the result back into typed objects.

    So for each data retrieval, even one which looks simple:

        private static Product[] RetrieveProducts(int productId)
        {
            using (NorthwindDataContext database = new NorthwindDataContext())
            {
                return database.Products.Where(product => product.ProductID == productId)
                    .ToArray();
            }
        }

    LINQ to SQL goes through the above steps to translate and execute the query. Fortunately, there is a built-in way to cache the translated query.

    Compiled query

    When such a LINQ to SQL query is executed repeatedly, CompiledQuery can be used to translate the query once and execute it multiple times:

        internal static class CompiledQueries
        {
            private static readonly Func<NorthwindDataContext, int, Product[]> _retrieveProducts =
                CompiledQuery.Compile((NorthwindDataContext database, int productId) =>
                    database.Products.Where(product => product.ProductID == productId).ToArray());

            internal static Product[] RetrieveProducts(
                this NorthwindDataContext database, int productId)
            {
                return _retrieveProducts(database, productId);
            }
        }

    The new version of RetrieveProducts() gets better performance, because only when _retrieveProducts is invoked for the first time does it internally invoke SqlProvider.Compile() to translate the query expression. It also uses a lock to make sure the translation happens only once in multi-threading scenarios.

    Static SQL / stored procedures without translating

    Another way to avoid the translating overhead is to use static SQL or stored procedures, just as in the above examples. Because this is a functional programming series, this article does not dive into that. For the details, Scott Guthrie already has some excellent articles: LINQ to SQL (Part 6: Retrieving Data Using Stored Procedures), LINQ to SQL (Part 7: Updating our Database using Stored Procedures), and LINQ to SQL (Part 8: Executing Custom SQL Expressions).

    Data changing overhead

    Looking into the data updating process, it also needs a lot of work:

    Begins a transaction;
    Processes the changes (ChangeProcessor): walks through the objects to identify the changes, and determines the order of the changes;
    Executes the changes. LINQ queries may be needed to execute the changes; like the first example in this article, an object needs to be retrieved before being changed, so the above whole process of data retrieving is gone through again;
    If there is user customization, it is executed; for example, a table's INSERT / UPDATE / DELETE can be customized in the O/R designer.

    It is important to keep this overhead in mind.

    Bulk deleting / updating

    Another thing to be aware of is bulk deleting:

        private static void DeleteProducts(int categoryId)
        {
            using (NorthwindDataContext database = new NorthwindDataContext())
            {
                database.Products.DeleteAllOnSubmit(
                    database.Products.Where(product => product.CategoryID == categoryId));
                database.SubmitChanges();
            }
        }

    The expected SQL should be something like:

        BEGIN TRANSACTION
        exec sp_executesql N'DELETE FROM [dbo].[Products] AS [t0]
        WHERE [t0].[CategoryID] = @p0',N'@p0 int',@p0=9
        COMMIT TRANSACTION

    However, as mentioned before, the actual SQL retrieves the entities, and then deletes them one by one:

        -- Retrieves the entities to be deleted:
        exec sp_executesql N'SELECT [t0].[ProductID], [t0].[ProductName], [t0].[SupplierID],
        [t0].[CategoryID], [t0].[QuantityPerUnit], [t0].[UnitPrice], [t0].[UnitsInStock],
        [t0].[UnitsOnOrder], [t0].[ReorderLevel], [t0].[Discontinued]
        FROM [dbo].[Products] AS [t0]
        WHERE [t0].[CategoryID] = @p0',N'@p0 int',@p0=9

        -- Deletes the retrieved entities one by one:
        BEGIN TRANSACTION
        exec sp_executesql N'DELETE FROM [dbo].[Products] WHERE ([ProductID] = @p0)
        AND ([ProductName] = @p1) AND ([SupplierID] IS NULL) AND ([CategoryID] = @p2)
        AND ([QuantityPerUnit] IS NULL) AND ([UnitPrice] = @p3) AND ([UnitsInStock] = @p4)
        AND ([UnitsOnOrder] = @p5) AND ([ReorderLevel] = @p6) AND (NOT ([Discontinued] = 1))',
        N'@p0 int,@p1 nvarchar(4000),@p2 int,@p3 money,@p4 smallint,@p5 smallint,@p6 smallint',
        @p0=78,@p1=N'Optimus Prime',@p2=9,@p3=$0.0000,@p4=0,@p5=0,@p6=0
        exec sp_executesql N'DELETE FROM [dbo].[Products] WHERE ([ProductID] = @p0)
        AND ([ProductName] = @p1) AND ([SupplierID] IS NULL) AND ([CategoryID] = @p2)
        AND ([QuantityPerUnit] IS NULL) AND ([UnitPrice] = @p3) AND ([UnitsInStock] = @p4)
        AND ([UnitsOnOrder] = @p5) AND ([ReorderLevel] = @p6) AND (NOT ([Discontinued] = 1))',
        N'@p0 int,@p1 nvarchar(4000),@p2 int,@p3 money,@p4 smallint,@p5 smallint,@p6 smallint',
        @p0=79,@p1=N'Bumble Bee',@p2=9,@p3=$0.0000,@p4=0,@p5=0,@p6=0
        -- ...
        COMMIT TRANSACTION

    And the same goes for bulk updating. This is really not efficient, and it needs to be kept in mind. There are already some solutions on the Internet, like this one. The idea is to wrap the above SELECT statement into an INNER JOIN:

        exec sp_executesql N'DELETE [dbo].[Products] FROM [dbo].[Products] AS [j0]
        INNER JOIN (
        SELECT [t0].[ProductID], [t0].[ProductName], [t0].[SupplierID], [t0].[CategoryID],
        [t0].[QuantityPerUnit], [t0].[UnitPrice], [t0].[UnitsInStock], [t0].[UnitsOnOrder],
        [t0].[ReorderLevel], [t0].[Discontinued]
        FROM [dbo].[Products] AS [t0]
        WHERE [t0].[CategoryID] = @p0) AS [j1]
        ON ([j0].[ProductID] = [j1].[ProductID])', -- The primary key
        N'@p0 int',@p0=9

    Query plan overhead

    The last thing is about the SQL Server query plan. Before .NET 4.0, LINQ to SQL had an issue (not sure if it is a bug). LINQ to SQL internally uses ADO.NET, but it does not set the SqlParameter.Size for a variable-length argument, like an argument of NVARCHAR type, etc. So for two queries with the same SQL but different argument lengths:

        using (NorthwindDataContext database = new NorthwindDataContext())
        {
            database.Products.Where(product => product.ProductName == "A")
                .Select(product => product.ProductID).ToArray();

            // The same SQL and argument type, different argument length.
            database.Products.Where(product => product.ProductName == "AA")
                .Select(product => product.ProductID).ToArray();
        }

    Pay attention to the argument length in the translated SQL:

        exec sp_executesql N'SELECT [t0].[ProductID]
        FROM [dbo].[Products] AS [t0]
        WHERE [t0].[ProductName] = @p0',N'@p0 nvarchar(1)',@p0=N'A'

        exec sp_executesql N'SELECT [t0].[ProductID]
        FROM [dbo].[Products] AS [t0]
        WHERE [t0].[ProductName] = @p0',N'@p0 nvarchar(2)',@p0=N'AA'

    Here is the overhead: the first query's cached query plan is not reused by the second one:

        SELECT sys.syscacheobjects.cacheobjtype, sys.dm_exec_cached_plans.usecounts, sys.syscacheobjects.[sql]
        FROM sys.syscacheobjects
        INNER JOIN sys.dm_exec_cached_plans
        ON sys.syscacheobjects.bucketid = sys.dm_exec_cached_plans.bucketid;

    They actually use different query plans. Again, pay attention to the argument length in the [sql] column (@p0 nvarchar(2) / @p0 nvarchar(1)). Fortunately, in .NET 4.0 this is fixed:

        internal static class SqlTypeSystem
        {
            private abstract class ProviderBase : TypeSystemProvider
            {
                protected int? GetLargestDeclarableSize(SqlType declaredType)
                {
                    SqlDbType sqlDbType = declaredType.SqlDbType;
                    if (sqlDbType <= SqlDbType.Image)
                    {
                        switch (sqlDbType)
                        {
                            case SqlDbType.Binary:
                            case SqlDbType.Image:
                                return 8000;
                        }
                        return null;
                    }
                    if (sqlDbType == SqlDbType.NVarChar)
                    {
                        return 4000; // Max length for NVARCHAR.
                    }
                    if (sqlDbType != SqlDbType.VarChar)
                    {
                        return null;
                    }
                    return 8000;
                }
            }
        }

    In the above example, the translated SQL becomes:

        exec sp_executesql N'SELECT [t0].[ProductID]
        FROM [dbo].[Products] AS [t0]
        WHERE [t0].[ProductName] = @p0',N'@p0 nvarchar(4000)',@p0=N'A'

        exec sp_executesql N'SELECT [t0].[ProductID]
        FROM [dbo].[Products] AS [t0]
        WHERE [t0].[ProductName] = @p0',N'@p0 nvarchar(4000)',@p0=N'AA'

    So they reuse the same cached query plan: now the [usecounts] column is 2.
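
    For the bulk-delete case above, a lighter alternative is a sketch using the DataContext.ExecuteCommand API the article already introduced, so a single set-based, parameterized DELETE reaches SQL Server instead of one DELETE per retrieved row:

        private static void DeleteProducts(int categoryId)
        {
            using (NorthwindDataContext database = new NorthwindDataContext())
            {
                // One set-based DELETE; no entities are materialized first.
                database.ExecuteCommand(
                    "DELETE FROM [dbo].[Products] WHERE [CategoryID] = {0}", categoryId);
            }
        }

    The trade-off is the same one the article names at the start: this bypasses the change tracker, so any in-memory entities for those rows go stale.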

  • Dilemma for growing a project: Open source volunteer developers VS closed source paid / revshare developers? [closed]

    - by giorgio79
    I am trying to grow my project, and I am vacillating between some examples. The options seem to be:

    1. Open sourcing the project to draw volunteer developers. Pros: this would mean anyone can try to make some money off the code, which would motivate them to contribute back and grow the project. Cons: existing, bigger players could easily copy and paste my work so far. They can also replicate it without having access to the code, but that would take more time. I also thought of using the AGPL license, but again, code can still be copied without redistribution. After all, enforcing a license costs a lot of money, and I cannot just say to a possible copycat, "it seems you copied my code, show me what you got."

    2. Keeping the project closed source, but creating some kind of developer program where contributors get a revenue share. Pros: I keep the main rights to the project, but still generate interest by creating a developer program. No one can copy the code easily - only with considerable effort - but contributions stay easy as a breeze. I am also seeing many companies open source only part of their projects; Acquia does not open source its multisite setup, for instance, and GitHub does not open source some of its core business. Cons: less attention from committed open source devs.

    Conclusion: option 2 seems the most secure, but I would love some feedback.

  • The Red Gate and .NET Reflector Debacle

    - by Rick Strahl
    About a month ago Red Gate - the company that owns the .NET Reflector tool most .NET devs use at one point or another - decided to change their business model for Reflector and take the product from free to a fully paid-for license model. As a bit of history: .NET Reflector was originally created by Lutz Roeder as a free community tool to inspect .NET assemblies. Using Reflector you can examine the types in an assembly, drill into type signatures and quickly disassemble code to see how a particular method works. In case you've been living under a rock and you've never looked at Reflector: drilled into an assembly from disk with some disassembled source code showing, you get tons of information about each element in the tree, and almost all related types and members are clickable both in the list and source view, so it's extremely easy to navigate and follow the code flow even in this static assembly-only view.

    For many years Lutz kept the tool up to date and added more features, gradually improving an already amazing tool and making it better. Then about two and a half years ago Red Gate bought the tool from Lutz. A lot of ruckus and noise ensued in the community back then about what would happen with the tool and... for the most part very little did. Other than the incessant update notices with prominent Red Gate promo on them, life with Reflector went on. The product didn't die and it didn't go commercial or to a charge model. When .NET 4.0 came out it still continued to work, mostly because the .NET feature set doesn't drastically change how types behave.

    Then a month back Red Gate started making noise about a new version - Version 7 - which would be commercial. No more free version, and a shit storm broke out in the community. Now normally I'm not one to be critical of companies trying to make money from a product, much less for a product that's as incredibly useful as Reflector. There isn't a day in .NET development that goes by for me where I don't fire up Reflector, whether it's for examining the innards of the .NET Framework, checking out third party code, or verifying some of my own code and resources. Even more so recently, I've been doing a lot of Interop work with a non-.NET application that needs to access .NET components, and Reflector has been immensely valuable to me (and my clients) in figuring out the exact type signatures required to call .NET components in assemblies. In short, Reflector is an invaluable tool to me.

    Ok, so what's the problem? Why all the fuss? Certainly the $39 Red Gate is trying to charge isn't going to kill any developer. If there's any tool in .NET that's worth $39 it's Reflector, right? Right, but that's not the problem here. The problem is how Red Gate went about moving the product to commercial, which borders on the downright bizarre. It's almost as if somebody in management wrote a slogan: "How can we piss off the .NET community in the most painful way we can?" And at that, it seems, Red Gate has utterly succeeded. People are rabid, and for once I think that this outrage isn't exactly misplaced.

    Take a look at the message thread that Red Gate dedicated from a link off the download page. Not only is Version 7 going to be a paid commercial tool, but the older versions of Reflector won't be available any longer. Not only that, but older versions that are already in use will also continually try to update themselves to the new paid version, which, when installed, will then expire unless registered properly. There have also been reports of Version 6 installs shutting themselves down and failing to work if the update is refused (I haven't seen that myself, so I'm not sure if it's true). In other words, Red Gate is trying to make damn sure they're getting your money if you attempt to use Reflector.

    There's a lot of temptation there. Think about the millions of .NET developers out there and all of them possibly upgrading - that's a nice chunk of change that Red Gate's sitting on. Even with all the community backlash, these guys are probably making some bank right now, just because people need to get on with life. Red Gate also put up a Feedback link on the download page, which, not surprisingly, is chock full of hate mail condemning the move. Oddly, there's not a single response to any of those messages by the Red Gate folks, except where it concerns license questions for the full version. It puzzles me what purpose that link serves other than as yet another complete example of failure to understand how to handle customer relations.

    There's no doubt that all of this has caused some serious outrage in the community. The sad part, though, is that this could have been handled so much less arrogantly, without pissing off the entire community and causing so much ill will. People are pissed off, and I have no doubt that this negative publicity will show up in the sales numbers for their other products. I certainly hope so. Stupidity ought to be painful!

    Why do companies do boneheaded stuff like this?

    Red Gate's original decision to buy Reflector was hotly debated, but at the time most of what would happen was mostly speculation. I thought it was a smart move for any company in need of spreading its marketing message and corporate image as a vendor in the .NET space. Where else do you get to flash your corporate logo to hordes of .NET developers on a regular basis? Exploiting that marketing with the goodwill of providing a free tool breeds positive feedback that hopefully has a good effect on the company's visibility and the products it sells. Instead, Red Gate seems to have taken exactly the opposite tack of corporate bullying to try to make a quick buck, and in the process ruined any community goodwill that might have come from providing a community service for free while still getting valuable marketing. What's so puzzling about this boneheaded escapade is that the company doesn't need to resort to underhanded tactics like what they are trying with Reflector 7. The tools the company makes are very good. I personally use SQL Compare, SQL Data Compare and ANTS Profiler on a regular basis, and all of these tools are essential in my toolbox. They certainly work much better than the tools that are in the box with Visual Studio. Chances are that if Reflector 7 added useful features, I would have been more than happy to shell out my $39 to upgrade when the time was right.

    It's expensive to give away stuff for free

    At the same time, this episode shows some of the big problems that come with 'free' tools. A lot of organizations are realizing that giving stuff away for free is actually quite expensive, and the payback is often very intangible, if there is any at all. Those that rely on donations or other voluntary compensation find that the amount contributed is absolutely miniscule, so small as to not matter at all. Yet at the same time, I bet most of those clamouring loudest on that Red Gate Reflector feedback page about Reflector no longer being free have probably NEVER made a donation to any open source project or free tool, ever. The expectation of Free these days is just too great - which is a shame, I think. There's a lot to be said for paid software and having somebody to hold responsible because you gave them some money. There's an incentive -> payback -> responsibility model that seems to be missing from free software (not all of it, but a lot of it). While there certainly are plenty of bad apples in paid software as well, money tends to be a good motivator for people to continue working on and improving products.

    Reasons for giving away stuff are many, but often it's a naive desire to share things when things are simple. At first it might be no problem to volunteer time and effort, but as products mature the fun goes out of it, and as the reality of product maintenance kicks in, developers want to get something back for the time and effort they're putting into non-glamorous work. That's when products die or languish, and this is painful for all to watch. For Red Gate, however, I think there was always a pretty good payback from the Reflector acquisition in terms of marketing: visibility and possible positioning of their products, although they seem to have mostly ignored that option. On the other hand, they started this off pretty badly even two and a half years back, when they acquired Reflector from Lutz with the same arrogant attitude that is evident in the latest episode. You really gotta wonder what folks are thinking in management - the sad part is, from advance emails that were circulating, they were fully aware of the shit storm they were inciting with this, and I suspect they are banking on the sheer numbers of .NET developers to still make them a tidy chunk of change from upgrades...

    Alternatives are coming

    For me personally the single license isn't a problem, but I actually have a tool that I sell (an Interop web service proxy generation tool) to customers, and one of the things I recommend to use with it has been Reflector, to view assembly information and to find which Interop classes to instantiate from the non-.NET environment. It's been nice to use Reflector for this, with its small footprint and zero-configuration installation. But now, with V7 becoming a paid tool, that option is not going to be available anymore. Luckily, it looks like the .NET community is jumping to it and trying to fill the void. Amidst the Red Gate outrage, a new library called ILSpy has sprung up, providing at least some of the core functionality of Reflector as an open source library. It looks promising going forward, and I suspect there will be a lot more support and interest in this project now that Reflector has gone over to the 'dark side'...

    © Rick Strahl, West Wind Technologies, 2005-2011

  • Data Quality Through Data Governance

    Data Quality Governance

    Data quality is very important to every organization; bad data costs an organization time, money, and resources, losses that could be prevented if the proper governance were put in place.

    Data Governance Program Criteria:

    Support from executive management and all business units
    Data stewardship program
    Cross-functional team of data stewards
    Data governance committee
    Quality structured data

    It should go without saying, but any successful project in today's business world must get buy-in from executive management and all stakeholders involved with the project. If management does not fully support a project, because they do not see that it is in their and the company's best interest, then they will remove or eliminate the funding, resources, and time allocated to work on it. In essence, they can render a project dead until it is officially killed by the business. In addition, buy-in from stakeholders is also very important, because stakeholders who do not support a project can cause delays and increased spending of time, money, and resources.

    Data stewardship programs are administered by a data steward manager whose primary focus is to support, train, and manage a cross-functional team of data stewards. A cross-functional team of data stewards, pulled from various departments, acts to ensure that all systems work toward achieving the organization's goals. Typically, data stewards are subject matter experts who act as mediators between their respective departments and IT.

    Data Quality Procedures

    Data governance committees are composed of data stewards, upper management, IT leadership, and various subject matter experts, depending on the company. The primary goal of this committee is to define strategic goals, coordinate activities, set data standards, and offer data guidelines for the business.

    Data Quality Policies

    In 1997, Claudia Imhoff defined a data steward's responsibilities as: approving business naming standards, developing consistent data definitions, determining data aliases, developing standard calculations and derivations, documenting the business rules of the corporation, monitoring the quality of the data in the data warehouse, defining security requirements, and so forth. She further explains that data stewards are responsible for creating and enforcing policies on the following (but not limited to these) issues:

    Resolving data integration issues
    Determining data security
    Documenting data definitions, calculations, summarizations, etc.
    Maintaining/updating business rules
    Analyzing and improving data quality

  • Cost to licence characters or ships for a game

    - by Michael Jasper
    I am producing a game pitch document for a university game design class, and I am looking for examples of licencing costs for using characters or ships from other IP holders in a game. For example:

    the cost of using an X-Wing in a game, licenced from Lucas
    the cost of using the Enterprise in a game, licenced from Paramount
    the cost of using the Space Shuttle (if any), licenced from NASA

    EDIT: The closest information I can find is from an article about Star Wars: The Old Republic, but it isn't nearly specific enough for my needs: "What Kotick means by Lucas being the principal beneficiary of the success of The Old Republic is that there are most likely clauses in the license agreement that give percentages, points, or another denomination of revenue out to Lucas and his people just for the Star Wars name, and that amount is presumed to be a great deal of money. Kotick is saying that because the cost of the license is so prohibitive, as he has personally had experience with in his position as CEO of Activision Blizzard, EA will not be able to be profitable because of the hemorrhaging of money to the licensor."

    EDIT 2: Another vague source states that FOX uses a "five-figure rule" (presumably between $10,000 and $99,999): "It seems FOX, like most studios, will not license individuals to create new works based upon their products. They will only commission individuals of their choosing if they elect to branch out into expanded product lines related to those licenses. Alternately, they are open to making the licencing available to large corporations with access to global markets, but only if those corporations agree to what Ms Friedman called a 'five-figure guarantee'. Presumably this means that the corporation seeking the licensing must agree to pay a five-figure sum for that license, and be confident that their product will sell enough volume to recoup that fee, and to produce sufficient profits to make the acquisition worth their while."

    Thank you!

  • Useful certifications for a young programmer

    - by Alain
    As @Paddyslacker elegantly stated in "Are certifications worth it?": the main purpose of certifications is to make money for the certifying body. I am a fairly young developer with only an undergraduate degree, and my job is (graciously) offering to sponsor some professional development of my choice (provided it can be argued that it will contribute to the quality of the work I do for them). A search online offers a slew of (mostly worthless) certifications one can attain. I'm wondering if there are any that are actually recognized in the (North American) industry as an asset. My local university promoted CIPS (I.S.P., ITCP) at the time I was graduating, but for all I can tell it's just the one that happened to get its foot in the door. It's certainly money-grubbing, with a $205-a-year fee. So are there any such certifications that provide useful credentials? To better define 'useful': would it benefit full-time developers, or is it only worthwhile to the self-employed? Would any certifications lead to my being considered for higher wages, or can that only be achieved with more experience and a higher-level degree?

  • I want to sell my software [C# desktop application] but I was Stuck in licencing [closed]

    - by Surendra Soni
    Possible Duplicate: if I use the .NET Framework for my application, do I have to pay anything to Microsoft? I have developed a desktop application called "Institute Management System", which has modules like class manager, subject manager, topics manager, student inquiry manager, student admission manager, fees manager, exam manager, etc., using C# for the front end and MS Access for the back end. The main problem is that I want to sell it and earn some money, but I heard that my application needs to be registered with Microsoft, and that I would have to get a license from them to sell it and pay them money too. I spent four months developing it at my own expense and worked very hard on it. So I want some tips, advice, or suggestions about this. Please also tell me the procedure for all of the required registrations and payment issues. And can you suggest any other technology where I could develop my application and sell it without worrying about licensing and related issues? I am now more confused about whether MS technology is open source or not.

  • Advices fo starting a video game design career

    - by Allen Gabriel Baker
    I'm 24 and have a passion for video games and game design, and I've decided I want to design video games as my career. I have no experience with designing video games or coding, but I'm interested and willing to learn. I want a job at any level, but what would I need to land one? I have no college experience and I have no money. What is a cheap school? Do I really need to go to school for this, or can I learn on my own? Is it possible to do this with no money? I'm literally broke, but I want this so badly that I feel it's the only career I'll enjoy. I want to call up companies and ask them what they are looking for in someone they want to hire - is that a good idea? Also, I don't know the history of video game design, and I don't want to sound like a dummy when someone mentions something about this field or talks about a famous designer and I have no idea who they're talking about. So what is the key info when it comes to this field, and where should I find it? Hopefully some of you guys and girls can help me out. I know I'm gifted, and I want to share my gift with the rest of the world by making games that change the industry. Help me out, please.

  • Friday Fun: Building Blasters 2

    - by Mysticgeek
    After dealing with unnecessary spreadsheets and TPS reports all week, it's time to waste time playing a flash game. Today we take a look at Building Blasters 2, where you strategically place explosives to bring down structures. In Building Blasters 2, you need to place explosives carefully to clear areas in the red level, keep bystanders safe, and manage your budget. After placing the explosives on the structure, you can set the amount of time that passes before they blow, which comes in handy when you reach advanced levels. When you're ready to start the demolition, click on the Detonate button and watch the buildings fall. If you don't achieve the objectives, you will get the Demolition Error screen and can replay the level. Once you've earned enough money, you'll get a message between missions telling you there is enough money to buy items in the shop, where you can pick up enhanced destructive devices such as nitroglycerin and a wrecking ball, call in an air strike, and more. If you're sick of the pointy-haired boss dragging you down all week, pretend the structures are the office building and destroy away. Building Blasters 2 is a great way to have fun and let off steam so you can enjoy your weekend. Play Building Blasters 2. For additional fun games to play, make sure to check out the How-To Geek Arcade.

  • Google Loon–A network of balloons to provide internet to everyone

    - by Gopinath
    Google, once just a super-powerful search engine provider, is now venturing into a lot of interesting non-software projects, like self-driving cars, glasses that beam information right onto your eyeballs, and high-speed internet service at 1 gigabit per second. A recent addition to this innovative list is Google Loon - a network of flying balloons that provide internet access to remote parts of the world where it is not feasible for many governments or corporations to provide internet service. Google says there are several billion people around the world who don't have access to the internet, and Google Loon's aim is to provide internet facilities to all of them. A pilot project started a couple of days ago with the launch of 30 balloons into the stratosphere from New Zealand. These balloons fly 20 kilometers above the earth (much higher than where airplanes fly), and they beam internet wirelessly to homes that have a Loon receiver. Check out the embedded introductory video on Google Loon. What is in it for Google? Why is Google getting into these types of projects, and what is in it for them? Google is the gateway to the web, and the majority of people find information on the web using Google services and software. So providing internet facilities to more people means more people using Google services, which in turn contributes to their revenue growth. Google is not a charity; they do all these projects to earn money, just like every other corporation. The best part is that while earning money they are touching the lives of billions of people in a positive way. Just imagine everyone in the world connected, with the ability to make informed decisions, irrespective of whether they live in developed countries or underprivileged parts of the world! Wow, that will be a beautiful day.

    Further reading:

    Google Loon website
    Google unveils its Project Loon Wi-Fi balloons - in pictures
    Google flies Internet balloons in stratosphere for a "network in the sky"
    How Google Will Use High-Flying Balloons to Deliver Internet to the Hinterlands
    Good discussion on Google Loon at Hacker News community

  • Why is it that software is still easily pirated today?

    - by mohabitar
    I've always been curious about this. Now, I wouldn't call myself a programmer yet, but I'm learning, so maybe the answer to this is obvious. It just seems a little hard to believe that with all of our technological advances and the billions of dollars spent on engineering the most unbelievable and mind-blowing software, we still have no other means of protecting against piracy than a "serial number/activation key". I'm sure a ton of money, maybe even billions, went into creating Windows 7 or Office and even Snow Leopard, yet I can get them for free in less than 20 minutes. The same goes for all of Adobe's products, which are probably the easiest. I guess my question is: can there exist a foolproof and hack-proof method of protecting your software against piracy? If not realistically, is it at least theoretically possible? Or, no matter what mechanisms these companies deploy, can hackers always find a way around them? EDIT: So apparently, the answer is no; there's pretty much no way, and I'm sure these big companies have realized this as well. Should they adopt another sales model rather than charging a crapload for their software (I know it's justified, and they put a lot of hard work into their software, but it's still a lot of money)? Are there any alternative solutions that would benefit both the company and the user (e.g., if you purchase our product, we'll apply $X to your account toward future purchases from our company)?
