Search Results

Search found 309 results on 13 pages for 'coin bird'.

Page 12 of 13

  • How can I check whether the mouse button has been released, and THEN execute a procedure only once, in Borland Pascal?

    - by Robert
    Hi! I use Borland Pascal 7.0 and I would like to make a slots game (if 3 random numbers are the same, you win). The problem is that when I click the start (Inditas) button on the menu, the procedure executes many times until I release the mouse button. I was told that I should check whether the mouse button has been released before executing the procedure once. How can I do that? Here is what the menu looks like: procedure eger; begin mouseinit; mouseon; menu; repeat getmouse(m); if (m.left) and (m.x>60) and (m.x<130) and (m.y>120) and (m.y<150) then teglalap(90,90,300,300,blue); if (m.left) and (m.x>60) and (m.x<130) and (m.y>160) and (m.y<190) then jatek(a,b,c,coin,coins); until ((m.left) and (m.x>60) and (m.x<130) and (m.y>240) and (m.y<270)); end; Thanks, Robert
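
    The usual fix is to react to the state change rather than to the state itself: remember whether the button was down on the previous poll and fire the procedure only on the down-to-up transition. Below is a minimal sketch of that edge-detection idea, written in Java with a simulated sequence of polled states (in the Pascal program the polled value would be m.left after getmouse(m)):

        // Fire an action exactly once per click by reacting to the pressed->released
        // transition instead of to "button is currently down". The polled states here
        // are simulated so the example is runnable on its own.
        public class ClickOnce {
            public static void main(String[] args) {
                boolean[] polledStates = {false, true, true, true, false, false, true, false};
                boolean wasDown = false;
                for (boolean isDown : polledStates) {
                    if (wasDown && !isDown) {          // was down on the last poll, released now
                        System.out.println("Button released -> run the procedure once");
                    }
                    wasDown = isDown;                  // remember the state for the next poll
                }
            }
        }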

    Read the article

  • Modeling complex hierarchies

    - by jdn
    To gain some experience, I am trying to build an expert system that can answer queries about the animal kingdom. However, I have run into a problem modeling the domain. I originally drew the animal kingdom hierarchy like this:

    - animal
      - bird
        - carnivore
          - hawk
        - herbivore
          - bluejay
      - mammals
        - carnivores
        - herbivores

    This, I figured, would make queries like "give me all birds" easy, but "give me all carnivores" would be much more expensive, so I rewrote the hierarchy to look like:

    - animal
      - carnivore
        - birds
          - hawk
        - mammals
          - xyz
      - herbivores
        - birds
          - bluejay
        - mammals

    But now "give me all birds" will be much slower. This is of course a simple example, but it made me realize that I don't really know how to model complex relationships that are not strictly hierarchical in nature, in the context of writing an expert system that answers queries like the ones above. A directed, cyclic graph seems like it could solve the problem mathematically, but storing it in a relational database and maintaining it (handling updates) seems like a nightmare to me. I would like to know how people typically model such things. Explanations or pointers to resources for further reading would be appreciated.
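
    One common way out of the single-tree trap is to stop forcing taxonomy and diet into one hierarchy and instead tag each animal with every facet it belongs to (or, equivalently, store the graph with a closure/index table in the relational database). A minimal Java sketch of the faceted idea, with illustrative names, where "give me all birds" and "give me all carnivores" are both single lookups:

        import java.util.*;

        // One index per facet: adding an animal registers it under each facet it has,
        // so queries by taxonomic class and by diet are equally cheap.
        public class FacetedTaxonomy {
            static Map<String, Set<String>> byFacet = new HashMap<>();

            static void add(String animal, String... facets) {
                for (String f : facets) {
                    byFacet.computeIfAbsent(f, k -> new HashSet<>()).add(animal);
                }
            }

            public static void main(String[] args) {
                add("hawk", "bird", "carnivore");
                add("bluejay", "bird", "herbivore");
                add("lion", "mammal", "carnivore");
                System.out.println("All birds:      " + byFacet.get("bird"));
                System.out.println("All carnivores: " + byFacet.get("carnivore"));
            }
        }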

    Read the article

  • Hierarchy of meaning

    - by asldkncvas
    I am looking for a method to build a hierarchy of words. Background: I am an "amateur" natural language processing enthusiast, and one of the problems I am currently interested in is determining the hierarchy of word semantics from a group of words. For example, given a set that contains a "super" representation of the others, e.g. [cat, dog, monkey, animal, bird, ...], I want a technique that would allow me to extract the word 'animal', the one that most meaningfully and accurately represents the other words in the set. Note: they are NOT the same in meaning; cat != dog != monkey != animal, BUT cat is a subset of animal and dog is a subset of animal. I know by now a lot of you will be telling me to use WordNet. I will try it, but I am actually working in a very domain-specific area to which WordNet doesn't apply, because: 1) most of the words are not found in WordNet; 2) all the words are in another language, and translation is possible but of limited effect. Another example would be [noise reduction, focal length, flash, functionality, ...], where functionality includes everything else in the set. I have also tried crawling Wikipedia pages and applying techniques such as tf-idf, but the Wikipedia pages don't really help much either. Can someone enlighten me as to what direction my research should take? (I could use anything.)
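
    One crude heuristic worth trying before heavier taxonomy-induction methods: the "super" term tends to be mentioned in the contexts (definitions, crawled snippets) of the other terms, and its own context tends to mention them, so you can score each candidate by how many of the other terms appear in its context. A toy Java sketch of that scoring, with made-up context sets standing in for crawled text:

        import java.util.*;

        // Pick the candidate whose context mentions the most of the other terms.
        // The context sets are fabricated for the example; in practice they would be
        // built from definitions, search snippets or crawled pages.
        public class HypernymGuess {
            public static void main(String[] args) {
                Map<String, Set<String>> context = new HashMap<>();
                context.put("cat",    Set.of("pet", "fur", "animal"));
                context.put("dog",    Set.of("pet", "bark", "animal"));
                context.put("bird",   Set.of("wing", "fly", "animal"));
                context.put("animal", Set.of("cat", "dog", "bird", "species"));

                String best = null;
                int bestScore = -1;
                for (String candidate : context.keySet()) {
                    int score = 0;
                    for (String other : context.keySet()) {
                        if (!other.equals(candidate) && context.get(candidate).contains(other)) {
                            score++;
                        }
                    }
                    if (score > bestScore) { bestScore = score; best = candidate; }
                }
                System.out.println("Most likely hypernym: " + best);  // prints "animal" here
            }
        }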

    Read the article

  • Accessing bitmap array in another class? C#

    - by Marius Mathisen
    I have this array: Bitmap[] bildeListe = new Bitmap[21]; bildeListe[0] = Properties.Resources.ål; bildeListe[1] = Properties.Resources.ant; bildeListe[2] = Properties.Resources.bird; bildeListe[3] = Properties.Resources.bear; bildeListe[4] = Properties.Resources.butterfly; bildeListe[5] = Properties.Resources.cat; bildeListe[6] = Properties.Resources.chicken; bildeListe[7] = Properties.Resources.dog; bildeListe[8] = Properties.Resources.elephant; bildeListe[9] = Properties.Resources.fish; bildeListe[10] = Properties.Resources.goat; bildeListe[11] = Properties.Resources.horse; bildeListe[12] = Properties.Resources.ladybug; bildeListe[13] = Properties.Resources.lion; bildeListe[14] = Properties.Resources.moose; bildeListe[15] = Properties.Resources.polarbear; bildeListe[16] = Properties.Resources.reke; bildeListe[17] = Properties.Resources.sheep; bildeListe[18] = Properties.Resources.snake; bildeListe[19] = Properties.Resources.spider; bildeListe[20] = Properties.Resources.turtle; I want that array and its contents in a different class, and I want to access it from my main form. I don't know whether I should use a method, a function, or something else for arrays. Is there a good way for me to access, for instance, bildeListe[0] from my new class?
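
    The usual shape of the answer is to give the array a home of its own: one class owns and initializes it, and the form reads entries through an accessor instead of passing the raw array around. The question's code is C#, where this would typically be a static class (or a singleton) exposing the array through a property or method; the same pattern is sketched below in Java, with placeholder strings standing in for the bitmap resources:

        // One class owns the shared list; other classes read it through accessors,
        // e.g. ImageStore.get(0) from the main form. The string entries are stand-ins
        // for the Bitmap resources in the original code.
        public class ImageStore {
            private static final String[] images = {"eel", "ant", "bird", "bear", "butterfly"};

            public static String get(int index) {
                return images[index];
            }

            public static int count() {
                return images.length;
            }

            public static void main(String[] args) {
                System.out.println(ImageStore.get(2));  // "bird"
            }
        }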

    Read the article

  • game currency convert: math efficient

    - by Comradsky
    For variables: 4 text views named diamondText, goldText, silverText, and bronzeText; a money variable, unsigned int money; and an NSTimer that every 0.1 s runs this function: -(void)updateMoney{ money++; bronzeText.text = [NSString stringWithFormat:@"%d",money]; silverText.text = [NSString stringWithFormat:@"%d",money%10]; goldText.text = [NSString stringWithFormat:@"%d",money%100]; diamondText.text = [NSString stringWithFormat:@"%d",money%1000]; } Given that my currency is diamond = 10 gold = 10 silver = 10 bronze = 1, what would be the most efficient way to calculate and display the money labels? And how would you store this variable: with GameCenter and NSDictionary, or GameCenter and something else? More details are below if this isn't clear. To clarify: bronze has the last two digits, silver the next two, and so on. I understand I could use 4 ints or an array, but I would rather try to use this approach, unless there's a much more efficient way. Example: when money = 1000, bronzeText = nothing, silverText = 10, goldText = nothing, diamondText = nothing. What other ways would you do this that you think would be more efficient? I will be calling a function (void)collisionDetector that detects whether my player.frame crosses a flyingObject.frame, and if that object is a coin it adds value to money and then calls (void)updateMoney. I'm just using the timer to test this and spawn the flying objects.
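
    Taking the stated 1:10 ratios at face value (10 bronze = 1 silver, 10 silver = 1 gold, 10 gold = 1 diamond), each denomination is one decimal digit of the total bronze value, so integer division plus modulo does all the work. The question's code is Objective-C, but the arithmetic is the same in any language; a small Java sketch:

        // Split a total bronze value into denominations, assuming 1:10:10:10 ratios.
        // Integer division strips the lower digits; modulo keeps only this digit.
        public class CurrencySplit {
            public static void main(String[] args) {
                int money = 1234;                          // total value in bronze
                int bronze  =  money         % 10;
                int silver  = (money / 10)   % 10;
                int gold    = (money / 100)  % 10;
                int diamond =  money / 1000;
                System.out.printf("%d diamond, %d gold, %d silver, %d bronze%n",
                                  diamond, gold, silver, bronze);  // 1 diamond, 2 gold, 3 silver, 4 bronze
            }
        }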

    Read the article

  • Need help with some basic Java.

    - by Racket
    Hi, I'm doing the first chapter exercises from my Java book and I have been stuck on a problem for a while now. I'll quote the question: "Prompt/read a double value representing a monetary amount. Then determine the fewest number of each bill and coin needed to represent that amount, starting with the highest (assume that a ten dollar bill is the maximum size needed). For example, if the value entered is 47.63 (forty-seven dollars and sixty-three cents), the program should print the equivalent amount as: 4 ten dollar bills 1 five dollar bills 2 one dollar bills 2 quarters 1 dimes 0 nickels 3 pennies" etc. I'm working through an example exactly as they said in order to get an idea, as you will see in the code. I managed to print the 4 (ten dollar bills), but I can't figure out how to get the "1 five dollar bills" part; I only get 7 dollars (see code). Please don't write the whole program for me, I just need some advice on what I described. Thank you. import java.util.Scanner; public class PP29 { public static void main (String[] args) { Scanner sc = new Scanner (System.in); int amount; double value; double quarter; System.out.println("Enter \"double\" value: "); value = sc.nextDouble(); amount = (int) value / 10; // 47.63 / 10 = 4 int amount2 = (int) value % 10; // 47 - 40 = 7 quarter = value * 100; // 47.63 * 100 = 4763 int sum = (int) quarter % 100; // 4763 % 100 = 63 System.out.println(amount); System.out.println(amount2); } }
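
    Without giving the whole exercise away, the pattern it is after is: convert to the smallest unit once, then for each denomination take the integer quotient and carry the remainder forward to the next one. A hint-sized Java sketch, deliberately stopping after the first two denominations:

        // Work in cents, peel off each denomination with / and carry the rest with %.
        public class ChangeHint {
            public static void main(String[] args) {
                double value = 47.63;
                int cents = (int) Math.round(value * 100);  // 4763 cents
                int tens  = cents / 1000;                   // 4 ten dollar bills
                cents    %= 1000;                           // 763 cents left
                int fives = cents / 500;                    // 1 five dollar bill
                cents    %= 500;                            // 263 cents left for ones, quarters, ...
                System.out.println(tens + " ten dollar bills, " + fives + " five dollar bills");
            }
        }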

    Read the article

  • Java Embedded @ JavaOne: Q & A

    - by terrencebarr
    There has been a lot of interest in Java Embedded @ JavaOne since it was announced a short while ago (see my previous post). As this is a new conference, we received a number of questions about it, so we put together a brief Q & A on audience focus, dates, registration, pricing, submissions, etc. Hope this helps and, remember, the Call for Papers ends next week, Jul 18th 2012! Cheers, – Terrence

    Java Embedded @ JavaOne: Q & A

    Q. Where can I learn more about "Java Embedded @ JavaOne"?
    A. Please visit: http://oracle.com/javaone/embedded

    Q. What is the purpose of "Java Embedded @ JavaOne"?
    A. This net-new event is designed to provide business and technical decision makers, as well as Java embedded ecosystem partners, a unique occasion to come together and learn about how they can use Java Embedded technologies for new business opportunities.

    Q. What broad audiences would benefit by attending "Java Embedded @ JavaOne"?
    A. Java licensees; Government agencies; ISVs; Device Manufacturers; Service Providers such as Telcos, Utilities, Healthcare, Energy, Smart Grid/Smart Metering; Automotive/Telematics; Home/Building Automation; Factory Automation; Media/TV; and Payment vendors.

    Q. What business titles would benefit by attending "Java Embedded @ JavaOne"?
    A. The ideal audience for this event is business and technical decision makers (e.g. System Integrators, CTO, CXO, Chief Architects/Architects, Business Development Managers, Project Managers, Purchasing Managers, Technical Leads, Senior Decision Makers, Practice Leads, R&D Heads, and Development Managers/Leads).

    Q. When is "Java Embedded @ JavaOne" taking place?
    A. The event takes place on Wednesday, Oct. 3rd through Thursday, Oct. 4th.

    Q. Where is "Java Embedded @ JavaOne" taking place?
    A. The event takes place in the Hotel Nikko.

    Q. Won't "Java Embedded @ JavaOne" impact the flagship JavaOne conference, since the Hotel Nikko is one of the 3 flagship JavaOne conference venue hotels?
    A. No. Separate space in the Hotel Nikko will be used for "Java Embedded @ JavaOne" and will in no way impact the scale and scope of the flagship JavaOne conference's content mix.

    Q. Will there be a call for papers for "Java Embedded @ JavaOne"?
    A. Yes. The call for papers has started but is ONLY for business-focused submissions.

    Q. What type of business submissions can I make for "Java Embedded @ JavaOne"?
    A. We are accepting 3 types of business submissions:
    - Best Practices: Java Embedded business solutions, methods, and techniques that consistently show results superior to those achieved with other means, as well as discussions on how Java Embedded can improve business operations, and increase competitive differentiation and profitability.
    - Case Studies: Discussions with Oracle customers and partners that describe the unique business drivers that convinced them to implement Java Embedded as part of an infrastructure technology mix. The discussions will highlight the issues they faced, the decision making involved, and the implementation choices made to create value and improve business differentiation.
    - Panel: Moderator-driven open discussion focused on the emerging opportunities Java Embedded offers businesses, as well as other topics such as strategy, overcoming common challenges, etc.

    Q. What is the call for papers timeline for "Java Embedded @ JavaOne"?
    A. The timeline is as follows:
    - CFP launched – June 18th
    - Deadline for submissions – July 18th
    - Notifications (accepts/declines) – week of July 29th
    - Deadline for speakers to accept speaker invitation – August 10th
    - Presentations due for review – August 31st

    Q. Where can I find more call for papers details for "Java Embedded @ JavaOne"?
    A. Please go to: http://www.oracle.com/javaone/embedded/call-for-papers/information/index.html

    Q. How much does it cost to attend "Java Embedded @ JavaOne"?
    A. The cost to attend is:
    - $595.00 U.S. — Early Bird (launch date – July 13, 2012)
    - $795.00 U.S. — Pre-Registration (July 14 – September 28, 2012)
    - $995.00 U.S. — Onsite Registration (September 29 – October 4, 2012)

    Q. Can an attendee of the flagship JavaOne event and Oracle OpenWorld attend "Java Embedded @ JavaOne"?
    A. Yes. Attendees of both the flagship JavaOne event and Oracle OpenWorld can attend "Java Embedded @ JavaOne" by purchasing a $100.00 U.S. upgrade to their full conference pass.

    Filed under: Mobile & Embedded Tagged: Call for Papers, Java Embedded @ JavaOne, JavaOne San Francisco

    Read the article

  • SQL SERVER – Solution to Puzzle – Simulate LEAD() and LAG() without Using SQL Server 2012 Analytic Function

    - by pinaldave
    Earlier I wrote a series on SQL Server Analytic Functions of SQL Server 2012. During the series to keep the learning maximum and having fun, we had few puzzles. One of the puzzle was simulating LEAD() and LAG() without using SQL Server 2012 Analytic Function. Please read the puzzle here first before reading the solution : Write T-SQL Self Join Without Using LEAD and LAG. When I was originally wrote the puzzle I had done small blunder and the question was a bit confusing which I corrected later on but wrote a follow up blog post on over here where I describe the give-away. Quick Recap: Generate following results without using SQL Server 2012 analytic functions. I had received so many valid answers. Some answers were similar to other and some were very innovative. Some answers were very adaptive and some did not work when I changed where condition. After selecting all the valid answer, I put them in table and ran RANDOM function on the same and selected winners. Here are the valid answers. No Joins and No Analytic Functions Excellent Solution by Geri Reshef – Winner of SQL Server Interview Questions and Answers (India | USA) WITH T1 AS (SELECT Row_Number() OVER(ORDER BY SalesOrderDetailID) N, s.SalesOrderID, s.SalesOrderDetailID, s.OrderQty FROM Sales.SalesOrderDetail s WHERE SalesOrderID IN (43670, 43669, 43667, 43663)) SELECT SalesOrderID,SalesOrderDetailID,OrderQty, CASE WHEN N%2=1 THEN MAX(CASE WHEN N%2=0 THEN SalesOrderDetailID END) OVER (Partition BY (N+1)/2) ELSE MAX(CASE WHEN N%2=1 THEN SalesOrderDetailID END) OVER (Partition BY N/2) END LeadVal, CASE WHEN N%2=1 THEN MAX(CASE WHEN N%2=0 THEN SalesOrderDetailID END) OVER (Partition BY N/2) ELSE MAX(CASE WHEN N%2=1 THEN SalesOrderDetailID END) OVER (Partition BY (N+1)/2) END LagVal FROM T1 ORDER BY SalesOrderID, SalesOrderDetailID, OrderQty; GO No Analytic Function and Early Bird Excellent Solution by DHall – Winner of Pluralsight 30 days Subscription -- a query to emulate LEAD() and LAG() ;WITH s AS ( SELECT 1 AS ldOffset, -- equiv to 2nd param of LEAD 1 AS lgOffset, -- equiv to 2nd param of LAG NULL AS ldDefVal, -- equiv to 3rd param of LEAD NULL AS lgDefVal, -- equiv to 3rd param of LAG ROW_NUMBER() OVER (ORDER BY SalesOrderDetailID) AS row, SalesOrderID, SalesOrderDetailID, OrderQty FROM Sales.SalesOrderDetail WHERE SalesOrderID IN (43670, 43669, 43667, 43663) ) SELECT s.SalesOrderID, s.SalesOrderDetailID, s.OrderQty, ISNULL( sLd.SalesOrderDetailID, s.ldDefVal) AS LeadValue, ISNULL( sLg.SalesOrderDetailID, s.lgDefVal) AS LagValue FROM s LEFT OUTER JOIN s AS sLd ON s.row = sLd.row - s.ldOffset LEFT OUTER JOIN s AS sLg ON s.row = sLg.row + s.lgOffset ORDER BY s.SalesOrderID, s.SalesOrderDetailID, s.OrderQty No Analytic Function and Partition By Excellent Solution by DHall – Winner of Pluralsight 30 days Subscription /* a query to emulate LEAD() and LAG() */ ;WITH s AS ( SELECT 1 AS LeadOffset, /* equiv to 2nd param of LEAD */ 1 AS LagOffset, /* equiv to 2nd param of LAG */ NULL AS LeadDefVal, /* equiv to 3rd param of LEAD */ NULL AS LagDefVal, /* equiv to 3rd param of LAG */ /* Try changing the values of the 4 integer values above to see their effect on the results */ /* The values given above of 0, 0, null and null behave the same as the default 2nd and 3rd parameters to LEAD() and LAG() */ ROW_NUMBER() OVER (ORDER BY SalesOrderDetailID) AS row, SalesOrderID, SalesOrderDetailID, OrderQty FROM Sales.SalesOrderDetail WHERE SalesOrderID IN (43670, 43669, 43667, 43663) ) SELECT s.SalesOrderID, s.SalesOrderDetailID, s.OrderQty, 
ISNULL( sLead.SalesOrderDetailID, s.LeadDefVal) AS LeadValue, ISNULL( sLag.SalesOrderDetailID, s.LagDefVal) AS LagValue FROM s LEFT OUTER JOIN s AS sLead ON s.row = sLead.row - s.LeadOffset /* Try commenting out this next line when LeadOffset != 0 */ AND s.SalesOrderID = sLead.SalesOrderID /* The additional join criteria on SalesOrderID above is equivalent to PARTITION BY SalesOrderID in the OVER clause of the LEAD() function */ LEFT OUTER JOIN s AS sLag ON s.row = sLag.row + s.LagOffset /* Try commenting out this next line when LagOffset != 0 */ AND s.SalesOrderID = sLag.SalesOrderID /* The additional join criteria on SalesOrderID above is equivalent to PARTITION BY SalesOrderID in the OVER clause of the LAG() function */ ORDER BY s.SalesOrderID, s.SalesOrderDetailID, s.OrderQty No Analytic Function and CTE Usage Excellent Solution by Pravin Patel - Winner of SQL Server Interview Questions and Answers (India | USA) --CTE based solution ; WITH cteMain AS ( SELECT SalesOrderID, SalesOrderDetailID, OrderQty, ROW_NUMBER() OVER (ORDER BY SalesOrderDetailID) AS sn FROM Sales.SalesOrderDetail WHERE SalesOrderID IN (43670, 43669, 43667, 43663) ) SELECT m.SalesOrderID, m.SalesOrderDetailID, m.OrderQty, sLead.SalesOrderDetailID AS leadvalue, sLeg.SalesOrderDetailID AS leagvalue FROM cteMain AS m LEFT OUTER JOIN cteMain AS sLead ON sLead.sn = m.sn+1 LEFT OUTER JOIN cteMain AS sLeg ON sLeg.sn = m.sn-1 ORDER BY m.SalesOrderID, m.SalesOrderDetailID, m.OrderQty No Analytic Function and Co-Related Subquery Usage Excellent Solution by Pravin Patel – Winner of SQL Server Interview Questions and Answers (India | USA) -- Co-Related subquery SELECT m.SalesOrderID, m.SalesOrderDetailID, m.OrderQty, ( SELECT MIN(SalesOrderDetailID) FROM Sales.SalesOrderDetail AS l WHERE l.SalesOrderID IN (43670, 43669, 43667, 43663) AND l.SalesOrderID >= m.SalesOrderID AND l.SalesOrderDetailID > m.SalesOrderDetailID ) AS lead, ( SELECT MAX(SalesOrderDetailID) FROM Sales.SalesOrderDetail AS l WHERE l.SalesOrderID IN (43670, 43669, 43667, 43663) AND l.SalesOrderID <= m.SalesOrderID AND l.SalesOrderDetailID < m.SalesOrderDetailID ) AS leag FROM Sales.SalesOrderDetail AS m WHERE m.SalesOrderID IN (43670, 43669, 43667, 43663) ORDER BY m.SalesOrderID, m.SalesOrderDetailID, m.OrderQty This was one of the most interesting Puzzle on this blog. Giveaway Winners will get following giveaways. Geri Reshef and Pravin Patel SQL Server Interview Questions and Answers (India | USA) DHall Pluralsight 30 days Subscription Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, Readers Contribution, Readers Question, SQL, SQL Authority, SQL Function, SQL Puzzle, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL – Quick Start with Explorer Sections of NuoDB – Query NuoDB Database

    - by Pinal Dave
    This is the third post in the series of blog posts I am writing about NuoDB. NuoDB is a very innovative and easy-to-use product. I can clearly see how one can scale out NuoDB with so much ease and confidence. In my very first blog post we discussed how we can install NuoDB (link), and in my second post I discussed how we can manage the NuoDB database transaction engines and storage managers with a few clicks (link). Note: You can download NuoDB from here. In this post, we will learn how we can use the Explorer feature of NuoDB to do various SQL operations. NuoDB has a browser-based Explorer, which is very powerful and has many of the features any IDE would normally have. Let us see how it works in the following step-by-step tutorial. Go to the NuoDB Console by typing the following URL in your browser: http://localhost:8080/ It will bring you to the QuickStart screen. Make sure that you have created the sample database; if you have not, click on Create Database and create it. Now go to the NuoDB Explorer by clicking on the main tab, and it will ask you for your domain username and password. Enter domain as the username and bird as the password; alternatively, you can enter quickstart as both the username and the password. Once you enter the password you will be able to see the databases. In our example we have installed the sample database, hence you will see the Test database in the Database Hierarchy screen. When you click on the database it will ask for the database login. Note that the database login is different from the domain login, and you will have to enter your database login here. In our case the database username is dba and the password is goalie. Once you enter a valid username and password it will display your database. Expand your database further and you will notice various objects in it. Once you have explored the various objects, select any table and click on Open. When you click on Execute, it will display the SQL script to select the data from the table; the autogenerated script displays the entire result set from the database. The NuoDB Explorer is very powerful and makes the life of developers very easy. If you click on List SQL Statements it will list all the available SQL statements right away in the Query Editor; you can see the popup window in the following image. Here is the cool thing for geeks: you can even click on Query Plan and it will display the text-based query plan as well. In the case of a simple SELECT the query plan will be much simpler; however, when we write complex queries it gets very interesting. We can use the query plan tab for performance tuning of the database. Here is another feature: when we click on List Tables in NuoDB Explorer, it lists all the available tables in the query editor. This is very helpful when we are writing a long, complex query. Here is a relatively complex example I have built using inner join syntax; right below it I have displayed the query plan, which shows all the little details related to the query. Well, we just wrote a multi-table query and executed it against the NuoDB database. You can use the NuoDB Admin section to do various analyses of the query and its performance. NuoDB is a distributed database built on a patented emergent architecture with full support for SQL and ACID guarantees. It allows you to add Transaction Engine processes to a running system to improve the performance of your system. You can also add a second Storage Engine to your running system for redundancy purposes. Conversely, you can shut down processes when you don't need the extra database resources. NuoDB also provides developers and administrators with a single intuitive interface for centrally monitoring deployments. If you have read my blog posts and have not tried out NuoDB, I strongly suggest that you download it today and catch up on the learning with me. Trust me: though the product is very powerful, it is extremely easy to learn and use. Reference: Pinal Dave (http://blog.sqlauthority.com)   Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: NuoDB
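
    The same sample database can of course be queried from code as well as from the browser Explorer. Below is a sketch of a plain JDBC connection to it, using the dba/goalie credentials mentioned in the post; note that the driver jar, the exact JDBC URL format and the DUAL-style test query are assumptions to be checked against the NuoDB documentation:

        import java.sql.*;

        public class NuoDBQuery {
            public static void main(String[] args) throws SQLException {
                // Assumed URL shape - verify against the NuoDB docs; "test", "dba" and
                // "goalie" are the sample database and credentials used in the post.
                String url = "jdbc:com.nuodb://localhost/test";
                try (Connection con = DriverManager.getConnection(url, "dba", "goalie");
                     Statement st = con.createStatement();
                     ResultSet rs = st.executeQuery("SELECT 1 AS ok FROM DUAL")) {  // test query; DUAL usage assumed
                    while (rs.next()) {
                        System.out.println("Connected, query returned: " + rs.getInt("ok"));
                    }
                }
            }
        }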

    Read the article

  • Gene Hunt Says:

    - by BizTalk Visionary
    "She's as nervous as a very small nun at a penguin shoot"   "He's got fingers in more pies than a leper on a cookery course" "You so much as belch out of line and I'll have your scrotum on a barbed wire plate" "Let's go play slappyface" "your surrounded by armed barstewards" “Right, get out and find this murdering scum right now!” [pause] “Scratch that, we start 9am sharp tomorrow, it's beer-o-clock.” "So then Cartwright, you're such a good Detective.... Go and Detect me a packet of Garibaldies" "You're not the one who is going to have to knit himself a new arsehole after 25 years of aggressive male love in prison" “A dream for me is Diana Dors and a bottle of chip fat." “A dream for me is Diana Dors and a bottle of chip fat." “They reckon you've got concussion - but personally, I couldn't give a tart's furry cup if half your brains are falling out. Don't ever waltz into my kingdom playing king of the jungle.” “You great... soft... sissy... girlie... nancy... french... bender... Man-United supporting POOF!!” “Drugs eh? What's the point. They make you forget, make you talk funny, make you see things that aren't there. My old grandma got all of that for free when she had a stroke.” “He's Dead! It's quite serious!” “Fanny in the flat...Nice Work” “SoopaDoopa” “Tits in a Jumper!” “Drop your weapons! You are surrounded by armed bastards!” “It's 1973, almost dinnertime. I'm 'avin 'oops!” “Trust the Gene Genie!” “I wanna hump Britt Ekland...What're we gonna do...!” “Was that 'E' and you don't know the rest?! or you going 'Eeee, I Dunno'” “Good Girl! Prostate probe and no jelly. “ “Give over, it's nothing like Spain!” “I'll come over your houses and stamp on all your toys!” “The Wizard will sort it out. It's cos of the wonderful things he does” “Cartwright can jump up and down on his knackers!” “It's not a windup love, he really thinks like this!” “Women! You can't say two words to them” “I was thinking, maybe, a Berni Inn!” “If I wanted a bollocking for drinking too much...!” “Shhhh...hear that...that's the sound of this case being closed! “Chicken!? In a basket!?” “Seems a large quantity of cocaine...” “You probably thought he kept his cock in his keks!” “The tail-end of Rays demotion speech!” “Stephen Warren is gay!?” “You're a smart boy, use your initiative!” “Don't be such a Jessie!” “I find the idea of a bird brushing her teeth...!” “Never been tempted to the Magic talcum powder?” “Make sure she's got nice tits!” “You're more likely to find an ostrich with a plum up it's arse!” “Drink this lot under the table and have a pint on the way home!” “Never be a female Prime Minister!” “Pub? Pub! pub!.....Pub!” “Thou shalt not suck off rent boys!” “The number for the special clinic is on the notice board!” “If me uncle had tits, would he be me auntie!” “Got your vicars in a twist!” “We Done?!” “Your mates got balls...If they were any bigger he'd need a wheelbarrow!” “The Ending - from 'I want to go home' to the end music.”

    Read the article

  • Five Reasons to Attend PLM Summit 2013: The Conference Formerly Known as AGILITY

    - by Terri Hiskey
    As we approach the end of 2012, we are also closing in on the last couple of weeks that Agile customers and prospects can register for the upcoming PLM Summit 2013 for the bargain early bird rate of $195. Register now to secure your spot!

    The Conference Formerly Known as AGILITY... Long-time Agile customers may remember AGILITY, which was Agile's PLM customer conference that was held on an annual basis prior to Oracle's acquisition of Agile in 2007. In February 2012, due to feedback we received from our Agile PLM community, we successfully resurrected the AGILITY conference and renamed it the PLM Summit. The PLM Summit was so well received and well attended that we are doing it again in 2013. This upcoming PLM Summit is being co-located in San Francisco under the overarching banner of the Oracle Value Chain Summit, and will be held alongside several other Oracle customer conferences that cover a range of value chain solutions, including Value Chain Planning, Value Chain Execution, Procurement, Maintenance and Manufacturing. This setup offers PLM attendees the best of all worlds: the opportunity to participate and learn about PLM in smaller, focused sessions by product and by industry, while also giving attendees the chance to see how PLM works together with other critical enterprise applications that address other important aspects of the value chain.

    Top Five Reasons to Attend the PLM Summit 2013. In the spirit of all of the end-of-the-year lists that are currently popping up, here is a list of the top five reasons to attend the PLM Summit, for anyone out there who needs a little extra encouragement to register:

    1. The Best Opportunities for Customer Networking. The PLM Summit offers attendees numerous opportunities to learn and network with fellow Agile users. Customer stories are featured in keynote and breakout presentations, and the schedule allows for plenty of networking time during breakfasts, lunches, breaks and dinners. Customer networking is the number one reason that Agile users attend the PLM Summit. Read what attendees thought of the most recent PLM Summit: "Hearing about the implementation of Agile products from a customer's perspective is invaluable." - Director of Quality Assurance & Regulatory Affairs, leading medical device manufacturer. "Understanding the scope of other companies' projects and the lessons learned made attending this event well worth my time." - Director of Test Engineering, global industrial manufacturer. "The most beneficial thing about attending this event is the opportunity to network with other customers with similar experiences." - Director of Business Process Improvement, leading high technology company. Come to the PLM Summit and play an active role within the PLM community: swap war stories and business cards, connect on LinkedIn and Facebook, share your stories and discuss the sessions from each day. Register now!

    2. It's Educational! The PLM Summit is the premier educational event for anyone in the Agile PLM community. There are nearly 40 PLM-focused, in-depth educational sessions led by Agile PLM experts, customers and partners that will cover a range of specific product and industry-focused topics. Keynotes will give attendees a broad overview of the entire Agile PLM footprint, while sessions will delve deeply into specific product functionality and customer case studies. There is truly something for everyone. Check out the latest agenda for a view of all the sessions.

    3. Visit with the PLM Partner Community. Our partners play a significant and important role within the Agile PLM community. At the PLM Summit, attendees will be able to meet and mingle with several of the top Oracle Agile PLM partners, including: Deloitte, Domain, GoEngineer, Hitachi Consulting, IBM, Kalypso, KPIT Cummins (CPG Solutions), Perception Software, Verdant, Xavor and ZeroWaitState. Go here for a complete list of all the Value Chain Summit sponsors.

    4. See Agile PLM in Action at our Dedicated PLM Demo Pods. At the PLM Summit, attendees will have the chance to see Agile PLM in action at dedicated PLM demo pods, manned by expert members of our Agile PLM team. If you would like to see specific Agile PLM functionality up close, if you have a question on how to extend the scope of your current implementation, or if you want a better understanding of how to leverage Agile PLM to address specific use cases, stop by one of the Agile PLM demo pods and engage the Agile PLM experts on hand at the PLM Summit.

    5. Spend Some Time in Lovely San Francisco. Still on the fence about the upcoming PLM Summit? Remember that it is being held in San Francisco, which is a fantastic city for a getaway. After spending time learning and networking about PLM, take an extra day or two to escape the dreary winter and enjoy the beautiful scenery and the unique activities offered only by the City by the Bay. You will walk away from the conference not only with renewed excitement about Agile PLM, but feeling rejuvenated in general.

    Read the article

  • What Counts For a DBA: Simplicity

    - by Louis Davidson
    Too many computer processes do an apparently simple task in a bizarrely complex way. They remind me of this strip by one of my favorite artists: Rube Goldberg. In order to keep the boss from knowing one was late, a process is devised whereby the cuckoo clock kisses a live cuckoo bird, who then pulls a string, which triggers a hat flinging, which in turn lands on a rod that removes a typewriter cover…and so on. We rely on creating automated processes to keep on top of tasks. DBAs have a lot of tasks to perform: backups, performance tuning, data movement, system monitoring, and of course, avoiding being noticed.  Every day, there are many steps to perform to maintain the database infrastructure, including: checking physical structures, re-indexing tables where needed, backing up the databases, checking those backups, running the ETL, and preparing the daily reports and yes, all of these processes have to complete before you can call it a day, and probably before many others have started that same day. Some of these tasks are just naturally complicated on their own. Other tasks become complicated because the database architecture is excessively rigid, and we often discover during “production testing” that certain processes need to be changed because the written requirements barely resembled the actual customer requirements.   Then, with no time to change that rigid structure, we are forced to heap layer upon layer of code onto the problematic processes. Instead of a slight table change and a new index, we end up with 4 new ETL processes, 20 temp tables, 30 extra queries, and 1000 lines of SQL code.  Report writers then need to build reports and make magical numbers appear from those toxic data structures that are overly complex and probably filled with inconsistent data. What starts out as a collection of fairly simple tasks turns into a Goldbergian nightmare of daily processes that are likely to cause your dinner to be interrupted by the smartphone doing the vibration dance that signifies trouble at the mill. So what to do? Well, if it is at all possible, simplify the problem by either going into the code and refactoring the complex code to simple, or taking all of the processes and simplifying them into small, independent, easily-tested steps.  The former approach usually requires an agreement on changing underlying structures that requires countless mind-numbing meetings; while the latter can generally be done to any complex process without the same frustration or anger, though it will still leave you with lots of steps to complete, the ability to test each step independently will definitely increase the quality of the overall process (and with each step reporting status back, finding an actual problem within the process will be definitely less unpleasant.) We all know the principle behind simplifying a sequence of processes because we learned it in math classes in our early years of attending school, starting with elementary school. In my 4 years (ok, 9 years) of undergraduate work, I remember pretty much one thing from my many math classes that I apply daily to my career as a data architect, data programmer, and as an occasional indentured DBA: “show your work”. This process of showing your work was my first lesson in simplification. Each step in the process was in fact, far simpler than the entire process.  
When you were working an equation that took both sides of 4 sheets of paper, showing your work was important because the teacher could see every step, judge it, and mark it accordingly.  So often I would make an error in the first few lines of a problem which meant that the rest of the work was actually moving me closer to a very wrong answer, no matter how correct the math was in the subsequent steps. Yet, when I got my grade back, I would sometimes be pleasantly surprised. I passed, yet missed every problem on the test. But why? While I got the fact that 1+1=2 wrong in every problem, the teacher could see that I was using the right process. In a computer process, the process is very similar. We take complex processes, show our work by storing intermediate values, and test each step independently. When a process has 100 steps, each step becomes a simple step that is tested and verified, such that there will be 100 places where data is stored, validated, and can be checked off as complete. If you get step 1 of 100 wrong, you can fix it and be confident (that if you did your job of testing the other steps better than the one you had to repair,) that the rest of the process works. If you have 100 steps, and store the state of the process exactly once, the resulting testable chunk of code will be far more complex and finding the error will require checking all 100 steps as one, and usually it would be easier to find a specific needle in a stack of similarly shaped needles.  The goal is to strive for simplicity either in the solution, or at least by simplifying every process down to as many, independent, testable, simple tasks as possible.  For the tasks that really can’t be done completely independently, minimally take those tasks and break them down into simpler steps that can be tested independently.  Like working out division problems longhand, have each step of the larger problem verified and tested.
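
    The essay's "show your work" advice maps naturally onto code: break the job into small named steps, record each step's status as it completes, and a failure then points at exactly one step instead of at the whole run. A schematic Java sketch of that shape (the step names and checks are illustrative, not from the article):

        import java.util.LinkedHashMap;
        import java.util.Map;

        // Each step does one small thing and validates its own result; the printed status
        // is the "shown work" that tells you which step to look at when something breaks.
        public class NightlyProcess {
            interface Step { boolean run(); }

            public static void main(String[] args) {
                Map<String, Step> steps = new LinkedHashMap<>();
                steps.put("check physical structures", () -> true);
                steps.put("reindex tables",            () -> true);
                steps.put("back up databases",         () -> true);
                steps.put("verify backups",            () -> false);  // simulate a failure here
                steps.put("run ETL",                   () -> true);

                for (Map.Entry<String, Step> e : steps.entrySet()) {
                    boolean ok = e.getValue().run();
                    System.out.println((ok ? "OK   " : "FAIL ") + e.getKey());
                    if (!ok) {
                        break;  // stop at the first broken step: all earlier steps are known good
                    }
                }
            }
        }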

    Read the article

  • Blog Buzz - Devoxx 2011

    - by Janice J. Heiss
    Some day I will make it to Devoxx – for now, I'm content to vicariously follow the blogs of attendees and pick up on what's happening. I've been doing more blog "fishing," looking for the best commentary on 2011 Devoxx. There's plenty of food for thought – and the ideas are not half-baked. The bloggers are out in full, offering useful summaries and commentary on Devoxx goings-on. Constantin Partac, a Java developer and a member of Transylvania JUG, a community from Cluj-Napoca/Romania, offers an excellent summary of the Devoxx keynotes. Here's a sample:

    "Oracle Opening Keynote and JDK 7, 8, and 9 Presentation
    • Oracle is committed to Java and wants to provide support for it on any device.
    • JSE 7 for Mac will be released next week.
    • Oracle would like Java developers to be involved in the JCP, to adopt a JSR and to attend local JUG meetings.
    • JEE 7 will be released next year.
    • JEE 7 is focused on cloud integration; some of the features are already implemented in the glassfish 4 development branch.
    • JSE 8 will be released in the summer of 2013 due to "enterprise community request", as they cannot keep pace with an 18-month release cycle.
    • The main features included in JSE 8 are lambda support, project Jigsaw, a new Date/Time API, project Coin++ and added support for sensors.
    JSE 9 probably will focus on some of these features:
    1. self-tuning JVM
    2. improved native language integration
    3. processing enhancements for big data
    4. reification (adding runtime class type info for generic types)
    5. unification of primitive and corresponding object classes
    6. meta-object protocol in order to use types and methods defined in other JVM languages
    7. multi-tenancy
    8. JVM resource management"

    Thanks Constantin! Ivan St. Ivanov, of SAP Labs Bulgaria, also commented on the keynotes with a different focus. He summarizes Henrik Stahl's look ahead to Java SE 8 and JavaFX 3.0; Cameron Purdy on Java EE and the cloud; celebrated Java Champion Josh Bloch on what's good and bad about Java; Mark Reinhold's quick look ahead to Java SE 9; and Brian Goetz on lambdas and default methods in Java SE 8. Here's St. Ivanov's account of Josh Bloch's comments on the pluses of Java:

    "He started with the virtues of the platform. To name a few:
    • Tightly specified language primitives and evaluation order – int is always 32 bits and operations are always executed from left to right, without compilers messing around
    • Dynamic linking – when you change a class, you need to recompile and rebuild just the jar that has it and not the whole application
    • Syntax similarity with C/C++ – most existing developers at that time felt at home
    • Object orientation – it was cool at that time, as functional programming is today
    • It was a statically typed language – helps with faster runtime, better IDE support, etc.
    • No operator overloading – well, I'm not sure why it is good. Scala has it for example and that's why it is far better for defining DSLs. But I will not argue with Josh."

    It's worth checking out St. Ivanov's summary of Bloch's views on what's not so great about Java as well.

    What's Coming in JAX-RS 2.0
    Marek Potociar, Principal Software Engineer at Oracle and currently specification lead of the Java EE RESTful web services API (JAX-RS), blogged on his talk about what's coming in JAX-RS 2.0, scheduled for final release in mid-2012. Here's a taste:

    "Perhaps the most wanted addition to JAX-RS is the Client API, which would complete the JAX-RS story that is currently server-side only. In JAX-RS 2.0 we are adding a completely interface-based and fluent client API that blends in nicely with the existing fluent response builder pattern on the server side. When we started with the client API, the first proposal contained around 30 classes. Thanks to the feedback from our Expert Group we managed to reduce the number of API classes to 14 (2 of them being exceptions)! The resulting API is compact while at the same time we still managed to create an API that reflects the method invocation context flow (e.g. once you decide on the target URI and start setting headers on the request, your IDE will not try to offer you a URI setter in the code completion). This is a subtle but very important usability aspect of an API…"

    Obviously, Devoxx is a great Java conference, one that hits this year at a time when much is brewing in the platform and beginning to be anticipated.
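
    For reference, the client API that eventually shipped in JAX-RS 2.0 keeps exactly that fluent, interface-based shape (javax.ws.rs.client); a short sketch of a GET with it, where the target URL is a placeholder:

        import javax.ws.rs.client.Client;
        import javax.ws.rs.client.ClientBuilder;
        import javax.ws.rs.core.MediaType;

        // Fluent JAX-RS 2.0 client usage: build a client, point it at a resource,
        // choose the media type, and invoke the HTTP method.
        public class RestClientDemo {
            public static void main(String[] args) {
                Client client = ClientBuilder.newClient();
                String body = client.target("http://example.com/api/items")   // placeholder URL
                                    .path("42")                               // -> /api/items/42
                                    .request(MediaType.APPLICATION_JSON)
                                    .get(String.class);                       // blocks and returns the entity
                System.out.println(body);
                client.close();
            }
        }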

    Read the article

  • PASS: 2013 Summit Location

    - by Bill Graziano
    HQ recently posted a brief update on our search for a location for 2013.  It includes links to posts by four Board members and two community members. I’d like to add my thoughts to the mix and ask you a question.  But I can’t give you a real understanding without telling you some history first. So far we’ve had the Summit in Chicago, San Francisco, Orlando, Dallas, Denver and Seattle.  Each has a little different feel and distinct memories.  I enjoyed getting drinks by the pool in Orlando after the sessions ended.  I didn’t like that our location in Dallas was so far away from all the nightlife.  Denver was in downtown but we had real challenges with hotels.  I enjoyed the different locations.  I always enjoyed the announcement during the third keynote with the location of the next Summit. There are two big events that impacted my thinking on the Summit location.  The first was our transition to the new management company in early 2007.  The event that September in Denver was put on with a six month planning cycle by a brand new headquarters staff.  It wasn’t perfect but came off much better than I had dared to hope.  It also moved us out of the cookie cutter conferences that we used to do into a model where we have a lot more control.  I think you’ll all agree that the production values of our last few Summits have been fantastic.  That Summit also led to our changing relationship with Microsoft.  Microsoft holds two seats on the PASS Board.  All the PASS Board members face the same challenge: we all have full-time jobs and PASS comes in second place professionally (or sometimes further back).  Starting in 2008 we were assigned a liaison from Microsoft that had a much larger block of time to coordinate with us.  That changed everything between PASS and Microsoft.  Suddenly we were talking to product marketing, Microsoft PR, their event team, the Tech*Ed team, the education division, their user group team and their field sales team – locally and internationally.  We strengthened our relationship with CSS, SQLCAT and the engineering teams.  We had exposure at the executive level that we’d never had before.  And their level of participation at the Summit changed from under 100 people to 400-500 people.  I think those 400+ Microsoft employees have value at a conference on Microsoft SQL Server.  For the first time, Seattle had a real competitive advantage over other cities. I’m one that looked very hard at staying in Seattle for a long, long time.  I think those Microsoft engineers have value to our attendees.  I think the increased support that Microsoft can provide when we’re in Seattle has value to our attendees.  But that doesn’t tell the whole story.  There’s a significant (and vocal!) percentage of our membership that wants the Summit outside Seattle.  Post-2007 PASS doesn’t know what it’s like to have a Summit outside of Seattle.  I think until we have a Summit in another city we won’t really know the trade-offs. I think a model where we move every third or every other year is interesting.  But until we have another Summit outside Seattle and we can evaluate the logistics and how important it is to have depth and variety in our Microsoft participation we won’t really know. Another benefit that comes with a move is variety or diversity.  I learn more when I’m exposed to new things and new people.  I believe that moving the Summit will give a different set of people an opportunity to attend. 
Grant Fritchey writes “It seems that the board is leaning, extremely heavily, towards making it a permanent fixture in Seattle.”  I don’t believe that’s true.  I know there was discussion of that earlier but I don’t believe it’s true now. And that brings me to my question.  Do we announce the city now or do we wait until the 2012 Summit?  I’m happy to announce Seattle vs. not-Seattle as soon as we sign the contract.  But I’d like to leave the actual city announcement until the 2011 Summit.  I like the drama and mystery of it.  I also like that it doesn’t give you a reason to skip a Summit and wait for the next one if it’s closer or back in Seattle.  The other side of the coin is that your planning is easier if you know where it is.  What do you think?

    Read the article

  • The program is executing properly in Dev-C++ but is giving a problem on Linux. The movement is becoming

    - by srinija
    #include<stdio.h> #include<GL/glut.h> GLfloat v[3][24]={{100.0,300.0,350.0,50.0,100.0,120.0,120.0,100.0,260.0,280.0, 280.0,260.0,140.0,160.0,160.0,140.0,180.0,200.0,200.0,180.0, 220.0,240.0,240.0,220.0},{100.0,100.0,200.0,200.0,160.0, 160.0,180.0,180.0,160.0,160.0,180.0,180.0,160.0,160.0,180.0, 180.0,160.0,160.0,180.0,180.0,160.0,160.0,180.0,180.0}, {1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0, 1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0}}; GLfloat v1[3][16]={{50.0,350.0,350.0,50.0,100.0,300.0,300.0,100.0,125.0,175.0, 175.0,125.0,225.0,275.0,275.0,225.0},{200.0,200.0,210.0, 210.0,210.0,210.0,240.0,240.0,240.0,240.0,310.0,310.0,240.0, 240.0,310.0,310.0},{1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0, 1.0,1.0,1.0,1.0,1.0,1.0}}; GLfloat colors[4][3]={{0.0,0.0,1.0},{0.9961,0.9961,0.65625},{1.0,0.0,1.0}, {1.0,.0,1.0}}; static float q,w,e; static float fq,fw,fe; static GLfloat wa=0,wb=0,wc=0,ba,bb,bc; int flag; void myinit(void) { glClearColor(0.506,.7,1,0.0); glPointSize(2.0); glLoadIdentity(); glOrtho(0.0,499.0,0.0,499.0,-300.0,300.0); } void draw_top_boxes(GLint i,GLint j) { glColor3f(1.0,0.0,0.0); glBegin(GL_POLYGON); glColor3fv(colors[j]); // to draw the boat glVertex2f(v1[0][i+0],v1[1][i+0]); glColor3fv(colors[j+1]); glVertex2f(v1[0][i+1],v1[1][i+1]); glColor3fv(colors[j+2]); glVertex2f(v1[0][i+2],v1[1][i+2]); glColor3fv(colors[j+3]); glVertex2f(v1[0][i+3],v1[1][i+3]); glEnd(); } void draw_polygon(GLint i) { glBegin(GL_POLYGON); // to draw the boat glColor3f(0.0,0.0,0.0); glColor3fv(colors[0]); glVertex2f(v[0][i+0],v[1][i+0]); glColor3fv(colors[1]); glVertex2f(v[0][i+1],v[1][i+1]); glColor3fv(colors[2]); glVertex2f(v[0][i+2],v[1][i+2]); glColor3fv(colors[3]); glVertex2f(v[0][i+3],v[1][i+3]); glEnd(); } void draw_boat() { draw_polygon(0); draw_polygon(4); draw_polygon(8); draw_polygon(12); draw_polygon(16); draw_polygon(20); draw_top_boxes(0,0); draw_top_boxes(4,0); draw_top_boxes(8,0); draw_top_boxes(12,0); glFlush(); glPopMatrix(); glPopMatrix(); } void draw_water() { GLfloat i; GLfloat x=0,y=103,j=0; GLfloat k; glPushMatrix(); glTranslatef(wa,wb,wc); glPushMatrix(); glColor3f(0,0,1); for(k=y;k>0;k-=6) { for(i=1;i<30;i++) { glBegin(GL_LINES); glVertex2f(j,k); glVertex2f(j+10,k); glEnd(); j=j+20; } j=0; } glPopMatrix(); glPopMatrix(); } void draw_fishes() { glPushMatrix(); glTranslatef(fq,12.0,fe); glPushMatrix(); glColor3f(.99609375,0.2578125,0.2578125); glBegin(GL_TRIANGLES); glVertex2f(100,80); glVertex2f(100,60); glVertex2f(85,70); glEnd(); glColor3f(.99609375,0.2578125,0.2578125); glBegin(GL_TRIANGLES); glVertex2f(100,70); glVertex2f(110,75); glVertex2f(110,65); glEnd(); glColor3f(0,0,0); glBegin(GL_POINTS); glVertex2f(90,71); glEnd(); glColor3f(.99609375,0.2578125,0.2578125); glBegin(GL_TRIANGLES); glVertex2f(200,80); glVertex2f(200,60); glVertex2f(185,70); glEnd(); glColor3f(.99609375,0.2578125,0.2578125); glBegin(GL_TRIANGLES); glVertex2f(200,70); glVertex2f(210,75); glVertex2f(210,65); glEnd(); glColor3f(0,0,0); glBegin(GL_POINTS); glVertex2f(190,71); glEnd(); glPopMatrix(); glPopMatrix(); glFlush(); } void draw_cloud() { GLfloat m=100,n=400,o=10; for(int i=0;i<7;i++) { glPushMatrix(); glColor3f(1.0,1.0,1.0); if(i==1) glTranslated(125,415,10); else if(i==3||i==5) glTranslated(m,n+5,o); else glTranslated(m,n,o); glutSolidSphere(20.0,5000,150); glPopMatrix(); m+=10; } } void draw_square() { glColor3f(0,0.5,0.996); glBegin(GL_POLYGON); glVertex2f(0,0); glVertex2f(1000,0); glVertex2f(0,300); glVertex2f(1000,300); glEnd(); glFlush(); } void draw_brotate() { glPushMatrix(); 
glColor3f(0.96,0.5,0.25); //to draw body of the bird glTranslated(300,400,10); glScalef(3,1,1); glutSolidSphere(6,50000,15); glPopMatrix(); glPushMatrix(); glTranslated(323,400,10); glutSolidSphere(5,50000,15); glPopMatrix(); glColor3f(0,0,0); glBegin(GL_POINTS); glVertex2f(325,401); glEnd(); glColor3f(0.96,0.5,0.25); //to draw wings glBegin(GL_LINES); glVertex2f(294,394); glVertex2f(286,389); glEnd(); glBegin(GL_LINES); glVertex2f(286,389); glVertex2f(295,391); glEnd(); glBegin(GL_LINES); glVertex2f(295,391); glVertex2f(285,385); glEnd(); glBegin(GL_LINES); glVertex2f(285,385); glVertex2f(309,395); glEnd(); glBegin(GL_LINES); glVertex2f(294,406); glVertex2f(286,411); glEnd(); glBegin(GL_LINES); glVertex2f(286,411); glVertex2f(295,409); glEnd(); glBegin(GL_LINES); glVertex2f(295,409); glVertex2f(285,415); glEnd(); glBegin(GL_LINES); glVertex2f(285,415); glVertex2f(309,406); glEnd(); glColor3f(0.96,0.5,0.25); } void draw_bird() { GLfloat x=200,y=400,z=10; draw_brotate(); glBegin(GL_LINES); //draw legs of the bird glVertex2f(285,402); glVertex2f(275,402); glEnd(); glBegin(GL_LINES); glVertex2f(285,398); glVertex2f(275,398); glEnd(); glBegin(GL_LINES); glVertex2f(275,402); glVertex2f(270,405); glEnd(); glBegin(GL_LINES); glVertex2f(275,402); glVertex2f(270,398); glEnd(); glBegin(GL_LINES); glVertex2f(275,398); glVertex2f(273,400); glEnd(); glBegin(GL_LINES); glVertex2f(275,398); glVertex2f(270,395); glEnd(); glBegin(GL_LINES); glVertex2f(323,405); glVertex2f(323,407); glEnd(); glPushMatrix(); glTranslatef(323,409,10); glutSolidSphere(2,200,20); glPopMatrix(); glBegin(GL_TRIANGLES); glVertex2f(328,400); glVertex2f(331,397); glVertex2f(327,398.5); glEnd(); glFlush(); } void drawstars() { glColor3f(1.0,1.0,1.0); glBegin(GL_POINTS); glVertex3f(300.0,400.0,10.0); glVertex3f(200,400.0,10.0); glVertex3f(150,450.0,10.0); glVertex3f(100,470.0,10.0); glVertex3f(50,450.0,10.0); glVertex3f(50,350.0,10.0); glVertex3f(90,365.0,10.0); glVertex3f(350,450.0,10.0); glVertex3f(275,470.0,10.0); glVertex3f(280,430.0,10.0); glVertex3f(250,400.0,10.0); glVertex3f(450,450.0,10.0); glVertex3f(430,430.0,10.0); glVertex3f(430,470.0,10.0); glVertex3f(300,450.0,10.0); glVertex3f(265,380.0,10.0); glVertex3f(235,450.0,10.0); glEnd(); } void draw_all() { glClear(GL_COLOR_BUFFER_BIT); if(flag==0) { glDisable(GL_LIGHTING); //immp one draw_square(); draw_cloud(); glClearColor(0.506,.7,1,0.0); glTranslatef(q,w,e); glPushMatrix(); glColor3f(1.0,0.0,0.0); draw_boat(); draw_fishes(); glPushMatrix(); glColor3f(1.0,1.0,0.0); glTranslated(400,400,10); glutSolidSphere(20.0,5000,150); glPopMatrix(); } if(flag==1) { glDisable(GL_LIGHTING); //imp one draw_square(); draw_cloud(); glClearColor(0.9960,0.7070,0.3164,0.0); glTranslatef(q,w,e); glPushMatrix(); glColor3f(1.0,0.0,0.0); draw_boat(); draw_fishes(); glPushMatrix(); glColor3f(1.0,1.0,0.0); glTranslated(400,400,10); glutSolidSphere(20.0,500,100); glPopMatrix(); } if(flag==2) { // just try and change values in these arrays, specially the position array drawstars(); glEnable(GL_LIGHTING); glEnable(GL_LIGHT0); // GLfloat emission[]={0.1,0.1,0.1,0.0}; GLfloat diffuse[] = { 0.40, 0.40,0.40, 1.0 }; GLfloat ambiance[] = { 0.5, 0.5,0.5, 1.0 }; GLfloat specular[] = { 1.3, 1.3,.3, 1.0 }; GLfloat intensity[]={500.0}; GLfloat position[] = { 10,30,-30,1.0 }; glLightfv (GL_LIGHT0, GL_POSITION, position); glLightfv (GL_LIGHT0, GL_DIFFUSE,diffuse); glLightfv (GL_LIGHT0, GL_AMBIENT,ambiance); glLightModeli(GL_LIGHT_MODEL_LOCAL_VIEWER,GL_TRUE); glLightfv (GL_LIGHT0, GL_SPECULAR,specular); glLightfv 
(GL_LIGHT0, GL_INTENSITY,intensity); glColor3f(0,0.5,0.996); glBegin(GL_POLYGON); glVertex2f(0,0); glVertex2f(1000,0); glVertex2f(0,150); glVertex2f(1000,150); glEnd(); glTranslatef(q,w,e); glPushMatrix(); glColor3f(1.0,0.0,0.0); draw_boat(); draw_fishes(); glDisable(GL_LIGHTING); glDisable(GL_LIGHT0); draw_cloud(); glClearColor(0.0,0.0,0.0,0.0); glPushMatrix(); glColor3f(1.0,1.0,1.0); glTranslated(400,400,10); glutSolidSphere(20.0,500,100); glPopMatrix(); glColor3f(1.0,1.0,1.0); glBegin(GL_POINTS); glVertex3f(300.0,400.0,10.0); glEnd(); } glPushMatrix(); glTranslatef(ba,bb,bc); glPushMatrix(); draw_bird(); glPopMatrix(); glPopMatrix(); GLfloat i; glPushMatrix(); GLfloat x=0,y=100,j=0; int k; //draw_water(); Sleep(60); q+=5; fq-=3.5; if(q>=440.0) //470 q=-390.0; //400 if(fq<=-300) //500 fq=400.0; //400 wa-=1; if(wa<=(-20)) wa=-0.5; ba+=6; if(ba>=500) ba=-400; glFlush(); glutSwapBuffers(); } void display(void) { draw_all(); } void color_menu(int id) { switch(id) { case 1: flag=0;break; case 2: flag=1;break; case 3: flag=2;break; case 4: exit(0); break; } glutPostRedisplay(); } void main_menu(int id) { switch(id) { case 1: break; case 2:exit(0);break; glutPostRedisplay(); } } int main(int argc,char **argv) { int sub_menu; glutInit(&argc,argv); glutInitDisplayMode(GLUT_RGB|GLUT_DOUBLE); glutInitWindowSize(1000,1000); glutInitWindowPosition(0,0); glutCreateWindow("Ship"); sub_menu=glutCreateMenu(color_menu); glutAddMenuEntry("Morning",1); glutAddMenuEntry("Evening",2); glutAddMenuEntry("Night",3); glutAddMenuEntry("Quit",4); glutCreateMenu(main_menu); glutAddSubMenu("View",sub_menu); glutAddMenuEntry("Quit",2); glutAttachMenu(GLUT_RIGHT_BUTTON); glutDisplayFunc(display); glutIdleFunc(display); myinit(); glutMainLoop(); glFlush(); }

    Read the article

  • This code is working properly in Dev-C++, but on the Linux platform it is giving a problem with the moveme

    - by srinija
    #include<stdio.h> #include<GL/glut.h> #include<stdlib.h> GLfloat v[3][24]={{100.0,300.0,350.0,50.0,100.0,120.0,120.0,100.0,260.0,280.0, 280.0,260.0,140.0,160.0,160.0,140.0,180.0,200.0,200.0,180.0, 220.0,240.0,240.0,220.0},{100.0,100.0,200.0,200.0,160.0, 160.0,180.0,180.0,160.0,160.0,180.0,180.0,160.0,160.0,180.0, 180.0,160.0,160.0,180.0,180.0,160.0,160.0,180.0,180.0}, {1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0, 1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0}}; GLfloat v1[3][16]={{50.0,350.0,350.0,50.0,100.0,300.0,300.0,100.0,125.0,175.0, 175.0,125.0,225.0,275.0,275.0,225.0},{200.0,200.0,210.0, 210.0,210.0,210.0,240.0,240.0,240.0,240.0,310.0,310.0,240.0, 240.0,310.0,310.0},{1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0, 1.0,1.0,1.0,1.0,1.0,1.0}}; GLfloat colors[4][3]={{0.0,0.0,1.0},{0.9961,0.9961,0.65625},{1.0,0.0,1.0}, {1.0,.0,1.0}}; static float q,w,e; static float fq,fw,fe; static GLfloat wa=0,wb=0,wc=0,ba,bb,bc; int flag; void myinit(void) { glClearColor(0.506,.7,1,0.0); glPointSize(2.0); glLoadIdentity(); glOrtho(0.0,499.0,0.0,499.0,-300.0,300.0); } void draw_top_boxes(GLint i,GLint j) { glColor3f(1.0,0.0,0.0); glBegin(GL_POLYGON); glColor3fv(colors[j]); // to draw the boat glVertex2f(v1[0][i+0],v1[1][i+0]); glColor3fv(colors[j+1]); glVertex2f(v1[0][i+1],v1[1][i+1]); glColor3fv(colors[j+2]); glVertex2f(v1[0][i+2],v1[1][i+2]); glColor3fv(colors[j+3]); glVertex2f(v1[0][i+3],v1[1][i+3]); glEnd(); } void draw_polygon(GLint i) { glBegin(GL_POLYGON); // to draw the boat glColor3f(0.0,0.0,0.0); glColor3fv(colors[0]); glVertex2f(v[0][i+0],v[1][i+0]); glColor3fv(colors[1]); glVertex2f(v[0][i+1],v[1][i+1]); glColor3fv(colors[2]); glVertex2f(v[0][i+2],v[1][i+2]); glColor3fv(colors[3]); glVertex2f(v[0][i+3],v[1][i+3]); glEnd(); } void draw_boat() { draw_polygon(0); draw_polygon(4); draw_polygon(8); draw_polygon(12); draw_polygon(16); draw_polygon(20); draw_top_boxes(0,0); draw_top_boxes(4,0); draw_top_boxes(8,0); draw_top_boxes(12,0); glFlush(); glPopMatrix(); glPopMatrix(); } void draw_water() { GLfloat i; GLfloat x=0,y=103,j=0; GLfloat k; glPushMatrix(); glTranslatef(wa,wb,wc); glPushMatrix(); glColor3f(0,0,1); for(k=y;k>0;k-=6) { for(i=1;i<30;i++) { glBegin(GL_LINES); glVertex2f(j,k); glVertex2f(j+10,k); glEnd(); j=j+20; } j=0; } glPopMatrix(); glPopMatrix(); } void draw_fishes() { glPushMatrix(); glTranslatef(fq,12.0,fe); glPushMatrix(); glColor3f(.99609375,0.2578125,0.2578125); glBegin(GL_TRIANGLES); glVertex2f(100,80); glVertex2f(100,60); glVertex2f(85,70); glEnd(); glColor3f(.99609375,0.2578125,0.2578125); glBegin(GL_TRIANGLES); glVertex2f(100,70); glVertex2f(110,75); glVertex2f(110,65); glEnd(); glColor3f(0,0,0); glBegin(GL_POINTS); glVertex2f(90,71); glEnd(); glColor3f(.99609375,0.2578125,0.2578125); glBegin(GL_TRIANGLES); glVertex2f(200,80); glVertex2f(200,60); glVertex2f(185,70); glEnd(); glColor3f(.99609375,0.2578125,0.2578125); glBegin(GL_TRIANGLES); glVertex2f(200,70); glVertex2f(210,75); glVertex2f(210,65); glEnd(); glColor3f(0,0,0); glBegin(GL_POINTS); glVertex2f(190,71); glEnd(); glPopMatrix(); glPopMatrix(); glFlush(); } void draw_cloud() { GLfloat m=100,n=400,o=10; for(int i=0;i<7;i++) { glPushMatrix(); glColor3f(1.0,1.0,1.0); if(i==1) glTranslated(125,415,10); else if(i==3||i==5) glTranslated(m,n+5,o); else glTranslated(m,n,o); glutSolidSphere(20.0,5000,150); glPopMatrix(); m+=10; } } void draw_square() { glColor3f(0,0.5,0.996); glBegin(GL_POLYGON); glVertex2f(0,0); glVertex2f(1000,0); glVertex2f(0,300); glVertex2f(1000,300); glEnd(); glFlush(); } void draw_brotate() { 
glPushMatrix(); glColor3f(0.96,0.5,0.25); //to draw body of the bird glTranslated(300,400,10); glScalef(3,1,1); glutSolidSphere(6,50000,15); glPopMatrix(); glPushMatrix(); glTranslated(323,400,10); glutSolidSphere(5,50000,15); glPopMatrix(); glColor3f(0,0,0); glBegin(GL_POINTS); glVertex2f(325,401); glEnd(); glColor3f(0.96,0.5,0.25); //to draw wings glBegin(GL_LINES); glVertex2f(294,394); glVertex2f(286,389); glEnd(); glBegin(GL_LINES); glVertex2f(286,389); glVertex2f(295,391); glEnd(); glBegin(GL_LINES); glVertex2f(295,391); glVertex2f(285,385); glEnd(); glBegin(GL_LINES); glVertex2f(285,385); glVertex2f(309,395); glEnd(); glBegin(GL_LINES); glVertex2f(294,406); glVertex2f(286,411); glEnd(); glBegin(GL_LINES); glVertex2f(286,411); glVertex2f(295,409); glEnd(); glBegin(GL_LINES); glVertex2f(295,409); glVertex2f(285,415); glEnd(); glBegin(GL_LINES); glVertex2f(285,415); glVertex2f(309,406); glEnd(); glColor3f(0.96,0.5,0.25); } void draw_bird() { GLfloat x=200,y=400,z=10; draw_brotate(); glBegin(GL_LINES); //draw legs of the bird glVertex2f(285,402); glVertex2f(275,402); glEnd(); glBegin(GL_LINES); glVertex2f(285,398); glVertex2f(275,398); glEnd(); glBegin(GL_LINES); glVertex2f(275,402); glVertex2f(270,405); glEnd(); glBegin(GL_LINES); glVertex2f(275,402); glVertex2f(270,398); glEnd(); glBegin(GL_LINES); glVertex2f(275,398); glVertex2f(273,400); glEnd(); glBegin(GL_LINES); glVertex2f(275,398); glVertex2f(270,395); glEnd(); glBegin(GL_LINES); glVertex2f(323,405); glVertex2f(323,407); glEnd(); glPushMatrix(); glTranslatef(323,409,10); glutSolidSphere(2,200,20); glPopMatrix(); glBegin(GL_TRIANGLES); glVertex2f(328,400); glVertex2f(331,397); glVertex2f(327,398.5); glEnd(); glFlush(); } void drawstars() { glColor3f(1.0,1.0,1.0); glBegin(GL_POINTS); glVertex3f(300.0,400.0,10.0); glVertex3f(200,400.0,10.0); glVertex3f(150,450.0,10.0); glVertex3f(100,470.0,10.0); glVertex3f(50,450.0,10.0); glVertex3f(50,350.0,10.0); glVertex3f(90,365.0,10.0); glVertex3f(350,450.0,10.0); glVertex3f(275,470.0,10.0); glVertex3f(280,430.0,10.0); glVertex3f(250,400.0,10.0); glVertex3f(450,450.0,10.0); glVertex3f(430,430.0,10.0); glVertex3f(430,470.0,10.0); glVertex3f(300,450.0,10.0); glVertex3f(265,380.0,10.0); glVertex3f(235,450.0,10.0); glEnd(); } void draw_all() { glClear(GL_COLOR_BUFFER_BIT); if(flag==0) { glDisable(GL_LIGHTING); //immp one draw_square(); draw_cloud(); glClearColor(0.506,.7,1,0.0); glTranslatef(q,w,e); glPushMatrix(); glColor3f(1.0,0.0,0.0); draw_boat(); draw_fishes(); glPushMatrix(); glColor3f(1.0,1.0,0.0); glTranslated(400,400,10); glutSolidSphere(20.0,5000,150); glPopMatrix(); } if(flag==1) { glDisable(GL_LIGHTING); //imp one draw_square(); draw_cloud(); glClearColor(0.9960,0.7070,0.3164,0.0); glTranslatef(q,w,e); glPushMatrix(); glColor3f(1.0,0.0,0.0); draw_boat(); draw_fishes(); glPushMatrix(); glColor3f(1.0,1.0,0.0); glTranslated(400,400,10); glutSolidSphere(20.0,500,100); glPopMatrix(); } if(flag==2) { // just try and change values in these arrays, specially the position array drawstars(); glEnable(GL_LIGHTING); glEnable(GL_LIGHT0); // GLfloat emission[]={0.1,0.1,0.1,0.0}; GLfloat diffuse[] = { 0.40, 0.40,0.40, 1.0 }; GLfloat ambiance[] = { 0.5, 0.5,0.5, 1.0 }; GLfloat specular[] = { 1.3, 1.3,.3, 1.0 }; GLfloat intensity[]={500.0}; GLfloat position[] = { 10,30,-30,1.0 }; glLightfv (GL_LIGHT0, GL_POSITION, position); glLightfv (GL_LIGHT0, GL_DIFFUSE,diffuse); glLightfv (GL_LIGHT0, GL_AMBIENT,ambiance); glLightModeli(GL_LIGHT_MODEL_LOCAL_VIEWER,GL_TRUE); glLightfv (GL_LIGHT0, 
GL_SPECULAR,specular); glLightfv (GL_LIGHT0, GL_INTENSITY,intensity); glColor3f(0,0.5,0.996); glBegin(GL_POLYGON); glVertex2f(0,0); glVertex2f(1000,0); glVertex2f(0,150); glVertex2f(1000,150); glEnd(); glTranslatef(q,w,e); glPushMatrix(); glColor3f(1.0,0.0,0.0); draw_boat(); draw_fishes(); glDisable(GL_LIGHTING); glDisable(GL_LIGHT0); draw_cloud(); glClearColor(0.0,0.0,0.0,0.0); glPushMatrix(); glColor3f(1.0,1.0,1.0); glTranslated(400,400,10); glutSolidSphere(20.0,500,100); glPopMatrix(); glColor3f(1.0,1.0,1.0); glBegin(GL_POINTS); glVertex3f(300.0,400.0,10.0); glEnd(); } glPushMatrix(); glTranslatef(ba,bb,bc); glPushMatrix(); draw_bird(); glPopMatrix(); glPopMatrix(); GLfloat i; glPushMatrix(); GLfloat x=0,y=100,j=0; int k; //draw_water(); q+=25; fq-=3.5; if(q>=440.0) //470 q=-390.0; //400 if(fq<=-300) //500 fq=400.0; //400 wa-=1; if(wa<=(-20)) wa=-0.5; ba+=6; if(ba>=500) ba=-400; glFlush(); glutSwapBuffers(); } void display(void) { draw_all(); } void color_menu(int id) { switch(id) { case 1: flag=0;break; case 2: flag=1;break; case 3: flag=2;break; case 4: exit(0); break; } glutPostRedisplay(); } void main_menu(int id) { switch(id) { case 1: break; case 2:exit(0);break; glutPostRedisplay(); } } int main(int argc,char **argv) { int sub_menu; glutInit(&argc,argv); glutInitDisplayMode(GLUT_RGB|GLUT_DOUBLE); glutInitWindowSize(1000,1000); glutInitWindowPosition(0,0); glutCreateWindow("Ship"); sub_menu=glutCreateMenu(color_menu); glutAddMenuEntry("Morning",1); glutAddMenuEntry("Evening",2); glutAddMenuEntry("Night",3); glutAddMenuEntry("Quit",4); glutCreateMenu(main_menu); glutAddSubMenu("View",sub_menu); glutAddMenuEntry("Quit",2); glutAttachMenu(GLUT_RIGHT_BUTTON); glutDisplayFunc(display); glutIdleFunc(display); myinit(); glutMainLoop(); glFlush(); }

    Read the article

  • Gridview get image from JSON using AsyncTask

    - by kongkea
    This project I've done with image in my drawable but now I want to get image url from JSON by using Asynctask and display it. and I make php that provide a json string like below. I want to get path of image(url) by using AsyncTask from JSON. I want to use data from json instead of public mThumbId = {...}; {"count":"28","data": [{"id":"1", "first_name":"man", "last_name":"woman", "username":"man", "password":"4f70432e636970de9929bcc6f1b72412", "email":"[email protected]", "url":"http://vulcan.wr.usgs.gov/Imgs/Jpg/MSH/Images/MSH64_aerial_view_st_helens_from_NE_09-64_med.jpg"}, {"id":"2", "first_name":"first", "last_name":"Last Name", "username":"user", "password":"1a1dc91c907325c69271ddf0c944bc72", "email":"[email protected]", "url":"http://www.danheller.com/images/California/Marin/Scenics/bird-view-big.jpg"}, {"id":"3", "first_name":"first", "last_name":"Last Name", "username":"user", "password":"1a1dc91c907325c69271ddf0c944bc72", "email":"0", "url":"http://www.hermes.net.au/bodhi/images/view/large/view_03.jpg"}]} AndroidGridLayoutActivity GridView gridView = (GridView) findViewById(R.id.grid_view); gridView.setAdapter(new ImageAdapter(this)); gridView.setOnItemClickListener(new OnItemClickListener() { public void onItemClick(AdapterView<?> parent, View v, int position, long id) { Intent i = new Intent(getApplicationContext(), FullImageActivity.class); i.putExtra("id", position); startActivity(i); } }); ImageAdapter public class ImageAdapter extends BaseAdapter { private Context mContext; // Keep all Images in array public Integer[] mThumbIds = { R.drawable.pic_1, R.drawable.pic_2, R.drawable.pic_3, R.drawable.pic_4, R.drawable.pic_5, R.drawable.pic_6, R.drawable.pic_7, R.drawable.pic_8, R.drawable.pic_9, R.drawable.pic_10, R.drawable.pic_11, R.drawable.pic_12, R.drawable.pic_13, R.drawable.pic_14, R.drawable.pic_15 }; // Constructor public ImageAdapter(Context c){ mContext = c; } public int getCount() { return mThumbIds.length; } public Object getItem(int position) { return mThumbIds[position]; } public long getItemId(int position) { return 0; } public View getView(int position, View convertView, ViewGroup parent) { ImageView imageView = new ImageView(mContext); imageView.setImageResource(mThumbIds[position]); imageView.setScaleType(ImageView.ScaleType.CENTER_CROP); imageView.setLayoutParams(new GridView.LayoutParams(70, 70)); return imageView; } } FullImageActivity Intent i = getIntent(); int position = i.getExtras().getInt("id"); ImageAdapter imageAdapter = new ImageAdapter(this); ImageView imageView = (ImageView) findViewById(R.id.full_image_view); imageView.setImageResource(imageAdapter.mThumbIds[position]);

    Read the article

  • SQL SERVER – Extending SQL Azure with Azure worker role – Guest Post by Paras Doshi

    - by pinaldave
    This is guest post by Paras Doshi. Paras Doshi is a research Intern at SolidQ.com and a Microsoft student partner. He is currently working in the domain of SQL Azure. SQL Azure is nothing but a SQL server in the cloud. SQL Azure provides benefits such as on demand rapid provisioning, cost-effective scalability, high availability and reduced management overhead. To see an introduction on SQL Azure, check out the post by Pinal here In this article, we are going to discuss how to extend SQL Azure with the Azure worker role. In other words, we will attempt to write a custom code and host it in the Azure worker role; the aim is to add some features that are not available with SQL Azure currently or features that need to be customized for flexibility. This way we extend the SQL Azure capability by building some solutions that run on Azure as worker roles. To understand Azure worker role, think of it as a windows service in cloud. Azure worker role can perform background processes, and to handle processes such as synchronization and backup, it becomes our ideal tool. First, we will focus on writing a worker role code that synchronizes SQL Azure databases. Before we do so, let’s see some scenarios in which synchronization between SQL Azure databases is beneficial: scaling out access over multiple databases enables us to handle workload efficiently As of now, SQL Azure database can be hosted in one of any six datacenters. By synchronizing databases located in different data centers, one can extend the data by enabling access to geographically distributed data Let us see some scenarios in which SQL server to SQL Azure database synchronization is beneficial To backup SQL Azure database on local infrastructure Rather than investing in local infrastructure for increased workloads, such workloads could be handled by cloud Ability to extend data to different datacenters located across the world to enable efficient data access from remote locations Now, let us develop cloud-based app that synchronizes SQL Azure databases. For an Introduction to developing cloud based apps, click here Now, in this article, I aim to provide a bird’s eye view of how a code that synchronizes SQL Azure databases look like and then list resources that can help you develop the solution from scratch. Now, if you newly add a worker role to the cloud-based project, this is how the code will look like. (Note: I have added comments to the skeleton code to point out the modifications that will be required in the code to carry out the SQL Azure synchronization. Note the placement of Setup() and Sync() function.) Click here (http://parasdoshi1989.files.wordpress.com/2011/06/code-snippet-1-for-extending-sql-azure-with-azure-worker-role1.pdf ) Enabling SQL Azure databases synchronization through sync framework is a two-step process. In the first step, the database is provisioned and sync framework creates tracking tables, stored procedures, triggers, and tables to store metadata to enable synchronization. This is one time step. The code for the same is put in the setup() function which is called once when the worker role starts. Now, the second step is continuous (or on demand) synchronization of SQL Azure databases by propagating changes between databases. This is done on a continuous basis by calling the sync() function in the while loop. The code logic to synchronize changes between SQL Azure databases should be put in the sync() function. Discussing the coding part step by step is out of the scope of this article. 
Therefore, let me suggest you a resource, which is given here. Also, note that before you start developing the code, you will need to install SYNC framework 2.1 SDK (download here). Further, you will reference some libraries before you start coding. Details regarding the same are available in the article that I just pointed to. You will be charged for data transfers if the databases are not in the same datacenter. For pricing information, go here Currently, a tool named DATA SYNC, which is built on top of sync framework, is available in CTP that allows SQL Azure <-> SQL server and SQL Azure <-> SQL Azure synchronization (without writing single line of code); however, in some cases, the custom code shown in this blogpost provides flexibility that is not available with Data SYNC. For instance, filtering is not supported in the SQL Azure DATA SYNC CTP2; if you wish to have such a functionality now, then you have the option of developing a custom code using SYNC Framework. Now, this code can be easily extended to synchronize at some schedule. Let us say we want the databases to get synchronized every day at 10:00 pm. This is what the code will look like now: (http://parasdoshi1989.files.wordpress.com/2011/06/code-snippet-2-for-extending-sql-azure-with-azure-worker-role.pdf) Don’t you think that by writing such a code, we are imitating the functionality provided by the SQL server agent for a SQL server? Think about it. We are scheduling our administrative task by writing custom code – in other words, we have developed a “Light weight SQL server agent for SQL Azure!” Since the SQL server agent is not currently available in cloud, we have developed a solution that enables us to schedule tasks, and thus we have extended SQL Azure with the Azure worker role! Now if you wish to track jobs, you can do so by storing this data in SQL Azure (or Azure tables). The reason is that Windows Azure is a stateless platform, and we will need to store the state of the job ourselves and the choice that you have is SQL Azure or Azure tables. Note that this solution requires custom code and also it is not UI driven; however, for now, it can act as a temporary solution until SQL server agent is made available in the cloud. Moreover, this solution does not encompass functionalities that a SQL server agent provides, but it does open up an interesting avenue to schedule some of the tasks such as backup and synchronization of SQL Azure databases by writing some custom code in the Azure worker role. Now, let us see one more possibility – i.e., running BCP through a worker role in Azure-hosted services and then uploading the backup files either locally or on blobs. If you upload it locally, then consider the data transfer cost. If you upload it to blobs residing in the same datacenter, then no transfer cost applies but the cost on blob size applies. So, before choosing the option, you need to evaluate your preferences keeping the cost associated with each option in mind. In this article, I have shown that Azure worker role solution could be developed to synchronize SQL Azure databases. Moreover, a light-weight SQL server agent for SQL Azure can be developed. Also we discussed the possibility of running BCP through a worker role in Azure-hosted services for backing up our precious SQL Azure data. Thus, we can extend SQL Azure with the Azure worker role. But remember: you will be charged for running Azure worker roles. 
So at the end of the day, you need to ask – am I willing to write custom code and pay money to achieve this functionality? I hope you found this blog post interesting. If you have any questions or feedback, you can comment below or mail me at Paras[at]student-partners[dot]com Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Azure, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
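As a rough illustration of the worker-role pattern described in this post — one-time provisioning in a Setup() call and recurring synchronization in a Sync() call, checked against a simple schedule — here is a minimal C# sketch. It is an assumption-laden outline, not the author's code (the real code lives in the linked PDFs): the class name SyncWorker, the UTC-based 10:00 pm check and the empty method bodies are placeholders, and the actual Sync Framework provisioning and orchestration logic would go inside Setup() and Sync().

// Hedged sketch only: SyncWorker, the schedule check and the empty Setup()/Sync()
// bodies are placeholders; the Sync Framework calls described in the article go inside them.
using System;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class SyncWorker : RoleEntryPoint
{
    public override void Run()
    {
        Setup();                                    // one-time provisioning of both SQL Azure databases
        while (true)
        {
            DateTime now = DateTime.UtcNow;
            if (now.Hour == 22 && now.Minute == 0)  // "every day at 10:00 pm" style schedule (simplified)
            {
                Sync();                             // propagate changes between the databases
            }
            Thread.Sleep(TimeSpan.FromMinutes(1));  // poll roughly once a minute
        }
    }

    private void Setup()
    {
        // Provision the databases with Sync Framework metadata
        // (tracking tables, triggers, stored procedures) – see the referenced resource.
    }

    private void Sync()
    {
        // Orchestrate the actual change propagation between the SQL Azure databases here.
    }
}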

    Read the article

  • CodePlex Daily Summary for Wednesday, May 19, 2010

    CodePlex Daily Summary for Wednesday, May 19, 2010New Projects3FD - Framework For Fast Development: This is a C++ framework that provides a solid error handling structure, garbage collection, multi-threading and portability between compilers. The ...ali test project: test projectAttribute Builder: The Attribute Builder builds an attribute from a lambda expression because it can.BDK0008: it is a food lovers websitecgdigest: cg digest template for non-profit orgCokmez: Bilmuh cokmez duyuru sistemiDot Game: It is a dot game that our Bangladeshi people used to play at their childhood time and their last time when they are poor for working.ESRI Javascript .NET Integration: Visual Studio project that shows how to integrate the Esri Javascript API with .NET Exchange 2010 RBAC Editor (RBAC GUI): Exchange 2010 RBAC Editor (RBAC GUI) Developed in C# and using Powershell behind the scenes RBAC tool to simplfy RBAC administrationFile Validator (Validador de Archivos): Componente que permite realizar la validación de archivos (txt, imagenes, PDF, etc) actualmente solo tiene implementado la parte de los txt, permit...Grip 09 Lab4: GripjPageFlipper: This is a wonderful implementation of page flipper entirely based on HTML 5 <canvas> tag. It means that it can work in any browser that supports HT...Main project: Index bird families and associated species. Malware Analysis and Can Handler: MACH is a tool to organize and catalog your malware analysis canned responses, and to track the topic response lifecycle for forum experts.Perf Web: Performance team web sitePiPiBugNet: PiPiBugNet是一套全新的开源Bug管理系统。 PiPiBugNet代码基于ASP.NET 2.0平台开发,编程语言为C#。 PiPiBugNet界面基于Ext JS设计,提供了极佳的用户体验。RemoteDesktop: integrated remote console, desktop and chat utilityRuneScape emulation done right.: RuneScape emulator.Sandkasten: SandkastenSilverlight Metro Theme: Metro Theme for Silverlight.Silverlight Stereoscopy: Stereoscopy with Silverlight.Twitivia: Twitivia is an online trivia service that runs through twitter and is being used as an example set of projects. C#, MVC, Windows Services, Linq ...XPool: A simple school project.New ReleasesDot Game: 'Dot Game' first release: Dot Game first release This is the 'Dot Game' first release.DotNetNuke® Store: 02.01.35: What's New in this release? Bugs corrected: - Fixed a resource for the header in the Category list of the Store Admin module. 
- Added several test...ESRI Javascript .NET Integration: Map search results in a DataView: Visual Studio 2010 example showing how to pass Map results back to ASP.NET for use in a DataView.Exchange 2010 RBAC Editor (RBAC GUI): RBAC Editor: This binary is still beta (0.0.9.1) but in most case it's very stableExtending C# editor - Outlining, classification: first revision: a couple of bug has been eliminated, performance improvementFloe IRC Client: Floe IRC Client 2010-05 R6: Corrected bug where text would be unexpectedly copied to the clipboard.Floe IRC Client: Floe IRC Client 2010-05 R7: - Fixed bug where text would show up in a query window with someone if they said something on a channel that you are both present on.Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.0.9 GA released: Hi, Today we have released the final version of Visifire v3.0.9 which contains the following enhancements: * Two new properties ActualAxisMin...Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.5.2 GA Released: Hi, Today we have released the final version of Visifire v3.5.2 which contains the following enhancements: Two new properties ActualAxisMinimum a...HB Batch Encoder Mk 2: HB Batch Encoder Mk2 v1.02: Added .mov support.jPageFlipper: jPageFlipper 0.9: This is an initial community preview of jPageFlipper. It's not ready for production usage but has almost all functionality implemented.linq.js - LINQ for JavaScript: ver 2.1.0.0: Add Class Dictionary Lookup Grouping OrderedEnumerable Add Method ToDictionary MemoizeAll Share Let Add Overload ...Microsoft Research Biology Extension for Excel: MSR Biology Extension for Excel - M9: M9 Release includes the following updates to the previous release: > Import / Export support from Excel for multiple file formats > Bug fixes and ...Nifty CSharp Tools: Event Watcher: Event Watcher!Paint.NET Bulk Image Processor: Paint.NET Bulk Image Processor v1.0: This is the initial release of the Paint.NET Bulk Image processor plugin. All feedback is welcome.PiPiBugNet: PiPiBugNet架构设计: PiPiBugNet架构设计,未包含功能实现RuneScape emulation done right.: rc0: Release cantidate 0.Rx Contrib: V1.6: Adding CCR queue as adapter for the ReactiveQueue credits goes to Yuval Mazor http://blogs.microsoft.co.il/blogs/yuvmaz/Silverlight Metro Theme: Silverlight Metro Theme Alpha 1: Silverlight Metro Theme Alpha 1Silverlight Stereoscopy: Silverlight Stereoscopy Alpha 1: Silverlight Stereoscopy Alpha 20100518Stratosphere: Stratosphere 1.0.6.0: Introduced support for batch put Introduced Support for conditional updates and consistent read Added support for select conditions Brought t...VCC: Latest build, v2.1.30518.0: Automatic drop of latest buildVideo Downloader: Example Program - 1.1: Example Program showing the features of the DLL and what can be achieved using it. For DLL Version 1.1.Video Downloader: Version 1.1: Version 1.1 See Home Page for usage and more information regarding new features. 
Please remember changes at You-Tube can prevent this software from...WatchersNET.TagCloud: WatchersNET.TagCloud 01.06.00: Whats New New Tag Mode: Show Tags from Ventrian.com NewsArticles Module New Tag Mode: Show Tags from Ventrian.com SimpleGallery Module Hyperlin...Windows Double Explorer: WDE v0.4: -optimization -switch to new vst2010 -viewer close now by pressing escape -reorder tabs -send selected fullname or shortnames via email (eye button...Most Popular ProjectsRawrWBFS ManagerAJAX Control ToolkitMicrosoft SQL Server Product Samples: DatabaseSilverlight ToolkitWindows Presentation Foundation (WPF)patterns & practices – Enterprise LibraryMicrosoft SQL Server Community & SamplesPHPExcelASP.NETMost Active Projectspatterns & practices – Enterprise LibraryRawrPHPExcelGMap.NET - Great Maps for Windows Forms & PresentationCustomer Portal Accelerator for Microsoft Dynamics CRMBlogEngine.NETWindows Azure Command-line Tools for PHP DevelopersCassiniDev - Cassini 3.5/4.0 Developers EditionSQL Server PowerShell ExtensionsFluent Ribbon Control Suite

    Read the article

  • Data-driven animation states

    - by user8363
    I'm trying to handle animations in a 2D game engine hobby project, without hard-coding them. Hard coding animation states seems like a common but very strange phenomenon, to me. A little background: I'm working with an entity system where components are bags of data and subsystems act upon them. I chose to use a polling system to update animation states. With animation states I mean: "walking_left", "running_left", "walking_right", "shooting", ... My idea to handle animations was to design it as a data driven model. Data could be stored in an xml file, a rdbms, ... And could be loaded at the start of a game / level/ ... This way you can easily edit animations and transitions without having to go change the code everywhere in your game. As an example I made an xml draft of the data definitions I had in mind. One very important piece of data would simply be the description of an animation. An animation would have a unique id (a descriptive name). It would hold a reference id to an image (the sprite sheet it uses, because different animations may use different sprite sheets). The frames per second to run the animation on. The "replay" here defines if an animation should be run once or infinitely. Then I defined a list of rectangles as frames. <animation id='WIZARD_WALK_LEFT'> <image id='WIZARD_WALKING' /> <fps>50</fps> <replay>true</replay> <frames> <rectangle> <x>0</x> <y>0</y> <width>45</width> <height>45</height> </rectangle> <rectangle> <x>45</x> <y>0</y> <width>45</width> <height>45</height> </rectangle> </frames> </animation> Animation data would be loaded and held in an animation resource pool and referenced by game entities that are using it. It would be treated as a resource like an image, a sound, a texture, ... The second piece of data to define would be a state machine to handle animation states and transitions. This defines each state a game entity can be in, which states it can transition to and what triggers that state change. This state machine would differ from entity to entity. Because a bird might have states "walking" and "flying" while a human would only have the state "walking". However it could be shared by different entities because multiple humans will probably have the same states (especially when you define some common NPCs like monsters, etc). Additionally an orc might have the same states as a human. Just to demonstrate that this state definition might be shared but only by a select group of game entities. <state id='IDLE'> <event trigger='LEFT_DOWN' goto='MOVING_LEFT' /> <event trigger='RIGHT_DOWN' goto='MOVING_RIGHT' /> </state> <state id='MOVING_LEFT'> <event trigger='LEFT_UP' goto='IDLE' /> <event trigger='RIGHT_DOWN' goto='MOVING_RIGHT' /> </state> <state id='MOVING_RIGHT'> <event trigger='RIGHT_UP' goto='IDLE' /> <event trigger='LEFT_DOWN' goto='MOVING_LEFT' /> </state> These states can be handled by a polling system. Each game tick it grabs the current state of a game entity and checks all triggers. If a condition is met it changes the entity's state to the "goto" state. The last part I was struggling with was how to bind animation data and animation states to an entity. The most logical approach seemed to me to add a pointer to the state machine data an entity uses and to define for each state in that machine what animation it uses. Here is an xml example how I would define the animation behavior and graphical representation of some common entities in a game, by addressing animation state and animation data id. 
Note that both "wizard" and "orc" have the same animation states but a different animation. Also, a different animation could mean a different sprite sheet, or even a different sequence of animations (an animation could be longer or shorter). <entity name="wizard"> <state id="IDLE" animation="WIZARD_IDLE" /> <state id="MOVING_LEFT" animation="WIZARD_WALK_LEFT" /> </entity> <entity name="orc"> <state id="IDLE" animation="ORC_IDLE" /> <state id="MOVING_LEFT" animation="ORC_WALK_LEFT" /> </entity> When the entity is created, it would add a list of states with state machine data and an animation data reference. In the future I would use the entity system to build whole entities by defining components in a similar xml format. -- This is what I have come up with after some research. However, I had some trouble getting my head around it, so I was hoping for some feedback. Is there something here that doesn't make sense, or is there a better way to handle these things? I grasped the idea of iterating through frames, but I'm having trouble taking it a step further, and this is my attempt to do that.
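To make the polling approach described in this question a little more concrete, here is a rough C# sketch of how the shared state machine and the per-entity state-to-animation mapping could fit together once the XML above is loaded. Every type and member name here (StateNode, AnimationStateMachine, AnimationComponent, AnimationSystem) is hypothetical and only illustrates one possible shape for the data; it is not taken from the original post.

// Hedged sketch of the polling idea; all names are illustrative placeholders.
using System.Collections.Generic;

public sealed class StateNode
{
    public string Id;                                                                   // e.g. "IDLE"
    public Dictionary<string, string> Transitions = new Dictionary<string, string>();   // trigger -> goto state
}

public sealed class AnimationStateMachine
{
    private readonly Dictionary<string, StateNode> states = new Dictionary<string, StateNode>();

    public void AddState(StateNode node) { states[node.Id] = node; }

    // Returns the next state id if any fired trigger matches a transition, otherwise the current state.
    public string Step(string currentState, IEnumerable<string> firedTriggers)
    {
        StateNode node = states[currentState];
        foreach (string trigger in firedTriggers)
        {
            string next;
            if (node.Transitions.TryGetValue(trigger, out next))
                return next;
        }
        return currentState;
    }
}

public sealed class AnimationComponent
{
    public string CurrentState = "IDLE";
    public AnimationStateMachine Machine;                // shared by all entities of the same "archetype"
    public Dictionary<string, string> StateToAnimation;  // e.g. "MOVING_LEFT" -> "WIZARD_WALK_LEFT"

    public string CurrentAnimationId
    {
        get { return StateToAnimation[CurrentState]; }
    }
}

// Called once per game tick by the animation subsystem for each entity.
public static class AnimationSystem
{
    public static void Update(AnimationComponent anim, IEnumerable<string> firedTriggers)
    {
        anim.CurrentState = anim.Machine.Step(anim.CurrentState, firedTriggers);
        // The renderer then looks up anim.CurrentAnimationId in the animation resource pool.
    }
}

In this shape, the <state> elements would populate the StateNode transition tables, while each <entity> element would fill the StateToAnimation dictionary, so the renderer only ever needs the resulting animation id to look up frames in the animation resource pool.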

    Read the article

  • Adventures in Windows 8: Working around the navigation animation issues in LayoutAwarePage

    - by Laurent Bugnion
LayoutAwarePage is a pretty cool add-on to Windows 8 apps, which greatly facilitates the implementation of orientation-aware (portrait, landscape) as well as state-aware (snapped, filled, fullscreen) apps. It does, however, have a few issues that become obvious when you use transformed elements on your page. Adding a LayoutAwarePage to your application If you start with a blank app, the MainPage is a vanilla Page, with no such feature. In order to have a LayoutAwarePage in your app, you need to add this class (and a few helpers) with the following operation: Right click on the Solution and select Add, New Item from the context menu. From the dialog, select a Basic Page (not a Blank Page, which is another vanilla page). If you prefer, you can also use Split Page, Items Page, Item Detail Page, Grouped Items Page or Group Detail Page, which are all LayoutAwarePages. Personally I like to start with a Basic Page, which gives me more creative freedom. Adding this new page will cause Visual Studio to show a prompt asking you for permission to add additional helper files to the Common folder. One of these helpers is the LayoutAwarePage class, which is where the magic happens. LayoutAwarePage offers some help for the detection of orientation and state (which makes it a pleasure to design for all these scenarios in Blend, by the way) as well as storage for the navigation state (more about that in a future article). Issue with LayoutAwarePage When you use UI elements such as a background picture, a watermark label, logos, etc., it is quite common to do a few things with those: Making them partially transparent (this is especially true for background pictures; for instance I really like a black Page background with a half-transparent picture placed on top of it). Transforming them, for instance rotating them a bit, scaling them, etc. Here is an example with a picture of my two beautiful daughters in the Bird Park in Kuala Lumpur, as well as a transformed TextBlock. The image has an opacity of 40% and the TextBlock a simple RotateTransform. If I create an application with a MainPage that navigates to this LayoutAwarePage, however, I will get a very annoying effect: The background picture appears with an Opacity of 100%. The TextBlock is not rotated. This lasts less than a second (during the navigation animation) before the elements “snap into place” and get their desired effect.
Here is the XAML that cause the annoying effect: <common:LayoutAwarePage x:Name="pageRoot" x:Class="App13.BasicPage1" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:common="using:App13.Common" xmlns:d="http://schemas.microsoft.com/expression/blend/2008" xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006" mc:Ignorable="d"> <Grid Style="{StaticResource LayoutRootStyle}"> <Grid.RowDefinitions> <RowDefinition Height="140" /> <RowDefinition Height="*" /> </Grid.RowDefinitions> <Image Source="Assets/el20120812025.jpg" Stretch="UniformToFill" Opacity="0.4" Grid.RowSpan="2" /> <Grid> <Grid.ColumnDefinitions> <ColumnDefinition Width="Auto" /> <ColumnDefinition Width="*" /> </Grid.ColumnDefinitions> <Button x:Name="backButton" Click="GoBack" IsEnabled="{Binding Frame.CanGoBack, ElementName=pageRoot}" Style="{StaticResource BackButtonStyle}" /> <TextBlock x:Name="pageTitle" Grid.Column="1" Text="Welcome" Style="{StaticResource PageHeaderTextStyle}" /> </Grid> <TextBlock HorizontalAlignment="Center" TextWrapping="Wrap" Text="Welcome to my Windows 8 Application" Grid.Row="1" VerticalAlignment="Bottom" FontFamily="Segoe UI Light" FontSize="70" FontWeight="Light" TextAlignment="Center" Foreground="#FFFFA200" RenderTransformOrigin="0.5,0.5" UseLayoutRounding="False" d:LayoutRounding="Auto" Margin="0,0,0,153"> <TextBlock.RenderTransform> <CompositeTransform Rotation="-6.545" /> </TextBlock.RenderTransform> </TextBlock> <VisualStateManager.VisualStateGroups> [...] </VisualStateManager.VisualStateGroups> </Grid> </common:LayoutAwarePage> Solving the issue In order to solve this “snapping” issue, the solution is to wrap the elements that are transformed into an empty Grid. Honestly, to me it sounds like a bug in the LayoutAwarePage navigation animation, but thankfully the workaround is not that difficult: Simple change the main Grid as follows: <Grid Style="{StaticResource LayoutRootStyle}"> <Grid.RowDefinitions> <RowDefinition Height="140" /> <RowDefinition Height="*" /> </Grid.RowDefinitions> <Grid Grid.RowSpan="2"> <Image Source="Assets/el20120812025.jpg" Stretch="UniformToFill" Opacity="0.4" /> </Grid> <Grid> <Grid.ColumnDefinitions> <ColumnDefinition Width="Auto" /> <ColumnDefinition Width="*" /> </Grid.ColumnDefinitions> <Button x:Name="backButton" Click="GoBack" IsEnabled="{Binding Frame.CanGoBack, ElementName=pageRoot}" Style="{StaticResource BackButtonStyle}" /> <TextBlock x:Name="pageTitle" Grid.Column="1" Text="Welcome" Style="{StaticResource PageHeaderTextStyle}" /> </Grid> <Grid Grid.Row="1"> <TextBlock HorizontalAlignment="Center" TextWrapping="Wrap" Text="Welcome to my Windows 8 Application" VerticalAlignment="Bottom" FontFamily="Segoe UI Light" FontSize="70" FontWeight="Light" TextAlignment="Center" Foreground="#FFFFA200" RenderTransformOrigin="0.5,0.5" UseLayoutRounding="False" d:LayoutRounding="Auto" Margin="0,0,0,153"> <TextBlock.RenderTransform> <CompositeTransform Rotation="-6.545" /> </TextBlock.RenderTransform> </TextBlock> </Grid> <VisualStateManager.VisualStateGroups> [...] </Grid> Hopefully this will help a few people, I banged my head on the wall for a while before someone at Microsoft pointed me to the solution ;) Happy coding, Laurent   Laurent Bugnion (GalaSoft) Subscribe | Twitter | Facebook | Flickr | LinkedIn

    Read the article

  • SQLAuthority Book Review – DBA Survivor: Become a Rock Star DBA

    - by pinaldave
    DBA Survivor: Become a Rock Star DBA – Thomas LaRock Link to Amazon Link to Flipkart First of all, I thank all my readers when I wrote that I could not get this book in any local book stores, because they offered me to send a copy of this good book. A very special mention goes to Sripada and Jayesh for they gave so much effort in finding my home address and sending me the hard copy. Before, I did not have the copy of the book, but now I have two of it already! It surprises me how my readers were able to find my home address, which I have not publicly shared. Quick Review: This is indeed a one easy-to-read and fun book. We all work day and night with technology yet we should not forget to show our love and care for our family at home. For our souls that starve for peace and guidance, this one book is the “it” book for all the technology enthusiasts. Though this book was specifically written for DBAs, the reach is not limited to DBAs only because the lessons incorporated in it actually applies to all. This is one of the most motivating technical books I have read. Detailed Review: Let us go over a few questions first: Who wants to be as famous as rockstars in the field of Database Administration? How can one learn what it takes to become a top notch software developer? If you are a beginner in your field, how will you go to next level? Your boss may be very kind or like Dilbert’s Boss, what will you do? How do you keep growing when Eco-system around you does not support you? You are almost at top but there is someone else at the TOP, what do you do and how do you avoid office politics? As a database developer what should be your basic responsibility? and many more… I was able to completely read book in one sitting and I loved it. Before I continue with my opinion, I want to echo the opinion of Kevin Kline who has written the Forward of the book. He has truly suggested that “You hold in your hands a collection of insights and wisdom on the topic of database administration gained through many years of hard-won experience, long nights of study, and direct mentorship under some of the industry’s most talented database professionals and information technology (IT) experts.” Today, IT field is getting bigger and better, while talking about terabytes of the database becomes “more” normal every single day. The gods and demigods of database professionals are taking care of these large scale databases and are carefully maintaining them. In this world, there are only a few beginnings on the first step. There are many experts in different technology fields who are asked to address the issues with databases. There is YOU and ME, who is just new to this work. So we ask ourselves WHERE to begin and HOW to begin. We adore and follow the religion of our rockstars, but oftentimes we really have no idea about their background and their struggles. Every rockstar has his success story which needs to be digested before learning his tricks and tips. This book starts with the same note and teaches the two most important lessons for anybody who wants to be a DBA Rockstar –  to focus on their single goal of learning and to excel the technology. The story starts with three simple guidelines – Get Prepared, Get Trained, Get Certified. Once a person learns the skills, and then, it would be about time that he needs to enrich or to improve those skills you have learned. I am sure that the right opportunity will come finding themselves and they will not have to go run behind it. 
However, the real challenge for any person is the first day or first week. A new employee, no matter how much experienced he is, sometimes has no clue about what should one do at new job. Chapter 2 and chapter 3 precisely talk about what one should do as soon as the new job begins. It is also written with keeping the fact in focus that each job can be very much different but there are few infrastructure setups and programming concepts are the same. Learning basics of database was really interesting. I like to focus on the roots of any technology. It is important to understand the structure of the database before suggesting what indexes needs to be created, the same way this book covers the most essential knowledge one must learn by most database developers. I think the title of the fourth chapter is my favorite sentence in this book. I can see that I will be saying this again and again in the future – “A Development Server Is a Production Server to a Developer“. I have worked in the software industry for almost 8 years now and I have seen so many developers sitting on their chairs and waiting for instructions from their lead about how to improve the code or what to do the next. When I talk to them, I suggest that the experiment with their server and try various techniques. I think they all should understand that for them, a development server is their production server and needs to pay proper attention to the code from the beginning. There should be NO any inappropriate code from the beginning. One has to fully focus and give their best, if they are not sure they should ask but should do something and stay active. Chapter 5 and 6 talks about two essential skills for any developer and database administration – what are the ethics of developers when they are working with production server and how to support software which is running on the production server. I have met many people who know the theory by heart but when put in front of keyboard they do not know where to start. The first thing they do opening the browser and searching online, instead of opening SQL Server Management Studio. This can very well happen to anybody who is experienced as well. Chapter 5 and 6 addresses that situation as well includes the handy scripts which can solve almost all the basic trouble shooting issues. “Where’s the Buffet?” By far, this is the best chapter in this book. If you have ever met me, you would know that I love food. I think after reading this chapter, I felt Thomas has written this just keeping me in mind. I think there will be many other people who feel the same way, too. Even my wife who read this chapter thought this was specifically written for me. I will not talk any more about this chapter as this is one must read chapter. And of course this is about real ‘FOOD‘. I am an SQL Server Trainer and Consultant and I totally agree with the point made in the chapter 8 of this book. Yes, it says here that what is necessary to train employees and people. Millions of dollars worth the labor is continuously done in the world which has faults and incorrect. Once something goes wrong, very expensive consultant comes in and fixes the problem. This whole cycle which can be stopped and improved if proper training is done. There is plenty of free trainings available as well, if one cannot afford paid training. “Connect. Learn. Share” – I think this is a great summary and bird’s eye view of this book. Networking is the key. 
Everything discussed in this book can be taken to the next level if one properly uses these tips and continuously grows with them. Connecting with others, helping each other learn and building a good knowledge-sharing environment should be everyone's goal. Before I end the review I want to share a real experience. I have personally met one DBA who had worked in a single department of a company for so long that, when that department was closed and he was moved to a different department, he could not adjust and quit the job, despite being surrounded by the same people and the same company. Adjusting to a new environment gets much tougher as a person becomes more and more experienced. This book addresses precisely this issue, along with its solutions. I just cannot stop comparing the book with my personal journey. I found that, coincidentally, so many things in the book are written just the way we developers and DBAs think. I must express special thanks to Thomas for taking time out of his personal life to write this book for us. This is indeed a book for everybody who wants to grow in a healthy way in a tough and competitive environment. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Book Review, SQLAuthority News, SQLServer, T SQL, Technology

    Read the article

  • top Tweets SOA Partner Community – August 2012

    - by JuergenKress
    Send your tweets @soacommunity #soacommunity and follow us at http://twitter.com/soacommunity Lucas Jellema ?Published an article about organizing Fusion Middleware Administration: http://technology.amis.nl/2012/07/31/organizing-fusion-middleware-administration-in-a-smart-and-frugal-way … - many organizations are struggling with this. ServiceTechSymposium Countdown to the Early Bird Registration Discount deadline. Only 4 days left! http://ow.ly/cBCiv demed ?Good chatting w Bob Rhubart, Thomas Erl & Tim Hall on SOA & Cloud Symposium https://blogs.oracle.com/archbeat/entry/podcast_show_notes_thomas_erl … @soaschool @OTNArchBeat -- CU in London! SOA Community top Tweets SOA Partner Community July 2012 - are you one of them? If yes please rt! https://soacommunity.wordpress.com/2012/07/30/top-tweets-soa-partner-community-july-2012/ … #soacommunity SOA Community ?Are You a facebook member - do You follow http://www.facebook.com/soacommunity ? #soacommunity #soa SOA Community ?SOA 24/7 - Home Page: http://soa247.com/#.UBJsN8n3kyk.twitter … #soacommunity OracleBlogs ?Handling Large Payloads in SOA Suite 11g http://ow.ly/1lFAih OracleBlogs ?SOA Community Newsletter July 2012 http://ow.ly/1lFx6s OTNArchBeat Podcast Show Notes: Thomas Erl on SOA, Cloud, and Service Technology http://bit.ly/OOHTUJ SOA Community SOA Community Newsletter July 2012 http://wp.me/p10C8u-s7 OTNArchBeat ?OTN ArchBeat Podcast: Thomas Erl on SOA, Cloud, and Service Technology - Part 1 http://pub.vitrue.com/fMti OProcessAccel ?Just released! White Paper: Oracle Process Accelerators Best Practices http://www.oracle.com/technetwork/middleware/bpm/learnmore/processaccelbestpracticeswhitepaper-1708910.pdf … OTNArchBeat ?SOA, Cloud, and Service Technologies - Part 1 of 4 - A conversation with SOA, Cloud, and Service Technology Symposiu... http://ow.ly/1lDyAK OracleBlogs ?SOA Suite 11g PS5 Bundled Patch 3 (11.1.1.6.3) http://ow.ly/1lCW1S Simon Haslam My write-up of the virtues of the #ukoug App Server & Middleware SIG http://bit.ly/LMWdfY What's important to you for our next meeting? SOA Community SOA Partner Community Survey 2012 http://wp.me/p10C8u-qY Simone Geib ?RT @jswaroop: #Oracle positioned in the Leader's quadrant - Gartner Magic Quadrants for Application Infrastructure (SOA & SOA Gov)... ServiceTechSymposium New Supporting Organization, IBTI has joined the Symposium! http://www.servicetechsymposium.com/ orclateamsoa ?A-Team Blog #ateam: BPM 11g Task Form Version Considerations http://ow.ly/1lA7XS OTNArchBeat Oracle content at SOA, Cloud and Service Technology Symposium (and discount code!) http://pub.vitrue.com/FPcW OracleBlogs ?BPM 11g Task Form Version Considerations http://ow.ly/1lzOrX OTNArchBeat BPM 11g #ADF Task Form Versioning | Christopher Karl Chan #fusionmiddleware http://pub.vitrue.com/0qP2 OTNArchBeat Lightweight ADF Task Flow for BPM Human Tasks Overview | @AndrejusB #fusionmiddleware http://pub.vitrue.com/z7x9 SOA Community Oracle Fusion Middleware Summer Camps in Lisbon report by Link Consulting http://middlewarebylink.wordpress.com/2012/07/20/oracle-fusion-middleware-summer-camps-in-lisbon/ … #ofmsummercamps #soa #bpm SOA Community ?Clemens Utschig-Utschig & Manas Deb The Successful Execution of the SOA and BPM Vision Using a Business Capability Framework: Concepts… Simone Geib ?RT @oprocessaccel: Just released! 
White Paper: Oracle Process Accelerators Best Practices http://www.oracle.com/technetwork/middleware/bpm/learnmore/processaccelbestpracticeswhitepaper-1708910.pdf … jornica ?Report from Oracle Fusion Middleware Summer Camps in Munich: SOA Suite 11g advanced training experiences @soacommunity http://bit.ly/Mw3btE Simone Geib ?Bruce Tierney: Update - SOA & BPM Customer Insights Webcast Series: | https://blogs.oracle.com/SOA/entry/update_soa_bpm_customer_insights … OTNArchBeat Business SOA: Thinking is Dead | @mosesjones http://pub.vitrue.com/k8mw esentri ?had 3 great days in Munich at #Oracle #soacommunity Summercamp! Special thanks to Geoffroy de Lamalle from eProseed! Danilo Schmiedel ?Used my time in train to setup the ps5 soa/bpm vbox-image.Works like a dream. Setup-Readme is perfect! Saves a lot of time!!! @soacommunity 18 Jul SOA Community ?THANKS for the excellent OFM summer camps - save trip home - share your pictures at http://www.facebook.com/soacommunity #ofmsummercamps #soacommunity doors BBQ-party with Oracle @soacommunity. 5Star! #lovemunich #ofmsummercamps pic.twitter.com/ztfcGn2S leonsmiers ?New #Capgemini blog post "Continuous Improvement of Business Agility" http://bit.ly/Lr0EwG #bpm #yam Eric Elzinga ?MDS Explorer utility, http://see.sc/4qdb43 #soasuite ServiceTechSymposium ?@techsymp New speaker Demed L’Her from Oracle has been added to the symposium calendar. http://ow.ly/cjnyw SOA Community ?Last day of the Fusion Middleware summer camps - we continue at 9.00 am. send us your barbecue pictures! #ofmsummercamps #soacommunity SOA Community ?Delivering SOA Governance with EAMS and Oracle Enterprise Repository by Link Consulting http://middlewarebylink.wordpress.com/2012/06/26/delivering-soa-governance-with-eams-and-oracle-enterprise-repository/ … #soacommunity #soa #oer OracleBlogs ?Process Accelerator Kit http://ow.ly/1loaCw 15 Jul SOA Community ?Sun is back in Munich! Send your pictures Middleware summer camps! #ofmsummercamps We start tomorrow 11.00 at Oracle pic.twitter.com/6FStxomk Walter Montantes ?Gracias, Obrigado, Thank you, Danke a Lisboa y a @soacommunity @wlscommunity. From the Mexican guys!! cc @mikeintoch #ofmsummercamps Andrejus Baranovskis Tips & Tricks How to Run Oracle BPM 11g PS5 Workspace from Custom ADF 11g Application http://fb.me/1zOf3h2K8 JDeveloper & ADF ?Fusion Apps Enterprise Repository - Explained http://dlvr.it/1rpjWd Steve Walker ?Oracle #Exalogic is the logical choice for running business applications. Exalogic Software 2.0 launches 7/25. Reg at http://bit.ly/NedQ9L A. Chatziantoniou ?Landed in rainy Amsterdam after a great week in Lisbon for the #ofmsummercamps - multo obrigado for Jürgen for another fantastic event SOA Community ?Teams present #BPM11g POC results at #ofmsummercamps - great job! #soacommunity pic.twitter.com/0d4txkWF Sabine Leitner ?#DOAG SIG Middleware 29.08.2012 Köln über MW, Administration, Monitoring http://bit.ly/P47w82 @soacommunity @OracleMW @OracleFMW 12 Jul philmulhall ?Thanks @soacommunity for a great week at the #ofmsummercamps. Hard work done so time for a few cold ones in Lisboa. pic.twitter.com/LVUUuwTh peter230769 ?RT: andrea_rocco_31: RT @soacommunity: Enjoy the networking event at #ofmsummercamps want to attend next time ... pic.twitter.com/D1HRndi4 Niels Gorter ?#ofmsummercamps dinner in Lisbon. Great weather, scenery, training, people, on and on. 
Big THANKS @soacommunity JDeveloper & ADF ?Running Oracle BPM 11g PS5 Worklist Task Flow and Human Task Form on Non-SOA Domain http://dlvr.it/1r0c2j Andrea Rocco ?RT @soacommunity: Jamy pastry at cafe Belem - who is the ghost there?!? http://via.me/-2x33uk6 Simon Haslam ?Sounds great - sorry I couldn't make it. RT @soacommunity: 6pm BPM advanced training hard work to build the POC #ofmsummercamps philmulhall ?A well earned rest after a hard days work @soacommunity #summercamps pic.twitter.com/LKK7VOVS philmulhall ?Some more hard working delegates @soacommunity #summercamps pic.twitter.com/gWpk1HZh SOA Community ?Error message at the BPM POC - will The #ace director understand the message and solve it? #ofmsummercamps pic.twitter.com/LFTEzNck Daniel Kleine-Albers ?posted on the #thecattlecrew blog: Assigning more memory to JDeveloper http://thecattlecrew.wordpress.com/2012/07/10/jdeveloper-quicktip-assigning-more-memory/ … OTNArchBeat ?BAM design pointers | Kavitha Srinivasan http://pub.vitrue.com/TOhP SOA Community ?Did you receive the July SOA community newsletter? read it! Want to become a member http://www.oracle.com/goto/emea/soa #soacommunity #soa #opn OracleBlogs ?Markus Zirn, Big Data with CEP and SOA @ SOA, Cloud &amp; Service Technology Symposium 2012 http://ow.ly/1lcSkb Andrejus Baranovskis Running Oracle BPM 11g PS5 Worklist Task Flow and Human Task Form on Non-SOA Eric Elzinga ?Service Facade design pattern in OSB, http://bit.ly/NnOExN Eric Elzinga ?New BPEL Thread Pool in SOA 11g for Non-Blocking Invoke Activities from 11.1.1.6 (PS5), http://bit.ly/NnOc2G Gilberto Holms New Post: Siebel Connection Pool in Oracle Service Bus 11g http://wp.me/pRE8V-2z Oracle UPK & Tutor ?UPK Pre-Built Content Update: UPK pre-built content development efforts are always underway and growing. Ove... http://bit.ly/R2HeTj JDeveloper & ADF ?Troubleshooting BPMN process editor problems in 11.1.1.6 http://dlvr.it/1p0FfS orclateamsoa ?A-Team Blog #ateam: BAM design pointers - In working recently with a large Oracle customer on SOA and BAM, I discove... http://ow.ly/1kYqES SOA Community BPMN process editor problems in 11.1.1.6 by Mark Nelson http://redstack.wordpress.com/2012/06/27/bpmn-process-editor-problems-in-11-1-1-6 … #soacommunity #bpm OTNArchBeat ?SOA Learning Library: free short, topic-focused training on Oracle SOA & BPM products | @SOACommunity http://pub.vitrue.com/NE1G Andrejus Baranovskis ?ADF 11g PS5 Application with Customized BPM Worklist Task Flow (MDS Seeded Customization) http://fb.me/1coX4r1X1 OTNArchBeat ?A Universal JMX Client for Weblogic –Part 1: Monitoring BPEL Thread Pools in SOA 11g | Stefan Koser http://pub.vitrue.com/mQVZ OTNArchBeat ?BPM – Disable DBMS job to refresh B2B Materialized View | Mark Nelson http://pub.vitrue.com/3PR0 SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community for registration please visit  www.oracle.com/goto/emea/soa (OPN account required) If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Technorati Tags: SOA Community twitter,SOA Community,Oracle SOA,Oracle BPM,BPM Community,OPN,Jürgen Kress

    Read the article

  • Uploading and Importing CSV file to SQL Server in ASP.NET WebForms

    - by Vincent Maverick Durano
A few weeks ago I was working on a small internal project that involved importing a CSV file into a SQL Server database, and I thought I'd share the simple implementation that I did on the project. In this post I will demonstrate how to upload and import a CSV file into a SQL Server database. As some may already know, importing a CSV file into SQL Server is easy and simple, but difficulties arise when the CSV file contains many columns with different data types. Basically, the provider cannot differentiate data types between the columns or the rows; it blindly infers a data type from the first few rows and leaves out all the data which does not match that data type. To overcome this problem, I used a schema.ini file to define the data types of the CSV file and allow the provider to read it and recognize the exact data type of each column. Now what is schema.ini? Taken from the documentation: The schema.ini is an information file, used to define the data structure and format of each column that contains data in the CSV file. If a schema.ini file exists in the directory, the Microsoft.Jet.OLEDB provider automatically reads it and recognizes the data type information of each column in the CSV file. Thus, the provider intelligently avoids the misinterpretation of data types before inserting the data into the database. For more information see: http://msdn.microsoft.com/en-us/library/ms709353%28VS.85%29.aspx Points to remember before creating schema.ini:   1. The schema information file must always be named 'schema.ini'.   2. The schema.ini file must be kept in the same directory where the CSV file exists.   3. The schema.ini file must be created before reading the CSV file.   4. The first line of the schema.ini must be the name of the CSV file, followed by the properties of the CSV file, and then the properties of each column in the CSV file. Here's an example of what the schema looks like: [Employee.csv] ColNameHeader=False Format=CSVDelimited DateTimeFormat=dd-MMM-yyyy Col1=EmployeeID Long Col2=EmployeeFirstName Text Width 100 Col3=EmployeeLastName Text Width 50 Col4=EmployeeEmailAddress Text Width 50 To get started, let's go ahead and create a simple blank database. Just for the purpose of this demo I created a database called TestDB. After creating the database, let's go ahead and fire up Visual Studio and then create a new WebApplication project. Under the root application create a folder called UploadedCSVFiles and then place the schema.ini in that folder. The uploaded CSV files will be stored in this folder after the user imports the file. Now add a WebForm to the project, set up the HTML markup and add one (1) FileUpload control, one (1) Button and three (3) Label controls. After that we can proceed with the code for uploading and importing the CSV file to the SQL Server database.
Here are the full code blocks below: 1: using System; 2: using System.Data; 3: using System.Data.SqlClient; 4: using System.Data.OleDb; 5: using System.IO; 6: using System.Text; 7:   8: namespace WebApplication1 9: { 10: public partial class CSVToSQLImporting : System.Web.UI.Page 11: { 12: private string GetConnectionString() 13: { 14: return System.Configuration.ConfigurationManager.ConnectionStrings["DBConnectionString"].ConnectionString; 15: } 16: private void CreateDatabaseTable(DataTable dt, string tableName) 17: { 18:   19: string sqlQuery = string.Empty; 20: string sqlDBType = string.Empty; 21: string dataType = string.Empty; 22: int maxLength = 0; 23: StringBuilder sb = new StringBuilder(); 24:   25: sb.AppendFormat(string.Format("CREATE TABLE {0} (", tableName)); 26:   27: for (int i = 0; i < dt.Columns.Count; i++) 28: { 29: dataType = dt.Columns[i].DataType.ToString(); 30: if (dataType == "System.Int32") 31: { 32: sqlDBType = "INT"; 33: } 34: else if (dataType == "System.String") 35: { 36: sqlDBType = "NVARCHAR"; 37: maxLength = dt.Columns[i].MaxLength; 38: } 39:   40: if (maxLength > 0) 41: { 42: sb.AppendFormat(string.Format(" {0} {1} ({2}), ", dt.Columns[i].ColumnName, sqlDBType, maxLength)); 43: } 44: else 45: { 46: sb.AppendFormat(string.Format(" {0} {1}, ", dt.Columns[i].ColumnName, sqlDBType)); 47: } 48: } 49:   50: sqlQuery = sb.ToString(); 51: sqlQuery = sqlQuery.Trim().TrimEnd(','); 52: sqlQuery = sqlQuery + " )"; 53:   54: using (SqlConnection sqlConn = new SqlConnection(GetConnectionString())) 55: { 56: sqlConn.Open(); 57: SqlCommand sqlCmd = new SqlCommand(sqlQuery, sqlConn); 58: sqlCmd.ExecuteNonQuery(); 59: sqlConn.Close(); 60: } 61:   62: } 63: private void LoadDataToDatabase(string tableName, string fileFullPath, string delimeter) 64: { 65: string sqlQuery = string.Empty; 66: StringBuilder sb = new StringBuilder(); 67:   68: sb.AppendFormat(string.Format("BULK INSERT {0} ", tableName)); 69: sb.AppendFormat(string.Format(" FROM '{0}'", fileFullPath)); 70: sb.AppendFormat(string.Format(" WITH ( FIELDTERMINATOR = '{0}' , ROWTERMINATOR = '\n' )", delimeter)); 71:   72: sqlQuery = sb.ToString(); 73:   74: using (SqlConnection sqlConn = new SqlConnection(GetConnectionString())) 75: { 76: sqlConn.Open(); 77: SqlCommand sqlCmd = new SqlCommand(sqlQuery, sqlConn); 78: sqlCmd.ExecuteNonQuery(); 79: sqlConn.Close(); 80: } 81: } 82: protected void Page_Load(object sender, EventArgs e) 83: { 84:   85: } 86: protected void BTNImport_Click(object sender, EventArgs e) 87: { 88: if (FileUpload1.HasFile) 89: { 90: FileInfo fileInfo = new FileInfo(FileUpload1.PostedFile.FileName); 91: if (fileInfo.Name.Contains(".csv")) 92: { 93:   94: string fileName = fileInfo.Name.Replace(".csv", "").ToString(); 95: string csvFilePath = Server.MapPath("UploadedCSVFiles") + "\\" + fileInfo.Name; 96:   97: //Save the CSV file in the Server inside 'MyCSVFolder' 98: FileUpload1.SaveAs(csvFilePath); 99:   100: //Fetch the location of CSV file 101: string filePath = Server.MapPath("UploadedCSVFiles") + "\\"; 102: string strSql = "SELECT * FROM [" + fileInfo.Name + "]"; 103: string strCSVConnString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + filePath + ";" + "Extended Properties='text;HDR=YES;'"; 104:   105: // load the data from CSV to DataTable 106:   107: OleDbDataAdapter adapter = new OleDbDataAdapter(strSql, strCSVConnString); 108: DataTable dtCSV = new DataTable(); 109: DataTable dtSchema = new DataTable(); 110:   111: adapter.FillSchema(dtCSV, SchemaType.Mapped); 112: 
                    adapter.Fill(dtCSV);

                    if (dtCSV.Rows.Count > 0)
                    {
                        CreateDatabaseTable(dtCSV, fileName);
                        Label2.Text = string.Format("The table ({0}) has been successfully created to the database.", fileName);

                        string fileFullPath = filePath + fileInfo.Name;
                        LoadDataToDatabase(fileName, fileFullPath, ",");

                        Label1.Text = string.Format("({0}) records has been loaded to the table {1}.", dtCSV.Rows.Count, fileName);
                    }
                    else
                    {
                        LBLError.Text = "File is empty.";
                    }
                }
                else
                {
                    LBLError.Text = "Unable to recognize file.";
                }
            }
        }
    }
}

The code above consists of three (3) private methods: GetConnectionString(), CreateDatabaseTable() and LoadDataToDatabase(). GetConnectionString() is a method that returns a string; it simply reads the connection string that is configured in the web.config file. CreateDatabaseTable() is a method that accepts two (2) parameters, the DataTable and the file name. As the method name suggests, it automatically creates a table in the database based on the source DataTable and the file name of the CSV file. LoadDataToDatabase() is a method that accepts three (3) parameters: the tableName, the fileFullPath and the delimiter value. This method is where the actual importing of data from the CSV file into SQL Server happens. The code in the BTNImport_Click event handles uploading the CSV file to the specified location, and it is also where CreateDatabaseTable() and LoadDataToDatabase() are called. You will notice I also added some basic error trapping and validation within that event.

Now, to test the import utility, let's create some simple data in CSV format. For the simplicity of this demo, create a CSV file, name it "Employee" and add some data to it. Here's an example:

1,VMS,Durano,[email protected]
2,Jennifer,Cortes,[email protected]
3,Xhaiden,Durano,[email protected]
4,Angel,Santos,[email protected]
5,Kier,Binks,[email protected]
6,Erika,Bird,[email protected]
7,Vianne,Durano,[email protected]
8,Lilibeth,Tree,[email protected]
9,Bon,Bolger,[email protected]
10,Brian,Jones,[email protected]

Now save the newly created CSV file somewhere on your hard drive. Then run the application and browse to the CSV file we have just created. Take a look at the sample screenshots below:

[Screenshot: after browsing to the CSV file]
[Screenshot: after clicking the Import button]

Now if we look at the database that we created earlier, you'll notice that the Employee table has been created with the imported data in it:

[Screenshot: the Employee table and its data in the database]

That's it! I hope someone finds this post useful!

Technorati Tags: ASP.NET,CSV,SQL,C#,ADO.NET
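As a quick footnote on what the code above actually sends to SQL Server: for the Employee.csv sample, the statements built by CreateDatabaseTable() and LoadDataToDatabase() would look roughly like the sketch below. This is not output captured from the original post; it assumes the Jet provider reports the column names and widths defined in schema.ini (hence the 100/50/50 NVARCHAR widths), and the file path shown is only a placeholder for wherever Server.MapPath("UploadedCSVFiles") resolves on the server.

-- Sketch of the SQL built by CreateDatabaseTable() for Employee.csv,
-- assuming the column names and widths come from schema.ini as shown earlier.
CREATE TABLE Employee (
    EmployeeID INT,
    EmployeeFirstName NVARCHAR (100),
    EmployeeLastName NVARCHAR (50),
    EmployeeEmailAddress NVARCHAR (50)
)

-- Sketch of the SQL built by LoadDataToDatabase(); the path is whatever
-- Server.MapPath("UploadedCSVFiles") returns plus the uploaded file name.
BULK INSERT Employee
FROM '<physical path to UploadedCSVFiles>\Employee.csv'
WITH ( FIELDTERMINATOR = ',' , ROWTERMINATOR = '\n' )

Keep in mind that BULK INSERT reads the file from the SQL Server machine's point of view, so the account running SQL Server must be able to reach that folder, and the DBConnectionString entry used by GetConnectionString() must exist in web.config and point at the TestDB database created at the start.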

    Read the article

  • CodePlex Daily Summary for Thursday, June 03, 2010

    CodePlex Daily Summary for Thursday, June 03, 2010

New Projects

Albatross: Albatross framework. We are still working on the documentation, more details will be available soon.
ApiChange: ApiChange is the Swiss army knife for inspecting your assemblies from the command line. Now you can do basic operations like diff, who uses (method...
BaseCalendar: BaseCalendar is a server-side ASP.NET web control (WebForms or MVC) that renders a calendar while giving you full control over the generated HTML. ...
CESAVE: Projects for the Comité Estatal de Sanidad Vegetal.
Closure Compiler w/ Annotations Visual Studio 2010 Snippets: This is an attempt to create reusable Visual Studio snippets to make working with closure compiler annotated JavaScript more productive. VS2010 ...
Common Service Host: Common Service Host is a generic Windows Communication Service Host and factory that uses the Common Service Locator to create Service objects. ...
DarkLight: DarkLight is a 2D Lighting Engine written in XNA, and allows developers to create 2D shadowing effects in their 2D games easily. It supports poi...
Earn Burn Tracker: A tool to track earned value against a given release, initiative, feature set, and objects.
eOfficeAACS: eOffice is an open source access control and attendance management system developed by e-bird Innovation (www.ebirdinfo.com). Its flexible design al...
FLV Video conversion library for .Net 3.5: This is a component created to call the ffmpeg tool to convert various video formats to the Adobe Flash FLV output format. The component also takes...
Google Moderator: .NET client library for the Google Moderator API.
linq to jquery: provides support for linq to jquery objects
Mobile Vikings Data: App to view your data usage
RefBrowser: RefBrowser
RESX Translator with Bing (from Microsoft Consulting Services, UK): A Windows Form application that automatically translates RESX files using Bing web services
Rhyduino - Remote Arduino Control via Managed Code: Rhyduino makes it easy for Visual Studio / Windows devs to control the Arduino using a computer. It's like supercharging your Arduino with all the ...
SharePoint 2010 CSV Bulk Term Set Importer: Allows for multiple import of *.csv files to a given term group in SharePoint 2010 Term Store. It will create new term group based on the name pr...
SharePoint Feature - Export history version to Excel: Add a function to list the action button, the ability to export history version of the item sheet to Excel from the specified date. Features suppo...
SwEntry: A system that allows people to open doors by using a Bluetooth enabled phone.
Things to Do with the DLR: This project is about ideas and sample code around the Dynamic Language Runtime.
Work Recorder - Hold on own time!: Work Recorder is an office aid software which can record the time used on PC for researchers, office workers and students. And it is also a good he...
xuezhixu: xuezhixu found
Yaget: Yet Another Game Engine Technology

New Releases

BackUpAnyWhere: backupanywhere RC1: this is the RC of our program
BaseCalendar: BaseControls 1.0: BaseControls 1.0 contains the BaseCalendar ASP.NET control.
BizTalk Server Pipeline Component Wizard: 2.20: Version suitable for 2010 release.
CheckHeader: CheckHeader v0.8.6: The Microsoft .NET Framework 4.0 is needed to run this program.
Chirpy - VS Add In For Handling Js, Css, and DotLess Files: Chirpy Installer for VS 2010 (Ver-1.0.0.2): VS 2010 Installer for the Chirpy AddIn. Version 1.0.0.2
Christoc's DotNetNuke C# Module Development Template: 00.00.01: This is the initial release of Christoc's DotNetNuke C# Module Development Template. You can use the Template as-is, or you can customize the VSTem...
Closure Compiler w/ Annotations Visual Studio 2010 Snippets: v1 release: The initial release of the project
Community Forums NNTP bridge: Community Forums NNTP Bridge V22: Release of the Community Forums NNTP Bridge to access the social and answers MS forums with a single, open source NNTP bridge. This release has ad...
Community Forums NNTP bridge: Community Forums NNTP Bridge V23: Release of the Community Forums NNTP Bridge to access the social and answers MS forums with a single, open source NNTP bridge. This release has ad...
Community Forums NNTP bridge: Community Forums NNTP Bridge V24: Release of the Community Forums NNTP Bridge to access the social and answers MS forums with a single, open source NNTP bridge. This release has ad...
DarkLight: DarkLight Engine v1.0: This is the first version of the DarkLight engine and currently supports point, spot and area lights with no upper limit on the number of lights. ...
DotNetNuke® Skin Collaborate: Collaborate Package 1.1.0: Newer version of Collaborate included fixes: - removed conditional code to display control panel - changed background color to match with backgroun...
dotSpatial: System.Spatial.Projection Zip June 2, 2010: This version tries to fix a problem with reprojecting to UTM zones. It is still being tested though.
Entity Framework Repository & Unit of Work Template: 1.0.1: This version has more than just the T4 template. I have added a new template that has a RepositoryHelper class for use with StructureMap. Also th...
FLV Video conversion library for .Net 3.5: Beta 1: This is the first release of this project. Improvements may be added if necessary.
HERB.IQ: Alpha 0.1 Preview: Only clone tab works, just setting up the GUI and getting the XML data handling working correctly
Jetfire - Workflow DSL: V1.2.0: The complete source code required for a Jetfire system (server and client nexus) is included in the release. Highlights of Changes: Full programmat...
linq to jquery: linq to jquery alpha: beta development project
MapWindow6: MapWindow 6.0 June 2, 2010: This version fixes a problem with projecting to UTM zones. I'm not sure that this works perfectly yet. It seemed to require a zone adjustment by ...
patterns & practices Web Client Developer Guidance: Developing Web Apps May 2010 Beta: This release: This drop includes updated documentation, links, and graphics. We are still looking for feedback on this release. Plans going forward...
patterns & practices: Composite WPF and Silverlight: Prism 4.0 Drop 1: Welcome to the first drop of Prism 4.0 (formerly known as the Composite Application Guidance for WPF and Silverlight). This drop i...
Powershell4SQL: Version 1.3: Changes from version 1.2: Added support for -Confirm and -WhatIf parameters. Added support for -Verbose mode. Includes SQL Batch text, parameters ...
RESX Translator with Bing (from Microsoft Consulting Services, UK): v1.0: This is the initial release of the tool
Rhyduino - Remote Arduino Control via Managed Code: Beta Release (v0.80): Library: Auto-detects connected Arduino devices. Uses system resources intelligently to take advantage of multiple CPU cores when present. Firmata ...
SharePoint Feature - Export history version to Excel: Export Item List Version: - multilanguage support Czech, English. Install: "C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN\stsadm.exe" -o addsol...
Simulo: Simulo v2.5: That's the third release of Simulo (v2.5). For detailed info on what's new, read the changes log block at the project's home page. System requirem...
Site Directory for SharePoint 2010 (from Microsoft Consulting Services, UK): v1.5: Please carefully follow the Installation Guide as there are additional actions that need to be undertaken in this release. As 1.4 with the followin...
Spackle.NET: 4.1.0.0 Release: Added IEquatable<T> to Range<T>
StreamInsight Samples: Microsoft StreamInsight Product Team Samples: This is the current snapshot of the samples created by the StreamInsight Product Team.
Touch Mice: 0.1: Initial release of Touch Mice
VCC: Latest build, v2.1.30602.0: Automatic drop of latest build
VivoSocial: VivoSocial 7.2.0: Version 7.2.0 of VivoSocial has been released. If you experienced any issues with the previous version, please update your modules to the 7.2.0 rel...
Work Recorder - Hold on own time!: WorkRecorder 1.0: +Finished Version 1.0

Most Popular Projects

Community Forums NNTP bridge
OutSync
ASP.NET MVC Time Planner
NeatUpload
MoonyDesk (windows desktop widgets)
Mute4
eXpress Persistent Objects (XPO) Toolkit
AgUnit - Silverlight unit testing with ReSharper
ASP.NET MVC Extensions
Aviva Solutions C# Coding Guidelines

Most Active Projects

Community Forums NNTP bridge
GMap.NET - Great Maps for Windows Forms & Presentation
Rawr
Ionics Isapi Rewrite Filter
N2 CMS
patterns & practices – Enterprise Library
BlogEngine.NET
GameSet
Farseer Physics Engine
Mirror Testing System

    Read the article

< Previous Page | 8 9 10 11 12 13  | Next Page >