Search Results

Search found 1119 results on 45 pages for 'jon buys'.

Page 41 of 45

  • [Java] Queue in while loop, cannot modify the value?

    - by javaLearner.java
    This is my code: Iterator it = queue.iterator(); while(it.hasNext()){ random = randNumber(1,2); if(random == 1){ queue.poll(); } else { queue.add("new"); queue.poll(); } } It gives me: Exception in thread "test" java.util.ConcurrentModificationException at java.util.LinkedList$ListItr.checkForComodification(LinkedList.java:761) at java.util.LinkedList$ListItr.next(LinkedList.java:696) Edit @Jon Skeet: What I want to do is: I have a queue, let's say of size 10: a, b, c, d ... j. Generate a number between 1 and 2: if it is 1, poll (remove the top element); if it is 2, add a new element and then poll. I want to stop the loop once I have added 3 new elements.
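
    The exception occurs because the LinkedList is structurally modified (poll/add) while an Iterator over it is still live; the next hasNext()/next() call then fails its modification-count check. Since the loop never uses the element the iterator would return, one way out is to drop the iterator entirely and drive the loop off a counter. A minimal sketch (not from the original thread; ThreadLocalRandom stands in for the asker's randNumber(1,2) helper):

        import java.util.LinkedList;
        import java.util.Queue;
        import java.util.concurrent.ThreadLocalRandom;

        public class QueueLoopSketch {
            public static void main(String[] args) {
                // Stand-in for the asker's queue of ten elements a..j
                Queue<String> queue = new LinkedList<>();
                for (char c = 'a'; c <= 'j'; c++) {
                    queue.add(String.valueOf(c));
                }

                int added = 0;
                // No Iterator: modifying the queue through poll()/add() alone
                // cannot throw ConcurrentModificationException.
                while (added < 3 && !queue.isEmpty()) {
                    int random = ThreadLocalRandom.current().nextInt(1, 3); // 1 or 2
                    if (random == 1) {
                        queue.poll();        // remove the head
                    } else {
                        queue.add("new");    // add a new element...
                        queue.poll();        // ...then remove the head, as in the original code
                        added++;             // stop once 3 elements have been added
                    }
                }
                System.out.println(queue);
            }
        }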

    Read the article

  • How to decide between a method or event?

    - by Wil
    I read something ages ago, I think by Jon Skeet (which I can't find now), saying that in IL all events get converted to methods. It was before I understood C#, so I didn't follow it all, but whether or not that is the gist of it, I was wondering if someone could explain, or point me to a resource that says, when to use an event over a method. Basically, if I want a big red/green status picture that is linked to a Bool field, and I want to change it based on the value of the bool, should I: a) Have a method called Changepicture which is linked to the field and changes the state of the bool and the picture. b) Have a get/set part to the field and stick an event in the set part. c) Have a get/set part to the field and stick a method in the set part. d) Other?
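
    Options (b) and (c) both amount to "react inside the setter"; the difference is whether the model calls the picture-changing code directly (a method) or merely announces the change and lets the UI respond (an event). The question is C#-specific and its answer is not in the excerpt, but as a rough, language-neutral sketch of option (b), here is the "raise a notification from the setter" shape in Java, with made-up names (StatusModel, the picture, the images are all assumptions):

        import java.beans.PropertyChangeListener;
        import java.beans.PropertyChangeSupport;

        // Option (b), roughly: the setter updates the field and fires a change
        // notification; the UI layer decides how to redraw the picture.
        public class StatusModel {
            private final PropertyChangeSupport changes = new PropertyChangeSupport(this);
            private boolean healthy;

            public void addListener(PropertyChangeListener listener) {
                changes.addPropertyChangeListener(listener);
            }

            public boolean isHealthy() {
                return healthy;
            }

            public void setHealthy(boolean value) {
                boolean old = this.healthy;
                this.healthy = value;
                changes.firePropertyChange("healthy", old, value); // no-op if unchanged
            }
        }

        // In the UI layer, swap the red/green image when notified, e.g.:
        // model.addListener(evt -> picture.setImage(
        //         (Boolean) evt.getNewValue() ? greenImage : redImage));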

    Read the article

  • SQL SERVER – Free Print Book on SQL Server Joes 2 Pros Kit

    - by pinaldave
    Rick Morelan and I were discussing earlier this month what we could give back to the community. We believe our books have been very successful and very well received by the community. The five books are a journey from novice to expert. The books have changed many lives and helped many get jobs as well as pass the SQL certifications. Rick is from Seattle, USA and I am from Bangalore, India. There is a 12-hour difference between us. We try to hold a weekly meeting to catch up on various personal and SQL related topics. Here is one of our recent conversations. Rick and Pinal Pinal: Good Morning Rick! Rick: Good Morning…err… Good Evening to you – Pinal! Pinal: Hey Rick, did you read the recent email which I sent you – one of our readers is thanking us for writing the Joes 2 Pros series. He wants to dedicate his success to us. Can you believe it? Rick: Yeah, he is very kind, but did you tell him that it is all because of his hard work in learning the subject, and that we had very little contribution to his success? Pinal: Absolutely, I told him the same – I said we just wrote the book, but it is he who learned from it and proved himself in his job. It is all him! We were just igniters. Rick: Good response. Pinal: Hey Rick! Are we doing enough for the community? What more can we do? Rick: Hmmm… Let us do something more. Pinal: Remember when we discussed the idea that if anyone buys our Joes 2 Pros Combo Kit in the next 2 weeks, we will send them SQL Wait Stats for free? What do you say? Rick: I agree! Great idea! Let us do it. Free Giveaway Well, Rick and I liked the idea of doing more. We have decided to give away a free SQL Server Wait Stats book to everybody who purchases the Joes 2 Pros Combo Kit between today (Oct 15, 2012) and Oct 26, 2012. This is not a contest or a lucky-winner opportunity. Everybody who participates will qualify for it. Combo Availability USA – Amazon | India - Flipkart | Indiaplaza Note 1: The USA kit contains 5 FREE DVDs. The India kit does not contain the 5 DVDs due to legal issues. Note 2: The Indian kit is priced at a special Indian economic price. Qualify for the Free Giveaway You must have purchased our Joes 2 Pros Combo Kit of 5 books between Oct 15, 2012 and Oct 26, 2012. Purchases before Oct 15, 2012 or after Oct 26, 2012 will not qualify for this giveaway. Send your original receipt (email, order details) to the following addresses: “[email protected];[email protected]” with the subject line “Joes 2 Pros Kit Promotion Free Offer”. Do not change the subject line or your email may be missed. Clearly mention your shipping address with phone number and pin/zip code. Send your receipt before Oct 30, 2012. We will not entertain any conversation after the Oct 30, 2012 cut-off date. The free books will be sent to USA and India addresses only. Availability USA - Amazon | India - Flipkart | Indiaplaza Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Book Review, SQLServer, T SQL, Technology

    Read the article

  • Silverlight Cream for January 14, 2011 -- #1027

    - by Dave Campbell
    In this Issue: Sigurd Snørteland, Yochay Kiriaty, WindowsPhoneGeek(-2-), Jesse Liberty(-2-), Kunal Chowdhury, Martin Krüger(-2-), Jonathan Cardy. Above the Fold: Silverlight: "Image Viewer using a GridSplitter control" Martin Krüger WP7: "Implementing WP7 ToggleImageControl from the ground up: Part1" WindowsPhoneGeek VS2010 Templates: "MVVM Project Templates for Visual Studio 2010" Jonathan Cardy From SilverlightCream.com: BabySmash7 - a WP7 children's game (source code included) Sigurd Snørteland not only brings Scott Hanselman's Baby Smash to WP7, but he's delivering the source to us as well as discussion of the app. Windows Push Notification Server Side Helper Library Yochay Kiriaty has a tutorial up on Push Notification... not explaining them, but a discussion of a WP7 Push Recipe that provides an easy way for sending all 3 kinds of push notification messages currently supported. Implementing WP7 ToggleImageControl from the ground up: Part1 WindowsPhoneGeek has a great 2-part series up on building a useful WP7 custom control -- a ToggleImage control... this part 1 is definition, deciding on Visual states, etc... buckle up... this is good stuff Implementing WP7 ToggleImageControl from the ground up: Part2 Part 2 in WindowsPhoneGeek's series is also up, and that's where the real fun lives -- implementing the behavior of the control... and the source is available at the end of this post. The Full Stack #5 – Entity Framework Code First Jesse Liberty has episode 5 of the "Full Stack" series that he and Jon Galloway are doing, discussing Entity Framework Code First. Windows Phone From Scratch #18 – MVVM Light Toolkit Soup To Nuts 3 Jesse Liberty also has part 3 of his MVVMLight and WP7 post up and is digging into messaging in this one... for example view <--> ViewModel communication. Exploring Ribbon Control for Silverlight (Part - 1) Kunal Chowdhury has part 1 of a series up on using the Silverlight Ribbon Control from DevComponents... lots of information and a great intro to a great control. Image Viewer using a GridSplitter control Martin Krüger has a very nice picture viewer up as a demo and code available that simply uses the GridSplitter to implement the aperture... check it out. How to: Gentle animation of a magnify effect Martin Krüger's latest is a take-off on a prior post he links to called 'just for fun' in which he smoothly animates a magnify effect... just very cool animation... explanation and source. MVVM Project Templates for Visual Studio 2010 Jonathan Cardy has a couple resources you probably wanna grab... two MVVM project templates for VS2010... one WPF and one Silverlight. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • Plan Operator Tuesday round-up

    - by Rob Farley
    Eighteen posts for T-SQL Tuesday #43 this month, discussing Plan Operators. I put them together and made the following clickable plan. It’s 1000px wide, so I hope you have a monitor wide enough. Let me explain this plan for you (people’s names are the links to the articles on their blogs – the same links as in the plan above). It was clearly a SELECT statement. Wayne Sheffield (@dbawayne) wrote about that, so we start with a SELECT physical operator, leveraging the logical operator Wayne Sheffield. The SELECT operator calls the Paul White operator, discussed by Jason Brimhall (@sqlrnnr) in his post. The Paul White operator is quite remarkable, and can consume three streams of data. Let’s look at those streams. The first pulls data from a Table Scan – Boris Hristov (@borishristov)’s post – using parallel threads (Bradley Ball – @sqlballs) that pull the data eagerly through a Table Spool (Oliver Asmus – @oliverasmus). A scalar operation is also performed on it, thanks to Jeffrey Verheul (@devjef)’s Compute Scalar operator. The second stream of data applies Evil (I figured that must mean a procedural TVF, but could’ve been anything), courtesy of Jason Strate (@stratesql). It performs this Evil on the merging of parallel streams (Steve Jones – @way0utwest), which suck data out of a Switch (Paul White – @sql_kiwi). This Switch operator is consuming data from up to four lookups, thanks to Kalen Delaney (@sqlqueen), Rick Krueger (@dataogre), Mickey Stuewe (@sqlmickey) and Kathi Kellenberger (@auntkathi). Unfortunately Kathi’s name is a bit long and has been truncated, just like in real plans. The last stream performs a join of two others via a Nested Loop (Matan Yungman – @matanyungman). One pulls data from a Spool (my post – @rob_farley) populated from a Table Scan (Jon Morisi). The other applies a catchall operator (the catchall is because Tamera Clark (@tameraclark) didn’t specify any particular operator, and a catchall is what gets shown when SSMS doesn’t know what to show. Surprisingly, it’s showing the yellow one, which is about cursors. Hopefully that’s not what Tamera planned, but anyway...) to the output from an Index Seek operator (Sebastian Meine – @sqlity). Lastly, I think everyone put in 110% effort, so that’s what all the operators cost. That didn’t leave anything for me, unfortunately, but that’s okay. Also, because he decided to use the Paul White operator, Jason Brimhall gets 0%, and his 110% was given to Paul’s Switch operator post. I hope you’ve enjoyed this T-SQL Tuesday, and have learned something extra about Plan Operators. Keep your eye out for next month’s one by watching the Twitter Hashtag #tsql2sday, and why not contribute a post to the party? Big thanks to Adam Machanic as usual for starting all this. @rob_farley

    Read the article

  • Roll your own free .NET technical conference

    - by Brian Schroer
    If you can’t get to a conference, let the conference come to you! There are a ton of free recorded conference presentations online… Microsoft TechEd Let’s start with the proverbial 800 pound gorilla. Recent TechEds have recorded the majority of presentations and made them available online the next day. Check out presentations from last month’s TechEd North America 2012 or last week’s TechEd Europe 2012. If you start at http://channel9.msdn.com/Events/TechEd, you can also drill down to presentations from prior years or from other regional TechEds (Australia, New Zealand, etc.) The top presentations from my “View Queue”: Damian Edwards: Microsoft ASP.NET and the Realtime Web (SignalR) Jennifer Smith: Design for Non-Designers Scott Hunter: ASP.NET Roadmap: One ASP.NET – Web Forms, MVC, Web API, and more Daniel Roth: Building HTTP Services with ASP.NET Web API Benjamin Day: Scrum Under a Waterfall NDC The Norwegian Developer Conference site has the most interesting presentations, in my opinion. You can find the videos from the June 2012 conference at that link. The 2011 and 2010 pages have a lot of presentations that are still relevant also. My View Queue Top 5: Shay Friedman: Roslyn... hmmmm... what? Hadi Hariri: Just ‘cause it’s JavaScript, doesn’t give you a license to write rubbish Paul Betts: Introduction to Rx Greg Young: How to get productive in a project in 24 hours Michael Feathers: Deep Design Lessons ØREDEV Travelling on from Norway to Sweden... I don’t know why, but the Scandinavians seem to have this conference thing figured out. ØREDEV happens each November, and you can find videos here and here. My View Queue Top 5: Marc Gravell: Web Performance Triage Robby Ingebretsen: Fonts, Form and Function: A Primer on Digital Typography Jon Skeet: Async 101 Chris Patterson: Hacking Developer Productivity Gary Short: .NET Collections Deep Dive aspConf - The Virtual ASP.NET Conference Formerly known as “mvcConf”, this one’s a little different. It’s a conference that takes place completely on the web. The next one’s happening July 17-18, and it’s not too late to register (It’s free!). Check out the recordings from February 2011 and July 2010. It’s two years old and talks about ASP.NET MVC2, but most of it is still applicable, and Jimmy Bogard’s Put Your Controllers On a Diet presentation is the most useful technical talk I have ever seen. CodeStock Videos from the 2011 edition of this Tennessee conference are available. Presentations from last month’s 2012 conference should be available soon here. I’m looking forward to watching Matt Honeycutt’s Build Your Own Application Framework with ASP.NET MVC 3. UserGroup.tv User Group.tv was founded in January of 2011 by Shawn Weisfeld, with the mission of providing User Group content online for free. You can search by date, group, speaker and category tags. My View Queue Top 5: Sergey Rathon & Ian Henehan: UI Test Automation with Selenium Rob Vettor: The Repository Pattern Latish Seghal: The .NET Ninja’s Toolbelt Amir Rajan: Get Things Done With Dynamic ASP.NET MVC Jeffrey Richter: .NET Nuggets – Houston TechFest Keynote

    Read the article

  • Speeding up procedural texture generation

    - by FalconNL
    Recently I've begun working on a game that takes place in a procedurally generated solar system. After a bit of a learning curve (having never worked with Scala, OpenGL ES 2, or Libgdx before), I have a basic tech demo going where you spin around a single procedurally textured planet: The problem I'm running into is the performance of the texture generation. A quick overview of what I'm doing: a planet is a cube that has been deformed to a sphere. To each side, an n x n (e.g. 256 x 256) texture is applied; the six face textures are bundled into one 8n x n texture that is sent to the fragment shader. The last two spaces are not used, they're only there to make sure the width is a power of 2. The texture is currently generated on the CPU, using the updated 2012 version of the simplex noise algorithm linked to in the paper 'Simplex noise demystified'. The scene I'm using to test the algorithm contains two spheres: the planet and the background. Both use a greyscale texture consisting of six octaves of 3D simplex noise, so for example if we choose 128x128 as the texture size there are 128 x 128 x 6 x 2 x 6 = about 1.2 million calls to the noise function. The closest you will get to the planet is about what's shown in the screenshot and since the game's target resolution is 1280x720 that means I'd prefer to use 512x512 textures. Combine that with the fact the actual textures will of course be more complicated than basic noise (There will be a day and night texture, blended in the fragment shader based on sunlight, and a specular mask. I need noise for continents, terrain color variation, clouds, city lights, etc.) and we're looking at something like 512 x 512 x 6 x 3 x 15 = 70 million noise calls for the planet alone. In the final game, there will be activities when traveling between planets, so a wait of 5 or 10 seconds, possibly 20, would be acceptable since I can calculate the texture in the background while traveling, though obviously the faster the better. Getting back to our test scene, performance on my PC isn't too terrible, though still too slow considering the final result is going to be about 60 times worse: 128x128 : 0.1s 256x256 : 0.4s 512x512 : 1.7s This is after I moved all performance-critical code to Java, since trying to do so in Scala was a lot worse. Running this on my phone (a Samsung Galaxy S3), however, produces a more problematic result: 128x128 : 2s 256x256 : 7s 512x512 : 29s Already far too long, and that's not even factoring in the fact that it'll be minutes instead of seconds in the final version. Clearly something needs to be done. Personally, I see a few potential avenues, though I'm not particularly keen on any of them yet: Don't precalculate the textures, but let the fragment shader calculate everything. Probably not feasible, because at one point I had the background as a fullscreen quad with a pixel shader and I got about 1 fps on my phone. Use the GPU to render the texture once, store it and use the stored texture from then on. Upside: might be faster than doing it on the CPU since the GPU is supposed to be faster at floating point calculations. Downside: effects that cannot (easily) be expressed as functions of simplex noise (e.g. gas planet vortices, moon craters, etc.) are a lot more difficult to code in GLSL than in Scala/Java. Calculate a large amount of noise textures and ship them with the application. I'd like to avoid this if at all possible. Lower the resolution. Buys me a 4x performance gain, which isn't really enough, plus I lose a lot of quality. 
Find a faster noise algorithm. If anyone has one I'm all ears, but simplex is already supposed to be faster than perlin. Adopt a pixel art style, allowing for lower resolution textures and fewer noise octaves. While I originally envisioned the game in this style, I've come to prefer the realistic approach. I'm doing something wrong and the performance should already be one or two orders of magnitude better. If this is the case, please let me know. If anyone has any suggestions, tips, workarounds, or other comments regarding this problem I'd love to hear them.
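
    The "calculate the texture in the background while traveling" idea maps naturally onto a worker thread: bake the 8n x n greyscale atlas into a byte buffer off the render thread, then hand it to the GL thread for upload once it is ready (GL contexts are bound to the thread that created them). A minimal Java sketch of that shape, with a placeholder noise() call standing in for the poster's simplex implementation and the cube-face 3D coordinate mapping left out:

        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;
        import java.util.concurrent.Future;

        // Sketch: bake the 8n x n face atlas on a background thread so the render
        // thread stays responsive; upload the bytes to OpenGL only when done.
        public class PlanetTextureBaker {
            private final ExecutorService worker = Executors.newSingleThreadExecutor();

            public Future<byte[]> bakeAsync(int n, int octaves) {
                return worker.submit(() -> bake(n, octaves));
            }

            private byte[] bake(int n, int octaves) {
                byte[] pixels = new byte[8 * n * n]; // 6 faces used, 2 spare, 1 byte per texel
                for (int face = 0; face < 6; face++) {
                    for (int y = 0; y < n; y++) {
                        for (int x = 0; x < n; x++) {
                            double value = 0, amplitude = 1, frequency = 1;
                            for (int o = 0; o < octaves; o++) { // fractal sum of octaves
                                value += amplitude * noise(face, x * frequency / n, y * frequency / n);
                                amplitude *= 0.5;
                                frequency *= 2;
                            }
                            // map roughly [-2, 2] to an unsigned 0..255 luminance byte
                            int grey = (int) Math.max(0, Math.min(255, 128 + 64 * value));
                            pixels[face * n + x + y * 8 * n] = (byte) grey;
                        }
                    }
                }
                return pixels;
            }

            // Placeholder: substitute a real 3D simplex call on the deformed cube face.
            private double noise(int face, double u, double v) {
                return 0;
            }
        }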

    Read the article

  • eSTEP Newsletter November 2012

    - by uwes
    Dear Partners,
    We would like to inform you that the November '12 issue of our Newsletter is now available. The issue contains information on the following topics:
    News from Corp: Oracle Celebrates 25 Years of SPARC Innovation; IDC White Paper Finds Growing Customer Comfort with Oracle Solaris Operating System; Oracle Buys Instantis; Pillar Axiom OpenWorld Highlights; Announcement Oracle Solaris 11.1 Availability (data sheet, new features, FAQs, corporate pages, internal blog, download links, Oracle shop); Announcing StorageTek VSM 6; Announcement Oracle Solaris Cluster 4.1 Availability (new features, FAQs, cluster corp page, download site, shop for media); Announcement: Oracle Database Appliance 2.4 patch update becomes available.
    Technical Section: Oracle White papers on SPARC SuperCluster; Understanding Parallel Execution; With LTFS, Tape is Gaining Storage Ground, with an additional link to How to Create Oracle Solaris 11 Zones with Oracle Enterprise Manager Ops Center; Provisioning Capabilities of Oracle Enterprise Manager Ops Center 12c; Maximizing your SPARC T4 Oracle Solaris Application Performance, with the following articles: SPARC T4 Servers Set World Record on Siebel CRM 8.1.1.4 Benchmark, SPARC T4-Based Highly Scalable Solutions Posts New World Record on SPECjEnterprise2010 Benchmark, SPARC T4 Server Delivers Outstanding Performance on Oracle Business Intelligence Enterprise Edition 11g; Oracle SUN ZFS Storage Appliance Reference Architecture for VMware vSphere4; Why 4K? - George Wilson's ZFS Day Talk; Pillar Axiom 600, with connected subjects: Oracle Introduces Pillar Axiom Release 5 Storage System Software, Driving down the high cost of Storage, Thin Provisioning with Pillar Axiom 600, Pillar Axiom 600 - System overview and architecture; Migrate to Oracle's SPARC Systems; Top 5 Reasons to Migrate to Oracle's SPARC Systems.
    Learning & Events: Recently delivered Techcasts: Learning Paths; Oracle Database 11g: Database Administration (New) - Learning Path; Webcast: Drill Down on Disaster Recovery; What are Oracle Users Doing to Improve Availability and Disaster Recovery; SAP NetWeaver and Oracle Exadata Database Machine.
    References: ARTstor Selects Oracle’s Sun ZFS Storage 7420 Appliances To Support Rapidly Growing Digital Image Library; Scottish Widows Cuts Sales Administration 20%, Reduces Time to Prepare Reports by 75%, and Achieves Return on Investment in First Year; Oracle's CRM Cloud Service Powers Innovation: Applications on Demand, Technology on Demand.
    How to: How to Migrate Your Data to Oracle Solaris 11 Using Shadow Migration; Using svcbundle to Create SMF Manifests and Profiles in Oracle Solaris 11; How to Prepare a Sun ZFS Storage Appliance to Serve as a Storage Device with Oracle Enterprise Manager Ops Center 12c; Command Summary: Basic Operations with the Image Packaging System in Oracle Solaris 11; How to Update to Oracle Solaris 11.1 Using the Image Packaging System; How to Migrate Oracle Database from Oracle Solaris 8 to Oracle Solaris 11; Setting Up, Configuring, and Using an Oracle WebLogic Server Cluster; Ease the Chaos with Automated Patching: Oracle Enterprise Manager Cloud Control 12c; Book excerpt: Oracle Exalogic Elastic Cloud Handbook.
    You will find the Newsletter on our portal under eSTEP News ---> Latest Newsletter. You will need to provide your email address and the PIN below to get access. The link to the portal is shown below.
    URL: http://launch.oracle.com/
    PIN: eSTEP_2011
    Previously published Newsletters can be found under the Archived Newsletters section, and more useful information under the Events, Download and Links tabs. Feel free to explore, and any feedback is appreciated to help us improve the service and information we deliver.
    Thanks and best regards,
    Partner HW Enablement EMEA

    Read the article

  • Inside Red Gate - Experimenting In Public

    - by Simon Cooper
    Over the next few weeks, we'll be performing experiments on SmartAssembly to confirm or refute various hypotheses we have about how people use the product, what is stopping them from using it to its full extent, and what we can change to make it more useful and easier to use. Some of these experiments can be done within the team, some within Red Gate, and some need to be done on external users. External testing Some external testing can be done by standard usability tests and surveys, however, there are some hypotheses that can only be tested by building a version of SmartAssembly with some things in the UI or implementation changed. We'll then be able to look at how the experimental build is used compared to the 'mainline' build, which forms our baseline or control group, and use this data to confirm or refute the relevant hypotheses. However, there are several issues we need to consider before running experiments using separate builds: Ideally, the user wouldn't know they're running an experimental SmartAssembly. We don't want users to use the experimental build like it's an experimental build, we want them to use it like it's the real mainline build. Only then will we get valid, useful, and informative data concerning our hypotheses. There's no point running the experiments if we can't find out what happens after the download. To confirm or refute some of our hypotheses, we need to find out how the tool is used once it is installed. Fortunately, we've applied feature usage reporting to the SmartAssembly codebase itself to provide us with that information. Of course, this then makes the experimental data conditional on the user agreeing to send that data back to us in the first place. Unfortunately, even though this does limit the amount of useful data we'll be getting back, and possibly skew the data, there's not much we can do about this; we don't collect feature usage data without the user's consent. Looks like we'll simply have to live with this. What if the user tries to buy the experiment? This is something that isn't really covered by the Lean Startup book; how do you support users who give you money for an experiment? If the experiment is a new feature, and the user buys a license for SmartAssembly based on that feature, then what do we do if we later decide to pivot & scrap that feature? We've either got to spend time and money bringing that feature up to production quality and into the mainline anyway, or we've got disgruntled customers. Either way is bad. Again, there's not really any good solution to this. Similarly, what if we've removed some features for an experiment and a potential new user downloads the experimental build? (As I said above, there's no indication the build is an experimental build, as we want to see what users really do with it). The crucial feature they need is missing, causing a bad trial experience, a lost potential customer, and a lost chance to help the customer with their problem. Again, this is something not really covered by the Lean Startup book, and something that doesn't have a good solution. So, some tricky issues there, not all of them with nice easy answers. Turns out the practicalities of running Lean Startup experiments are more complicated than they first seem!

    Read the article

  • VirtualBox image SOA Suite &amp; BPM Suite 11.1.1.6.0 & Your feedback?

    - by JuergenKress
    The integration PM team is very pleased to announce the release of a new version of our pre-configured SOA/BPM VirtualBox image for testing and evaluation. This VirtualBox appliance contains a fully configured, ready-to-use SOA/BPM/Webcenter 11.1.1.6.0 installation. All you need is to install Oracle VM VirtualBox on your desktop/laptop and import the SOA/BPM appliance, and you are ready to try out SOA Suite and BPM Suite -- no installation and configuration required! The following software is installed in this VirtualBox image: Oracle Enterprise Linux (64-bit) EL 5 Update 5, Oracle XE Database 11.2.0, Oracle SOA Suite 11.1.1.6.0 (includes Service Bus), Oracle BPM Suite 11.1.1.6.0, Oracle Webcenter Content (Enterprise Content Management) 11.1.1.6.0, Oracle Webcenter Suite 11.1.1.6.0, Oracle JDeveloper 11.1.1.6.0, JRockit R28.2.0-79-146777-1.6.0_29s, Sun Java SDK 1.6.0_29-b11. If you want to try it out, please go to the Pre-built Virtual Machine for SOA Suite and BPM Suite 11g OTN page for detailed instructions on downloading and importing the VirtualBox image. Jon Petter Hjulstad (Twitter & LinkedIn) published his first impressions on his blog: We have been waiting for the new VirtualBox image for a long time, and finally it is here. The appliance has improved in many ways since the last release, so it has been worth waiting for. Both the appliance itself and the documentation are excellent. It is evident that Oracle has listened to feedback on the previous release, and I think the developer VMs are useful. Especially the adoption of new patchsets and versions (e.g. when 12c becomes available) will gain a lot from quickly getting hands-on experience. This VirtualBox appliance is a multipurpose image which can be used in different domain configurations. The image has a number of pre-configured domains that you can use depending on your need. The image can be set up so that it requires as few resources as possible; you can, for instance, easily disable B2B if you do not need it, or you can shut down the desktop console and save 600MB. It is important to say that this image is not for production purposes. Read the full article SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Technorati Tags: SOA Suite Image,VirtualBox,BPM suite Image,SOA Specialization award,SOA Community,Oracle SOA,Oracle BPM,BPM Community,OPN,Jürgen Kress

    Read the article

  • SQL SERVER – Last Two Days to Get FREE Book – Joes 2 Pros Certification 70-433

    - by pinaldave
    Earlier this week we announced that we will be giving away a FREE SQL Wait Stats book to everybody who gets the SQL Server Joes 2 Pros Combo Kit. We had a fantastic response to the contest and an overwhelming response to the offer. We knew there would be a great response, but we want to honestly say thank you to all of you for making it happen. Rick and I want to make sure that we express our special thanks to all of you who are reading our books. The offer is still on and there are two more days to avail of it. To everybody who buys our best-selling combo kit, we will send our other most popular book, SQL Wait Stats. Please read all the details of the offer here. The books are great resources for anyone who wants to learn SQL Server from the fundamentals and eventually go on the certification path of 70-433. Exam 70-433 covers the following important subjects, and the book covers them from the fundamentals. Whether or not you are taking the exam, this book helps every SQL developer learn the subject from the fundamentals. Create and alter tables. Create and alter views. Create and alter indexes. Create and modify constraints. Implement data types. Implement partitioning solutions. Create and alter stored procedures. Create and alter user-defined functions (UDFs). Create and alter DML triggers. Create and alter DDL triggers. Create and deploy CLR-based objects. Implement error handling. Manage transactions. Query data by using SELECT statements. Modify data by using INSERT, UPDATE, and DELETE statements. Return data by using the OUTPUT clause. Modify data by using MERGE statements. Implement aggregate queries. Combine datasets (INTERSECT, EXCEPT). Implement subqueries. Implement CTE (common table expression) queries. Apply ranking functions. Control execution plans. Manage international considerations. Integrate Database Mail. Implement full-text search. Implement scripts by using Windows PowerShell and SQL Server Management Objects (SMOs). Implement Service Broker solutions. Track data changes (data capture). Retrieve relational data as XML. Transform XML data into relational data. Manage XML data. Capture execution plans. Collect output from the Database Engine Tuning Advisor. Collect information from system metadata. Availability of Book USA - Amazon | India - Flipkart | Indiaplaza Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • June 23, 1983: First Successful Test of the Domain Name System [Geek History]

    - by Jason Fitzpatrick
    Nearly 30 years ago the first Domain Name System (DNS) was tested, and it changed the way we interacted with the internet. Nearly-impossible-to-remember numeric addresses became easy-to-remember names. Without DNS you’d be browsing a web where numbered addresses pointed to numbered addresses. Google, for example, would look like http://209.85.148.105/ in your browser window. That’s assuming, of course, that a numbers-based web ever gained enough traction to be popular enough to spawn a search giant like Google. How did this shift occur and what did we have before DNS? From Wikipedia: The practice of using a name as a simpler, more memorable abstraction of a host’s numerical address on a network dates back to the ARPANET era. Before the DNS was invented in 1983, each computer on the network retrieved a file called HOSTS.TXT from a computer at SRI. The HOSTS.TXT file mapped names to numerical addresses. A hosts file still exists on most modern operating systems by default and generally contains a mapping of the IP address 127.0.0.1 to “localhost”. Many operating systems use name resolution logic that allows the administrator to configure selection priorities for available name resolution methods. The rapid growth of the network made a centrally maintained, hand-crafted HOSTS.TXT file unsustainable; it became necessary to implement a more scalable system capable of automatically disseminating the requisite information. At the request of Jon Postel, Paul Mockapetris invented the Domain Name System in 1983 and wrote the first implementation. The original specifications were published by the Internet Engineering Task Force in RFC 882 and RFC 883, which were superseded in November 1987 by RFC 1034 and RFC 1035. Several additional Requests for Comments have proposed various extensions to the core DNS protocols. Over the years it has been refined, but the core of the system is essentially the same. When you type “google.com” into your web browser, a DNS server is used to resolve that host name to the IP address of 209.85.148.105, making the web human-friendly in the process. Domain Name System History [Wikipedia via Wired]
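
    The final name-to-address step the article describes is the same one any program triggers through the operating system's resolver (hosts file first, then DNS). A tiny Java illustration, with the caveat that the address returned today will not be the 2011-era 209.85.148.105:

        import java.net.InetAddress;
        import java.net.UnknownHostException;

        // Resolve a host name the way the article describes: the OS resolver
        // consults the hosts file and then DNS, and hands back an IP address.
        public class ResolveExample {
            public static void main(String[] args) throws UnknownHostException {
                InetAddress address = InetAddress.getByName("google.com");
                System.out.println(address.getHostAddress());
            }
        }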

    Read the article

  • Sustainability Activities at Oracle OpenWorld

    - by Evelyn Neumayr
    Close to 50,000 participants will come to San Francisco for Oracle OpenWorld and JavaOne events, held September 30-October 4, 2012 at Moscone Center. Oracle is very conscious of the impact that these events have on the environment and, as part of its ongoing commitment to sustainability, has developed a sustainable event program-now in its fifth year-that aims to maximize positive benefits and minimize negative impacts in a variety of ways. Click here for more details. At the Oracle OpenWorld conference, there will be many sessions and even a hands-on lab which discuss the sustainability solutions that Oracle provides for our customers. I wanted to highlight a few of those sessions here so if you will be at Oracle OpenWorld, you can make sure to attend them. One of the most compelling sessions promises to be our “Eco-Enterprise Innovation Awards and the Business Case for Sustainability” session on Wednesday, October 3 from 10:15 a.m. to 11:15 a.m. in Moscone West 3005. Oracle Chairman of the Board Jeff Henley, Chief Sustainability Officer Jon Chorley, and other Oracle executives will honor select customers with Oracle's Eco-Enterprise Innovation award. This award recognizes customers and their respective partners who rely on Oracle products to support their green business practices in order to reduce their environmental impact, while improving business efficiencies and reducing costs. Another interesting session is the “Tracking, Reporting, and Reducing Environmental Impact with Oracle Solutions” which occurs on Monday, October 1 from 4:45 p.m. to 5:45 p.m. in Moscone West Room 2022. This session covers Oracle’s overall sustainability strategy as well as Oracle Environmental Accounting and Reporting (EA&R), which leverages Oracle ERP and BI solutions for accurate, efficient tracking of energy, emissions, and other environmental data. If you want more details, make sure to visit the hands-on lab titled “Oracle Environmental Accounting & Reporting for Integrated Sustainability Reporting”. This hour-long lab will take place on Tuesday, October 2 at 5:00 p.m. in the Marriott Marquis Hotel-Nob Hill CD. Here you can learn how to use Oracle EA&R to collect sustainability-related data in an efficient and reliable manner as part of existing business processes in Oracle E-Business Suite or JD Edwards Enterprise One. Register for this hands-on lab here.  

    Read the article

  • C# 5: At last, async without the pain

    - by Alex.Davies
    For me, the best feature in Visual Studio 11 is the async and await keywords that come with C# 5. I am a big fan of asynchronous programming: it frees up resources, in particular the thread that a piece of code needs to run in. That lets that thread run something else, while waiting for your long-running operation to complete. That's really important if that thread is the UI thread, or if it's holding a lock because it accesses some data structure. Before C# 5, I think I was about the only person in the world who really cared about asynchronous programming. The trouble was that you had to go to extreme lengths to make code asynchronous. I would forever be writing methods that, instead of returning a value, accepted an extra argument that is a "continuation". Then, when calling the method, I'd have to pass a lambda in to it, which contained all the stuff that needed to happen after the method finished. Here is a real snippet of code that is in .NET Demon:

        m_BuildControl.FilterEnabledForBuilding(
            projects,
            enabledProjects => m_OutOfDateProjectFinder.FilterNeedsBuilding(
                enabledProjects,
                newDirtyProjects =>
                {
                    // Mark any currently broken projects as dirty
                    newDirtyProjects.UnionWith(m_BrokenProjects);
                    // Copy what we found into the set of dirty things
                    m_DirtyProjects = newDirtyProjects;
                    RunSomeBuilds();
                }));

    It's just obtuse. Who puts a lambda inside a lambda like that? Well, me obviously. But surely enabledProjects should just be the return value of FilterEnabledForBuilding? And newDirtyProjects should just be the return value of FilterNeedsBuilding? C# 5 async/await lets you write asynchronous code without it looking so stupid. Here's what I plan to change that code to, once we upgrade to VS 11:

        var enabledProjects = await m_BuildControl.FilterEnabledForBuilding(projects);
        var newDirtyProjects = await m_OutOfDateProjectFinder.FilterNeedsBuilding(enabledProjects);
        // Mark any currently broken projects as dirty
        newDirtyProjects.UnionWith(m_BrokenProjects);
        // Copy what we found into the set of dirty things
        m_DirtyProjects = newDirtyProjects;
        RunSomeBuilds();

    Much easier to read! But how is this the same code? If we were on the UI thread, doesn't the UI thread have to block while FilterEnabledForBuilding runs? No, it doesn't, and that's the magic of the await keyword! It cuts your method up into its constituent pieces, much like I did manually with lambdas before. When you run it, only the piece up to the first await actually runs. The rest is passed to FilterEnabledForBuilding as a continuation, which will get called back whenever that method is finished. In the meantime, our thread returns, and can go back to making the UI responsive, or whatever else threads do in their spare time. This is actually a massive simplification, and if you're interested in all the gory details, and speed hacks that the await keyword actually does for you, I recommend Jon Skeet's blog posts about it.
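
    (For readers outside C#: the continuation-passing shape the post starts from is what Java's CompletableFuture still spells out explicitly, since Java has no await and the "rest of the method" must be written as chained callbacks. A rough sketch of the same pipeline for comparison only; the types and method bodies here are invented, not the post's code.)

        import java.util.Set;
        import java.util.concurrent.CompletableFuture;

        // Java analogue of the continuation chain: each stage runs when the
        // previous asynchronous result arrives, and the calling thread is free
        // in the meantime. Names mirror the .NET Demon snippet for comparison.
        public class BuildPipelineSketch {
            CompletableFuture<Set<String>> filterEnabledForBuilding(Set<String> projects) {
                return CompletableFuture.supplyAsync(() -> projects); // stand-in for real work
            }

            CompletableFuture<Set<String>> filterNeedsBuilding(Set<String> enabledProjects) {
                return CompletableFuture.supplyAsync(() -> enabledProjects); // stand-in for real work
            }

            void rebuildDirtyProjects(Set<String> projects, Set<String> brokenProjects) {
                filterEnabledForBuilding(projects)
                        .thenCompose(this::filterNeedsBuilding)
                        .thenAccept(newDirtyProjects -> {
                            // Mark any currently broken projects as dirty
                            newDirtyProjects.addAll(brokenProjects);
                            runSomeBuilds(newDirtyProjects);
                        });
                // Control returns here immediately; the continuations run later.
            }

            void runSomeBuilds(Set<String> dirtyProjects) { /* kick off builds */ }
        }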

    Read the article

  • Inside Red Gate - Exercising Externally

    - by simonc
    Over the next few weeks, we'll be performing experiments on SmartAssembly to confirm or refute various hypotheses we have about how people use the product, what is stopping them from using it to its full extent, and what we can change to make it more useful and easier to use. Some of these experiments can be done within the team, some within Red Gate, and some need to be done on external users. External testing Some external testing can be done by standard usability tests and surveys, however, there are some hypotheses that can only be tested by building a version of SmartAssembly with some things in the UI or implementation changed. We'll then be able to look at how the experimental build is used compared to the 'mainline' build, which forms our baseline or control group, and use this data to confirm or refute the relevant hypotheses. However, there are several issues we need to consider before running experiments using separate builds: Ideally, the user wouldn't know they're running an experimental SmartAssembly. We don't want users to use the experimental build like it's an experimental build, we want them to use it like it's the real mainline build. Only then will we get valid, useful, and informative data concerning our hypotheses. There's no point running the experiments if we can't find out what happens after the download. To confirm or refute some of our hypotheses, we need to find out how the tool is used once it is installed. Fortunately, we've applied feature usage reporting to the SmartAssembly codebase itself to provide us with that information. Of course, this then makes the experimental data conditional on the user agreeing to send that data back to us in the first place. Unfortunately, even though this does limit the amount of useful data we'll be getting back, and possibly skew the data, there's not much we can do about this; we don't collect feature usage data without the user's consent. Looks like we'll simply have to live with this. What if the user tries to buy the experiment? This is something that isn't really covered by the Lean Startup book; how do you support users who give you money for an experiment? If the experiment is a new feature, and the user buys a license for SmartAssembly based on that feature, then what do we do if we later decide to pivot & scrap that feature? We've either got to spend time and money bringing that feature up to production quality and into the mainline anyway, or we've got disgruntled customers. Either way is bad. Again, there's not really any good solution to this. Similarly, what if we've removed some features for an experiment and a potential new user downloads the experimental build? (As I said above, there's no indication the build is an experimental build, as we want to see what users really do with it). The crucial feature they need is missing, causing a bad trial experience, a lost potential customer, and a lost chance to help the customer with their problem. Again, this is something not really covered by the Lean Startup book, and something that doesn't have a good solution. So, some tricky issues there, not all of them with nice easy answers. Turns out the practicalities of running Lean Startup experiments are more complicated than they first seem! Cross posted from Simple Talk.

    Read the article

  • eSTEP Newsletter November 2012

    - by mseika
    Dear Partners,
    We would like to inform you that the November '12 issue of our Newsletter is now available. The issue contains information on the following topics:
    News from Corp: Oracle Celebrates 25 Years of SPARC Innovation; IDC White Paper Finds Growing Customer Comfort with Oracle Solaris Operating System; Oracle Buys Instantis; Pillar Axiom OpenWorld Highlights; Announcement Oracle Solaris 11.1 Availability (data sheet, new features, FAQs, corporate pages, internal blog, download links, Oracle shop); Announcing StorageTek VSM 6; Announcement Oracle Solaris Cluster 4.1 Availability (new features, FAQs, cluster corp page, download site, shop for media); Announcement: Oracle Database Appliance 2.4 patch update becomes available.
    Technical Section: Oracle White papers on SPARC SuperCluster; Understanding Parallel Execution; With LTFS, Tape is Gaining Storage Ground, with an additional link to How to Create Oracle Solaris 11 Zones with Oracle Enterprise Manager Ops Center; Provisioning Capabilities of Oracle Enterprise Manager Ops Center 12c; Maximizing your SPARC T4 Oracle Solaris Application Performance, with the following articles: SPARC T4 Servers Set World Record on Siebel CRM 8.1.1.4 Benchmark, SPARC T4-Based Highly Scalable Solutions Posts New World Record on SPECjEnterprise2010 Benchmark, SPARC T4 Server Delivers Outstanding Performance on Oracle Business Intelligence Enterprise Edition 11g; Oracle SUN ZFS Storage Appliance Reference Architecture for VMware vSphere4; Why 4K? - George Wilson's ZFS Day Talk; Pillar Axiom 600, with connected subjects: Oracle Introduces Pillar Axiom Release 5 Storage System Software, Driving down the high cost of Storage, Thin Provisioning with Pillar Axiom 600, Pillar Axiom 600 - System overview and architecture; Migrate to Oracle's SPARC Systems; Top 5 Reasons to Migrate to Oracle's SPARC Systems.
    Learning & Events: Recently delivered Techcasts: Learning Paths; Oracle Database 11g: Database Administration (New) - Learning Path; Webcast: Drill Down on Disaster Recovery; What are Oracle Users Doing to Improve Availability and Disaster Recovery; SAP NetWeaver and Oracle Exadata Database Machine.
    References: ARTstor Selects Oracle’s Sun ZFS Storage 7420 Appliances To Support Rapidly Growing Digital Image Library; Scottish Widows Cuts Sales Administration 20%, Reduces Time to Prepare Reports by 75%, and Achieves Return on Investment in First Year; Oracle's CRM Cloud Service Powers Innovation: Applications on Demand, Technology on Demand.
    How to: How to Migrate Your Data to Oracle Solaris 11 Using Shadow Migration; Using svcbundle to Create SMF Manifests and Profiles in Oracle Solaris 11; How to Prepare a Sun ZFS Storage Appliance to Serve as a Storage Device with Oracle Enterprise Manager Ops Center 12c; Command Summary: Basic Operations with the Image Packaging System in Oracle Solaris 11; How to Update to Oracle Solaris 11.1 Using the Image Packaging System; How to Migrate Oracle Database from Oracle Solaris 8 to Oracle Solaris 11; Setting Up, Configuring, and Using an Oracle WebLogic Server Cluster; Ease the Chaos with Automated Patching: Oracle Enterprise Manager Cloud Control 12c; Book excerpt: Oracle Exalogic Elastic Cloud Handbook.
    You will find the Newsletter on our portal under eSTEP News ---> Latest Newsletter. You will need to provide your email address and the PIN below to get access. The link to the portal is shown below.
    URL: http://launch.oracle.com/
    PIN: eSTEP_2011
    Previously published Newsletters can be found under the Archived Newsletters section, and more useful information under the Events, Download and Links tabs. Feel free to explore, and any feedback is appreciated to help us improve the service and information we deliver.
    Thanks and best regards,
    Partner HW Enablement EMEA

    Read the article

  • The Growing Importance of Network Virtualization

    - by user12608550
    The Growing Importance of Network Virtualization We often focus on server virtualization when we discuss cloud computing, but just as often we neglect to consider some of the critical implications of that technology. The ability to create virtual environments (or VEs [1]) means that we can create, destroy, activate and deactivate, and more importantly, MOVE them around within the cloud infrastructure. This elasticity and mobility has profound implications for how network services are defined, managed, and used to provide cloud services. It's not just servers that benefit from virtualization, it's the network as well. Network virtualization is becoming a hot topic, and not just for discussion but for companies like Oracle and others who have recently acquired net virtualization companies [2,3]. But even before this topic became so prominent, Solaris engineers were working on technologies in Solaris 11 to virtualize network services, known as Project Crossbow [4]. And why is network virtualization so important? Because old assumptions about network devices, topology, and management must be re-examined in light of the self-service, elasticity, and resource sharing requirements of cloud computing infrastructures. Static, hierarchical network designs, and inter-system traffic flows, need to be reconsidered and quite likely re-architected to take advantage of new features like virtual NICs and switches, bandwidth control, load balancing, and traffic isolation. For example, traditional multi-tier Web services (Web server, App server, DB server) that share net traffic over Ethernet wires can now be virtualized and hosted on shared-resource systems that communicate within a larger server at system bus speeds, increasing performance and reducing wired network traffic. And virtualized traffic flows can be monitored and adjusted as needed to optimize network performance for dynamically changing cloud workloads. Additionally, as VEs come and go and move around in the cloud, static network configuration methods cannot easily accommodate the routing and addressing flexibility that VE mobility implies; virtualizing the network itself is a requirement. Oracle Solaris 11 [5] includes key network virtualization technologies needed to implement cloud computing infrastructures. It includes features for the creation and management of virtual NICs and switches, and for the allocation and control of the traffic flows among VEs [6]. Additionally it allows for both sharing and dedication of hardware components to network tasks, such as allocating specific CPUs and vNICs to VEs, and even protocol-specific management of traffic. So, have a look at your current network topology and management practices in view of evolving cloud computing technologies. And don't simply duplicate the physical architecture of servers and connections in a virtualized environment…rethink the traffic flows among VEs and how they can be optimized using Oracle Solaris 11 and other Oracle products and services. [1] I use the term "virtual environment" or VE here instead of the more commonly used "virtual machine" or VM, because not all virtualized operating system environments are full OS kernels under the control of a hypervisor…in other words, not all VEs are VMs. In particular, VEs include Oracle Solaris zones, as well as SPARC VMs (previously called LDoms), and x86-based Solaris and Linux VMs running under hypervisors such as OEL, Xen, KVM, or VMware. 
[2] Oracle follows VMware into network virtualization space with Xsigo purchase; http://www.mercurynews.com/business/ci_21191001/oracle-follows-vmware-into-network-virtualization-space-xsigo [3] Oracle Buys Xsigo; http://www.oracle.com/us/corporate/press/1721421 [4] Oracle Solaris 11 Networking Virtualization Technology, http://www.oracle.com/technetwork/server-storage/solaris11/technologies/networkvirtualization-312278.html [5] Oracle Solaris 11; http://www.oracle.com/us/products/servers-storage/solaris/solaris11/overview/index.html [6] For example, the Solaris 11 'dladm' command can be used to limit the bandwidth of a virtual NIC, as follows: dladm create-vnic -l net0 -p maxbw=100M vnic0

    Read the article

  • Starter question of declarative style SQLAlchemy relation()

    - by jfding
    I am quite new to SQLAlchemy, or even database programming, maybe my question is too simple. Now I have two class/table: class User(Base): __tablename__ = 'users' id = Column(Integer, primary_key=True) name = Column(String(40)) ... class Computer(Base): __tablename__ = 'comps' id = Column(Integer, primary_key=True) buyer_id = Column(None, ForeignKey('users.id')) user_id = Column(None, ForeignKey('users.id')) buyer = relation(User, backref=backref('buys', order_by=id)) user = relation(User, backref=backref('usings', order_by=id)) Of course, it cannot run. This is the backtrace: File "/Library/Python/2.6/site-packages/SQLAlchemy-0.5.8-py2.6.egg/sqlalchemy/orm/state.py", line 71, in initialize_instance fn(self, instance, args, kwargs) File "/Library/Python/2.6/site-packages/SQLAlchemy-0.5.8-py2.6.egg/sqlalchemy/orm/mapper.py", line 1829, in _event_on_init instrumenting_mapper.compile() File "/Library/Python/2.6/site-packages/SQLAlchemy-0.5.8-py2.6.egg/sqlalchemy/orm/mapper.py", line 687, in compile mapper._post_configure_properties() File "/Library/Python/2.6/site-packages/SQLAlchemy-0.5.8-py2.6.egg/sqlalchemy/orm/mapper.py", line 716, in _post_configure_properties prop.init() File "/Library/Python/2.6/site-packages/SQLAlchemy-0.5.8-py2.6.egg/sqlalchemy/orm/interfaces.py", line 408, in init self.do_init() File "/Library/Python/2.6/site-packages/SQLAlchemy-0.5.8-py2.6.egg/sqlalchemy/orm/properties.py", line 716, in do_init self._determine_joins() File "/Library/Python/2.6/site-packages/SQLAlchemy-0.5.8-py2.6.egg/sqlalchemy/orm/properties.py", line 806, in _determine_joins "many-to-many relation, 'secondaryjoin' is needed as well." % (self)) sqlalchemy.exc.ArgumentError: Could not determine join condition between parent/child tables on relation Package.maintainer. Specify a 'primaryjoin' expression. If this is a many-to-many relation, 'secondaryjoin' is needed as well. There's two foreign keys in class Computer, so the relation() callings cannot determine which one should be used. I think I must use extra arguments to specify it, right? And howto? Thanks

    Read the article

  • Anyone got a nifty credit expiry algorithm?

    - by garethkeenan
    Our website uses a credit system to allow users to purchase inexpensive digital goods (eg. photos). We use credits, rather than asking the user to pay for items individually, because the items are cheap and we are trying to keep our credit-card/PayPal overhead low. Because we aren't a bank, we have to expire credits after a certain amount of time. We expire deposit credits after a year, but other types of credits (bonuses, prizes, refunds) may have a different shelf-life. When a buyer buys an item, we spend the credit that is going to expire first. Our current system keeps track of every deposit by storing the original value and the remainder to be spent. We keep a list of all purchases as well, of course. I am currently moving to a system which is much more like a traditional double-entry accounting system. A deposit will create a ledger item, increasing the user's 'spending' account balance. Every purchase will also create a ledger item, decreasing the user's 'spending' account balance. The new system has running balances, while the old system does not, which greatly improves our ability to find problems and do reconciliations. We do not want to use the old system of keeping a 'remainder' value attached to each deposit record because it is inefficient to replay a user's activities to calculate what the remainder of each deposit is over time (for the user's statement). So, after all of this verbose introduction, my question is "Does anyone else out there have a similar system of expiring credits?" If you could describe how you calculate expired credits it would be a great help. If all expired credits had the exact same shelf life, we would be able to calculate the expired amount using: Total Deposits - Total Spending - Deposits Not Due To Expire = Amount to Expire However, because deposits can have different shelf lives, this formula does not work because more than one deposit can be partially spent at any given time.
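
    One common shape for this, which fits the double-entry direction described above, is to treat each deposit as a "lot" with its own expiry date, always spend from the lot that expires soonest (the same rule the site already uses), and post whatever is left in lapsed lots as an expiry transaction. A minimal Java sketch of that bookkeeping; the class and field names are made up rather than taken from the poster's schema:

        import java.time.LocalDate;
        import java.util.Comparator;
        import java.util.PriorityQueue;

        // Each deposit is a "lot" with its own expiry; spending always consumes
        // the lot that expires first, matching the rule in the excerpt.
        public class CreditLedger {
            private static final class Lot {
                long remaining;            // credits left in this lot
                final LocalDate expiresOn;
                Lot(long amount, LocalDate expiresOn) { this.remaining = amount; this.expiresOn = expiresOn; }
            }

            private final PriorityQueue<Lot> lots =
                    new PriorityQueue<>(Comparator.comparing((Lot l) -> l.expiresOn));

            public void deposit(long amount, LocalDate expiresOn) {
                lots.add(new Lot(amount, expiresOn));
            }

            /** Consume credits from the soonest-expiring lots first. */
            public void spend(long amount, LocalDate today) {
                expire(today); // never spend credits that have already lapsed
                while (amount > 0 && !lots.isEmpty()) {
                    Lot lot = lots.peek();
                    long used = Math.min(amount, lot.remaining);
                    lot.remaining -= used;
                    amount -= used;
                    if (lot.remaining == 0) lots.poll();
                }
                if (amount > 0) throw new IllegalStateException("insufficient credits");
            }

            /** Returns how many credits lapse as of the given date and drops those lots. */
            public long expire(LocalDate today) {
                long expired = 0;
                while (!lots.isEmpty() && lots.peek().expiresOn.isBefore(today)) {
                    expired += lots.poll().remaining;
                }
                return expired;
            }

            public long balance() {
                return lots.stream().mapToLong(l -> l.remaining).sum();
            }
        }

    In ledger terms, the value returned by expire() would be written back as its own debit entry, so the running balances stay consistent with the lot bookkeeping.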

    Read the article

  • Mysterious extra hashtable entry

    - by Harm De Weirdt
    Good evening everyone, I'm back :) Let me explain my problem. I have a hashtable in which I store the products a customer buys (%orders). It uses the productcode as key and has a reference to an array with the other info as value. At the end of the program, I have to rewrite the inventory to the updated version (i.e. subtract the quantity of the bought items). This is how I do this: sub rewriteInventory{ open(FILE,'>inv.txt'); foreach $key(%inventory){ print FILE "$key\|$inventory{$key}[0]\|$inventory{$key}[1]\|$inventory{$key}[2]\n" } close(FILE); } where $inventory{$key}[x] is 0 - Title, 1 - price, 2 - quantity. The problem here is that when I look at inv.txt afterwards, I see things like this: CD-911|Lady Gaga - The Fame|15.99|21 ARRAY(0x145030c)||| BOOK-1453|The Da Vinci Code - Dan Brown|14.75|12 ARRAY(0x145bee4)||| Where do these "ARRAY(0x145030c)|||" entries come from? Or more importantly, how do I get rid of them? This is the last part of this school task; I have had so many problems programming all this, and this stupid little thing comes up now and I'm really fed up with this whole Perl thing. (this aside :p) I hope someone can help me :) Fuji

    Read the article

  • PayPal integration woes: PDT hangs on return to site

    - by Tom
    Hi, I'm implementing PayPal IPN & PDT. After some headache & time at the sandbox, IPN is working well and PDT returns the correct $_GET data. The implementation is as follows: (1) pass the user ID in the form to PayPal; (2) the user buys the product and triggers IPN, which updates the database for the given user ID; (3) PDT returns the transaction ID when the user returns to the site; (4) the return page says "please wait" and repeatedly checks the transaction status via Ajax; (5) the user is redirected to a success/failure page. Everything works well, EXCEPT that when using the PayPal-provided PHP code for PDT to do the return POST, the page hangs. PayPal waits for a response and the user never gets back to my site. I'm not getting a fail status, just nothing. The funny thing is that once the unknown error occurs, my test domain becomes unresponsive for a short period. The code (PHP): https://www.paypal.com/us/cgi-bin/webscr?cmd=p/xcl/rec/pdt-code-outside If I comment out the POST back, it all works fine. I've been able to pin the problem down to the point where the code enters the while{} loop. Unfortunately, I'm not experienced enough to write a replacement from scratch for the PayPal code, so I would really appreciate any ideas on what might be wrong. The POST back goes to ssl://www.sandbox.paypal.com, and I'm using button code and an authorisation token that have all been created via a sandbox test account. Thanks in advance.

    Read the article

  • Sql query - selecting top 5 rows and further selecting rows only if User is present

    - by Gublooo
    Hello, I'm kind of stuck on how to implement this query - it is pretty similar to the query I posted earlier, but I'm not able to crack it. I have a shopping table where, every time a user buys anything, a record is inserted. Some of the fields are * shopping_id (primary key) * store_id * user_id Now what I need is to pull only the list of those stores where the user is among the top 5 visitors. When I break it down, this is what I want to accomplish: * Find all stores that UserA has visited. * For each of these stores, see who the top 5 visitors are. * Select the store only if UserA is among the top 5 visitors. The corresponding queries would be: select store_id from shopping where user_id = xxx select user_id, count(*) as 'visits' from shopping where store_id in (select store_id from shopping where user_id = xxx) group by user_id order by visits desc limit 5 Now I need to check in this result set whether UserA is present and select that store only if he's present. For example, if he has visited a store 5 times, but there are 5 or more people who have visited that store more than 5 times, then that store should not be selected. So I'm kind of lost here. Thanks for your help

    Read the article

  • How do I merge multiple PDB files ?

    - by blue.tuxedo
    We are currently using a single command line tool to build our product on both Windows and Linux. So far it works nicely, allowing us to build out of source and with finer dependencies than any of our previous build systems allowed. This buys us great incremental and parallel build capabilities. To describe the build process briefly, we get the usual: .cpp -- cl.exe --> .obj and .pdb; multiple .obj and .pdb -- cl.exe --> single .dll, .lib and .pdb; multiple .obj and .pdb -- cl.exe --> single .exe and .pdb. The MSVC C/C++ compiler supports this adequately. Recently the need to build a few static libraries emerged. From what we gathered, the process to build a static library is: multiple .cpp -- cl.exe --> multiple .obj and a single .pdb; multiple .obj -- lib.exe --> a single .lib. The single .pdb means that cl.exe should only be executed once for all the .cpp sources. This single execution means that we can't parallelize the build for this static library. This is really unfortunate. We investigated a bit further, and according to the documentation (and the available command line options): cl.exe does not know how to build static libraries, and lib.exe does not know how to build .pdb files. Does anybody know a way to merge multiple PDB files? Are we doomed to have slow builds for static libraries? How do tools like IncrediBuild work around this issue?

    Read the article

  • Database design: Calculating the Account Balance

    - by 001
    How do I design the database to calculate the account balance? 1) Currently I calculate the account balance from the transaction table. In my transaction table I have "description" and "amount", etc. I would then add up all "amount" values and that would work out the user's account balance. I showed this to my friend and he said that is not a good solution; when my database grows, it's going to slow down. He said I should create a separate table to store the calculated account balance. If I did this, I would have to maintain two tables, and it's risky: the account balance table could go out of sync. Any suggestions? EDIT: OPTION 2: should I add an extra "Balance" column to my transaction table? Then I would not need to go through many rows of data to perform my calculation. Example: John buys $100 of credit, is debited $60, then adds $200 of credit. Amount $100, Balance $100. Amount -$60, Balance $40. Amount $200, Balance $240.
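
    Option 2 above (a stored running balance on every transaction row) stays trustworthy as long as each new balance is derived from the previous row inside one atomic step, which is roughly what banking ledgers do. A small Java sketch of the bookkeeping behind John's example; the names are illustrative, not a schema recommendation:

        import java.math.BigDecimal;
        import java.util.ArrayList;
        import java.util.List;

        // Every posted transaction carries the running balance, so reading the
        // current balance never requires summing the whole history.
        public class AccountLedger {
            public record Entry(String description, BigDecimal amount, BigDecimal balance) {}

            private final List<Entry> entries = new ArrayList<>();

            public synchronized Entry post(String description, BigDecimal amount) {
                BigDecimal previous = entries.isEmpty()
                        ? BigDecimal.ZERO
                        : entries.get(entries.size() - 1).balance();
                Entry entry = new Entry(description, amount, previous.add(amount));
                entries.add(entry);
                return entry;
            }

            public BigDecimal balance() {
                return entries.isEmpty() ? BigDecimal.ZERO : entries.get(entries.size() - 1).balance();
            }

            public static void main(String[] args) {
                AccountLedger ledger = new AccountLedger();
                ledger.post("buys credit", new BigDecimal("100.00")); // balance 100.00
                ledger.post("debit",       new BigDecimal("-60.00")); // balance  40.00
                ledger.post("buys credit", new BigDecimal("200.00")); // balance 240.00
                System.out.println(ledger.balance());                 // prints 240.00
            }
        }

    In an actual database the same rule would be enforced inside a transaction (read the last balance with a row lock, insert the new row), which is what keeps the stored balance from drifting out of sync with the amounts.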

    Read the article

  • Specific Shopping Cart Recommendations

    - by Dean J
    I'm trying to suggest a solution for a friend who owns an existing web shop. The current solution isn't cutting it. The new solution needs to have a few things that look like they're enterprise-only if I go with Magento, and $12k a year for a store with maybe $20k in stock just doesn't work. The site should have items, which have one or more categories. Each category may have a parent category. Items have MSRP, and a discount rate by supplier, brand, and sometimes an additional discount by product. When a user buys something, it should automatically set up a shipping label with UPS or USPS, depending on the user's choice, and build two invoices: one to go in the box, one to go into records. This is crucial; it's low profit per item, so it needs to minimize labor here. Need to be able to have sales (limited by time), discount codes/coupon codes. Ideally would have private sales and/or members-only rates as well. It needs a payment gateway; Paypal/GCheckout-only isn't going to fly. Must be able to accept Visa/MC. Suggestions? I'm debating just building this myself in Java or PHP, but wanted to point my friend to a reasonable-cost solution that already exists if I can. This all seems pretty straightforward to code, save working with the UPS/USPS/Visa/MC APIs, and doing CSS for it.

    Read the article
