Search Results

Search found 8185 results on 328 pages for 'technical tests'.

Page 129 of 328

  • [Not in Vermont] IT Jobs: Sharepoint/ASP.NET Dev + Winforms C# Dev in Western Mass

    Two .NET jobs in Western Mass from a recruiter; contact info below. Requirement #1: Our client is looking for the best engineers in the world, and then we give them the opportunity to excel. Our light, scrum-based process keeps you focused on delivering functionality that our customers need. We try to do things right (unit tests, continuous builds, bug tracking, etc.) and we're looking for others who work this way too. Primary Responsibilities: Develop SharePoint applications in ASP.NET with a heavy...

    Read the article

  • Updated Agenda for OTN Architect Day Los Angeles (Oct 25)

    - by Bob Rhubart
    Here's the latest information on the session schedule and content for Oracle Technology Network Architect Day in Los Angeles on October 25, 2012. Registration is open, but seating is limited.
    When: Thursday, October 25, 2012, 8:30am – 5:00pm
    Where: Sofitel Los Angeles, 8555 Beverly Boulevard, Los Angeles, CA 90048
    Agenda (Time | Session Title | Room):
    8:30 am - 9:00 am | Registration and Continental Breakfast
    9:00 am - 9:15 am | Welcome and Opening Comments | Bob Rhubart | Beverly Ballroom
    9:15 am - 10:00 am | Engineered Systems: Oracle's Vision for the Future | Ralf Dossmann | Beverly Ballroom
    Oracle's Exadata and Exalogic are impressive products in their own right. But working in combination they deliver unparalleled transaction processing performance, with up to a 30x increase over existing legacy systems and the lowest cost of ownership, on a 3- or 5-year basis, of any hardware. In this session you'll learn how to leverage Oracle's Engineered Systems within your enterprise to deliver record-breaking performance at the lowest TCO.
    10:00 am - 10:30 am | Monitoring and Managing Applications in the Cloud | Basheer Khan | Beverly Ballroom
    Oracle offers a broad portfolio of software and hardware products and services to enable public, private and hybrid clouds to power the enterprise. However, enterprise cloud computing presents new management challenges that need to be addressed to realize the economic benefits of cloud computing. In this session you will learn about the methods and tools you can use to proactively monitor your end-to-end Oracle Applications environment in the cloud, define service-level objectives, gain insight into your end users, and troubleshoot performance problems from a single console.
    10:30 am - 10:45 am | Break
    10:45 am - 11:30 am | Breakout Sessions (pick one)
    Cloud Computing - Making IT Simple | Dr. James Baty | Beverly Ballroom
    The road to Cloud Computing is not without a few bumps. This session will help to smooth out your journey by tackling some of the potential complications. We'll examine whether standardization is a prerequisite for the Cloud. We'll look at why refactoring isn't just for application code. We'll check out deployable entities and their simplification via higher levels of abstraction. And we'll close out the session with a look at engineered systems and modular clouds.
    Innovations in Grid Computing with Oracle Coherence | Ashok Aletty | Hollywood Room
    Learn how Oracle Coherence can increase the availability, scalability and performance of your existing applications with its advanced low-latency data-grid technologies. Also hear some interesting industry-specific use cases that customers have implemented and how Oracle is integrating Coherence into its Enterprise Java stack.
    11:30 am - 12:15 pm | Breakout Sessions (pick one)
    Enterprise Strategy for Cloud Security | Dave Chappelle | Beverly Ballroom
    Security is high on the list of concerns for many organizations as they evaluate their cloud computing options. This session will examine security in the context of the various forms of cloud computing. We'll consider technical and non-technical aspects of security, and discuss several strategies for cloud computing, from both the consumer and producer perspectives.
    Oracle Enterprise Manager | Perren Walker | Hollywood Room
    This session examines new Oracle Enterprise Manager monitoring, administration, and management features for Oracle Exalogic. It focuses on two management themes: cloud management related to virtualization and applications-to-disk management. For private cloud management, it discusses virtualization management features providing an enhanced set of application deployment capabilities enabling IaaS as well as PaaS interactions. Then from an end-to-end perspective, it covers the specific capabilities and, where applicable, best practices for machine, cloud, middleware, and application administration.
    12:15 pm - 1:15 pm | Lunch | Beverly Ballroom Lounge
    1:15 pm - 2:00 pm | Panel Discussion - Q&A with session speakers | Beverly Ballroom
    2:00 pm - 2:45 pm | Breakout Sessions (pick one)
    Oracle Cloud Reference Architecture | Anbu Krishnaswamy | Beverly Ballroom
    Cloud initiatives are beginning to dominate enterprise IT roadmaps. Successful adoption of Cloud and the subsequent governance challenges warrant a Cloud reference architecture that is applied consistently across the enterprise. This presentation will answer the important questions: what exactly a Cloud is, why you need it, what changes it will bring to the enterprise, and what the key capabilities of a Cloud infrastructure are - using Oracle's Cloud Reference Architecture, which is part of the IT Strategies from Oracle (ITSO) Cloud Enterprise Technology Strategy (ETS).
    21st Century SOA | Jeff Davies | Hollywood Room
    Service Oriented Architecture has evolved from concept to reality in the last decade. The right methodology coupled with mature SOA technologies has helped customers demonstrate success in both innovation and ROI. In this session you will learn how Oracle SOA Suite's orchestration, virtualization, and governance capabilities provide the infrastructure to run mission critical business and system applications. We'll also take a special look at the convergence of SOA & BPM using Oracle's unified technology stack.
    2:45 pm - 3:00 pm | Break
    3:00 pm - 4:00 pm | Roundtable Discussion | Beverly Ballroom
    4:00 pm - 4:15 pm | Closing Comments & Readouts from Roundtables | Beverly Ballroom
    4:15 pm - 5:00 pm | Networking / Reception | Beverly Ballroom Lounge
    Note: Session schedule and content subject to change.

    Read the article

  • How easily recognized are new TLDs?

    - by Ryan Muller
    I'm interested in purchasing a domain name for a new service I intend to market. I know that .com is instantly recognizable as a domain ending, and if I see stackoverflow.com I know it's a web address. However, I also recognize strings like github.io and mysite.tk as domains, since I've worked with domains like these. To the average member of the public, if one sees an address ending in .io or a similar, non-mainstream TLD (e.g. on a billboard or business card), would they immediately know it's a URL to type into a browser? Or are these new domains only useful 1) for a technical audience or 2) when you will be primarily promoting your site through links and not print?

    Read the article

  • How do you end up with event-sourcing if you use a xDD approach?

    - by Tomas Jansson
    When working in a TDD or BDD manner your unit tests are supposed to drive your design. But how do you end up with event sourcing using xDD techniques? As I see it, event sourcing is something you need to adopt early on to take full advantage of it. Let's say that you start without event sourcing and do a release. Later on, when you are releasing version 2.0, you realize that it would be great to use event sourcing, but at that point you have already missed all the events from version 1.0, so it makes it much harder to implement. Or do you take some kind of backup of your DB from before event sourcing, use that as a baseline, and then add event sourcing on top of that?
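    The last option, treating a snapshot of the pre-event-sourcing database as a baseline, can be sketched concretely. Below is a minimal, hypothetical Python illustration (the names and event types are invented, not taken from the question): the legacy state is recorded as an explicit migration event, and new domain events are appended on top of it.

        # Hypothetical sketch (names and event types invented): fold the legacy database
        # state into the stream as an explicit migration event, then append real events.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class Event:
            kind: str
            data: dict

        @dataclass
        class AccountStream:
            events: List[Event] = field(default_factory=list)

            def append(self, event: Event) -> None:
                self.events.append(event)

            def replay(self) -> dict:
                state = {}
                for e in self.events:
                    if e.kind == "MigratedFromLegacyDb":  # baseline snapshot, not a true domain event
                        state = dict(e.data)
                    elif e.kind == "Deposited":
                        state["balance"] = state.get("balance", 0) + e.data["amount"]
                return state

        # A row exported from the version 1.0 database becomes the first event in the stream.
        legacy_row = {"account_id": 42, "balance": 150}
        stream = AccountStream()
        stream.append(Event("MigratedFromLegacyDb", legacy_row))
        stream.append(Event("Deposited", {"amount": 25}))  # a genuine version 2.0 event
        assert stream.replay()["balance"] == 175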

    Read the article

  • Ceská obchodní banka, a.s. Upgrades to Oracle Database 11g On Time, On Budget and without Disrupting Business Operations

    - by jgelhaus
    You want the new features of the latest release, but upgrading a database is one of those things DBAs can "lose sleep" over. Ceská obchodní banka, a.s. ("CSOB") needed to upgrade the production systems in the Czech Republic and Slovakia that supported 90 key applications for its retail, corporate, internet, and ATM services from Oracle Database 9i to Oracle Database 11g, with a simultaneous migration from Alpha processor/OpenVMS-based hardware to a Power7/AIX system. Oracle Consulting helped to complete the upgrade within schedule and budget, while meeting tight restrictions on downtime. Knowledge transfer by Oracle Consulting to the bank’s IT team has improved self-sufficiency in support and maintenance, while the technical and advisory services of Oracle Consulting Expert Services continue to optimize performance and availability and lower the cost of ownership. Read how CSOB maximized the value of its investment in Oracle Database technology with an upgrade to Oracle Database 11g.

    Read the article

  • Custom Facebook Connect image - is it a Facebook policy violation?

    - by Viruthagiri
    I was going to replace Facebook's default login button with my own custom image, like Mashable does. I mean like this. But I found an article which states it's against Facebook's policies. Is it really a violation? If it is, how come Mashable is using a custom image? Can someone answer me? Update: This is the exact image I would like to use. Facebook says the following on this page: "While you may scale the size to suit your needs, you may not modify the 'f' logo in any other way (such as by changing the design or color). If you are unable to use the correct colour due to technical limitations, you may revert to black and white." So is my "Sign in with Facebook" image violating Facebook's policy in any way?

    Read the article

  • Will small random dynamic snippets break caching

    - by Saif Bechan
    I am busy writing a WordPress plugin. Most users have cache plugins installed, which cache the pages. I know some webservers such as nginx also have PHP caching and whatnot, and there are things like memcached. I have to admit I do not know exactly how they work; if anyone has some good links on how they work I would be glad, ideally links where it's explained simply, not too technically. Now the question: my plugin displays statistics on posts, and they are always different; will this break the caching of the page? To take it a step further, sometimes the statistics of a post need updating, and a small JavaScript snippet is added to the page. Will these two things result in the page not caching, or am I OK?
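    Typically, a full-page cache stores the rendered HTML and serves it unchanged until it expires, so statistics baked into the markup don't prevent caching; they simply go stale. Keeping the page static and letting the JavaScript snippet fetch the changing numbers from a small uncached endpoint avoids that. A rough, hypothetical Python sketch of the distinction (not WordPress or PHP code; all names invented):

        # Hypothetical sketch (not WordPress/PHP): a dict-based full-page cache.
        import time

        page_cache = {}      # url -> (rendered_html, timestamp)
        CACHE_TTL = 300      # cached pages are reused for 5 minutes

        def view_count(post_id):
            # Stand-in for the plugin's "always different" statistic.
            return int(time.time()) % 1000

        def render_post(post_id, inline_stats):
            if inline_stats:
                # The number is frozen into the HTML at render time.
                stats = f"<span>{view_count(post_id)} views</span>"
            else:
                # Placeholder only; a small JS snippet would fill it in from /stats/<id>.
                stats = f"<span data-stats-url='/stats/{post_id}'></span>"
            return f"<html><body>Post {post_id} {stats}</body></html>"

        def serve(url, post_id, inline_stats=True):
            cached = page_cache.get(url)
            if cached and time.time() - cached[1] < CACHE_TTL:
                return cached[0]        # cache hit: any inlined statistic is now stale
            html = render_post(post_id, inline_stats)
            page_cache[url] = (html, time.time())
            return html

        def serve_stats(post_id):
            # A separate, never-cached endpoint keeps the visible number fresh
            # while the surrounding page stays fully cacheable.
            return {"post": post_id, "views": view_count(post_id)}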

    Read the article

  • Managed code and the Shell – Do?

    Back in 2006 I wrote a blog post titled: Managed code and the Shell – Don't!. Please visit that post to see why that advice was given.The crux of the issue has been addressed in the latest CLR via In-Process Side-by-Side Execution. In addition to the MSDN documentation I just linked, there is also an MSDN article on the topic: In-Process Side-by-Side.Now, even though the major technical impediment seems to be removed, I don’t know if Microsoft is now officially supporting managed extensions to the shell. Either way, I noticed a CodePlex project that is marching ahead to enable exactly that: Managed Mini Shell Extension Framework. Not much activity there, but maybe it will grow once .NET 4 is released... Comments about this post welcome at the original blog.

    Read the article

  • First Foray – About timeout

    - by SQLMonger
    It has been quite a while since I signed up for this blog site and high time that something was posted. I have a list of topics that I will be working through and posting. Some I am sure will have been posted by others, but I will be sticking to the technical problems and challenges that I’ve recently faced, and the solutions that worked for me. My motto when learning something new has always been “My kingdom for an example!”, and I plan on delivering useful examples here so others can learn from my efforts, failures and successes.

    A bit of background about me… My name is Clayton Groom. I am a founding partner of a consulting firm in St. Louis, Missouri, Covenant Technology Partners, LLC, and focus on SQL Server Data Warehouse design, Analysis Services and Enterprise Reporting solutions. I have been working with SQL Server since the early nineties, when it still only ran on OS/2. I love solving puzzles and technical challenges.

    Enough about me… On to a real problem…

    SSIS Connection Timeouts versus Command Timeouts

    Last week, I was working on automating the processing for a large Analysis Services cube. I had reworked an SSIS package and script task originally posted by Vidas Matelis that automates the process of adding new and dropping old partitions to/from an Analysis Services cube. I had the package working great, tested, and ready for deployment. It basically performs a query against the source system to determine if there is new data in the warehouse that will require a new partition to be added to the cube, and it checks the cube to see if there are any partitions present that are no longer needed in a rolling 60-month window.

    My client uses Tivoli for running all their production jobs, and not SQL Agent, so I had to build a command-line file for Tivoli to use to run the package. Everything was going great. I had tested the command file from my development workstation using an XML configuration file to pass server-specific parameters into the package when executed using the DTExec utility. With all the pieces ready, I updated the dtsconfig file to point to the UAT environment and started working with the Tivoli developer to test the job. On the first run, the job failed, and from what I could see in the SSIS log, it had failed because of a timeout. Other errors in the log made me think that perhaps the connection string had not been passed into the package correctly. We bumped the Connection Manager timeout values from 20 seconds to 120 seconds and tried again. The job still failed. After changing the command line to use the /SET option instead of the /CONFIGFILE option, we tested again, and again failure. After several more failed attempts, and getting the Teradata DBA involved to monitor and see if we were connecting and failing or just failing to connect, we determined that the job was indeed connecting to the server and then disconnecting itself after 30 seconds. This seemed odd, as we had the timeout values for the connection manager set to 180 seconds by then. At this point one of the DBAs found a post on the Teradata forum that had the clues to the puzzle: there is a separate “CommandTimeout” custom property on the data source object that may need to be adjusted for longer-running queries. I opened up the SSIS package, opened the data flow task that generated the partition list table and right-clicked on the data source. From the context menu, I selected “Show Advanced Editor” and found the property. Sure enough, it was set to 30 seconds.
    The CommandTimeout property can also be edited in the SSIS Properties sheet. In order to determine how long the timeout needed to be, I ran the query from the task in the development environment and received a response in a matter of seconds. I then tried the same query against the production database and waited several minutes for a response. This did not seem to be a reasonable response time for the query involved, and indeed it wasn’t. The Teradata DBAs adjusted the query governor settings for the service account I was testing with, and we were able to get the response back down under a minute. Still, I set the CommandTimeout property to a much higher value in case the job was ever started during a time of high demand on the production server. With this change in place, the job finally completed successfully.

    The lesson learned for me was twofold:
    1. Always compare query execution times between development and production environments, and don’t assume that production will always be faster. With higher user demands, query governors, and a whole lot more data, the execution time of even what might seem to be simple queries can vary greatly.
    2. SSIS connection timeout settings do not affect command timeouts. Connection timeouts control how long the package will wait for a response from the server before assuming the server is not available or is not responding. Command timeouts control how long a task will wait for results to start being returned before deciding that the server is not responding.

    Both lessons seem pretty straightforward, and I felt pretty sheepish once I finally figured out what the issue was. To be fair though, in the 5+ years that I have been working with SSIS, I could only recall one other time where I had to set the CommandTimeout property, and that memory only resurfaced while I was penning this post.
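    The same two knobs show up in most database access layers, not just SSIS. As a loose analogy only (this is not the SSIS/Teradata configuration from the post; the DSN, credentials, and table are invented), Python's pyodbc exposes the distinction like this:

        # Loose analogy in pyodbc (DSN, credentials and table are invented):
        import pyodbc

        conn = pyodbc.connect(
            "DSN=warehouse;UID=etl_user;PWD=example",
            timeout=120,     # connection/login timeout: how long to wait to reach the server at all
        )
        conn.timeout = 600   # command/query timeout: how long a single statement may run

        cursor = conn.cursor()
        # Governed by conn.timeout (600 s), not by the 120 s connection timeout above.
        cursor.execute("SELECT COUNT(*) FROM sales_fact")
        print(cursor.fetchone()[0])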

    Read the article

  • Software development life cycle in the industry

    - by jiewmeng
    I am taking a module called "Requirements Analysis & Design" at a local university. Common module, I'd say (on the software development life cycle (SDLC) and UML). But there are a lot of things I wonder whether they are actually (strictly) practiced in the industry. For example, will a domain class diagram, and not anything extra (from a design class diagram), strictly be the output of the Analysis or Discovery phase? Surely many times you will think a bit about the technical implementation too? Otherwise you might end up with a design class diagram later that is very different from the original domain class diagram. I also find it hard to remember which diagrams come from Initiation, Discovery, Design, etc., and these phases vary from SDLC to SDLC, I believe? So I usually create a diagram when I think it will be useful. Is that the wrong way?

    Read the article

  • Workflow Overview & Best Practices - EMEA

    - by Annemarie Provisero
    ADVISOR WEBCAST: Workflow Overview & Best Practices - EMEA
    PRODUCT FAMILY: EBS - ATG - Workflow
    February 16, 2011 at 10:00 am CET, 02:30 pm India, 06:00 pm Japan, 08:00 pm Australia
    This 1.5-hour session is recommended for technical and functional users who are interested in a generic overview of the tools and utilities available to get a closer look into the Java Virtual Machine used in an E-Business Suite environment and how to tune it.
    TOPICS WILL INCLUDE: Introduction to Workflow; Useful Utilities and Tools; Best Practices; Q&A
    A short, live demonstration (only if applicable) and a question and answer period will be included. Oracle Advisor Webcasts are dedicated to building your awareness around our products and services. This session does not replace offerings from Oracle Global Support Services. Click here to register for this session.
    The above webcast is a service of the E-Business Suite Communities in My Oracle Support. For more information on other webcasts, please reference the Oracle Advisor Webcast Schedule. Click here to visit the E-Business Communities in My Oracle Support. Note that all links require access to My Oracle Support.

    Read the article

  • Gauging Maturity of your BPM Strategy - part 2 / 2

    - by Sanjeev Sharma
    In my earlier post I discussed the essence of maturity assessment and the business imperative for doing the same in the context of BPM. In this post I will discuss Oracle’s BPM Maturity assessment methodology. Oracle’s BPM Maturity model comprises the following components:
    Maturity – represents stages of evolution of your BPM capability, with 0 being the lowest level and 5 the highest
    Domain – represents multiple perspectives, both technical and business oriented, against which your BPM capability can be assessed
    Adoption – represents the scale of BPM rollout, from the project level to the enterprise level
    Note: Your BPM capability can be at different levels of maturity for the different domains. Oracle’s BPM assessment methodology measures the maturity of your BPM capability at the individual domain level as well as the aggregate level. The output of Oracle’s BPM assessment benefits you in two ways:
    Gap Analysis – by comparing the “As-Is” BPM capability with the desired “To-Be” BPM capability along the various domains (see Figure 1)
    Systematic Adoption – by aligning evolution of BPM capability with its rollout in multiple phases (see Figure 2)

    Read the article

  • How to fix “Unit Test Runner failed to load test assembly”

    - by ybbest
    I encountered this issue a couple of times during my recent project, and every time I forgot what actually caused it. Therefore, I decided to write a quick blog post to make sure I can identify the issue quickly.
    Problem: Running unit tests with a test runner produced a "Unit Test Runner failed to load test assembly" exception.
    Analysis: Basically, I had changed some code and started the test runner to run the tests. The same DLL had already been deployed to the GAC, so the test runner actually tried to use the old version of the assembly and thus could not load it.
    Solution: Deploy the current version of the DLL to the GAC and re-run your tests; it works like a charm.

    Read the article

  • Tales of a corrupt SQL log

    Warning: I'm a simple dev, not an all-powerful DBA with godly powers. This morning, one of my sites was down and DNN reported a problem with the database. A quick series of tests revealed that the culprit was a corrupted log file. "Easy fix," I said, "I have daily backups, so it's just a matter of restoring a good copy of the database and log files." Well, I found out that's not exactly true. You see, for this database, I have daily file backups, and these are not database backups created...

    Read the article

  • Twig Code Completion

    - by Ondrej Brejla
    Hi all! After a few weeks we have a new feature for you, which will be available in the upcoming NetBeans 7.3: new code completion in Twig files! So let us introduce it a bit. We now hopefully support all of Twig's built-in elements, meaning Tags, Filters, Functions, Tests and Operators (see the Twig documentation for more information). All elements are also documented, so if you don't know what an element does, just read about it in the IDE documentation window! We try to resolve the completion context to suggest only the most relevant element types, but it's not very visible, since almost everything can be used everywhere in Twig files ;) And that's all for today. As usual, please test it, and if you find something strange, don't hesitate to file a new issue (product php, component Twig). Thanks a lot!

    Read the article

  • All day optimizer event....

    - by noreply(at)blogger.com (Thomas Kyte)
    I've recently taken over some of the responsibilities of Maria Colgan (also known as the "optimizer lady") so she can move on to supporting our new In-Memory Database features (note her new Twitter handle for that: https://twitter.com/db_inmemory ). To that end, I have two one-day Optimizer classes scheduled this year (and more to follow next year!). The first one will be Wednesday, November 20th, in Northern California. You can find details for that here: http://www.nocoug.org/ . The next one will be 5,500 miles (about 8,800 km) away in the UK - in Manchester. That'll take place on December 5th, immediately following the UKOUG technical conference taking place the first week of December. You can see all of the details for that here: http://www.ukoug.org/events/tom-kyte-seminar-2013/ I know I'll be doing one in Belgrade early next year, probably the first week in April. Stay tuned for details on that and for more to come.

    Read the article

  • Large sparse (stiff) ODE system needed for testing

    - by macydanim
    I hope this is the right place for this question. I have been working on a sparse, stiff, implicit ODE solver and have finished the code so far. I have now tested the solver with the Van der Pol equation and another stiff problem of dimension 4. But to perform better tests I am searching for a bigger system, on the order of N = 100...1000, if possible stiff and sparse. Does anybody have an example I could use? I really don't know where to search.
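    One family of problems that fits the bill, offered here as a suggestion rather than something from the question, is the method-of-lines discretization of a diffusion equation: choosing N interior grid points gives an N-dimensional stiff ODE system with a banded, sparse Jacobian, and N can be set anywhere from 100 to 1000. A minimal SciPy sketch (solver settings are illustrative):

        # Suggested test problem, not from the question: the 1-D heat equation discretized
        # on N interior grid points. The Jacobian is the tridiagonal Laplacian, so the
        # system is sparse, and the stiffness ratio grows roughly like N**2.
        import numpy as np
        from scipy.sparse import diags
        from scipy.integrate import solve_ivp

        N = 500                                   # pick anything in the 100...1000 range
        h = 1.0 / (N + 1)
        A = diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(N, N)) / h**2   # y' = A y

        def rhs(t, y):
            return A @ y

        x = np.linspace(h, 1.0 - h, N)
        y0 = np.sin(np.pi * x)                    # smooth initial profile

        # Reference solution from SciPy's stiff BDF solver, given the sparse Jacobian.
        sol = solve_ivp(rhs, (0.0, 0.1), y0, method="BDF",
                        jac=lambda t, y: A, rtol=1e-8, atol=1e-10)
        print(sol.y[:, -1].max())                 # decays roughly like exp(-pi**2 * t)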

    Read the article

  • How do I get my mac to boot from an Ubuntu USB key?

    - by Vinay Gupta
    So if you select "mac" and "usb" on this download page, it gives a series of command-line instructions to make a USB key which the MacBook will boot into Ubuntu from: http://www.ubuntu.com/desktop/get-ubuntu/download I've followed them to the letter two or three times on different USB keys, and it doesn't work. There's a very great deal of technical discussion about EFI etc., but this set of instructions seems to suggest it should Just Work, and it doesn't. Help? I'm increasingly unhappy with the more locked-down approach Apple is taking, and I'd quite like to start using Linux with a view to transitioning over to it as my main operating system, but booting from the CD takes forever and runs slowly, and I'm really hoping to get it running off USB. Can anybody help me?

    Read the article

  • 9/18 Live Webcast: Three Compelling Reasons to Upgrade to Oracle Database 11g

    - by jgelhaus
    Webcast: Three Compelling Reasons to Upgrade to Oracle Database 11g
    Date: Tuesday, September 18, 2012
    Time: 10 a.m. PT / 1 p.m. ET
    If you or your organization is still working with Oracle Database 10g or an even older version, now is the time to upgrade. Oracle Database 11g offers a wide variety of advantages to enhance your operation. Join us for this live Webcast and learn about what you’re missing: the business, operational, and technical benefits. With Oracle Database 11g, you can:
    Upgrade with zero downtime
    Improve application performance and database security
    Reduce the amount of storage required
    Save time and money
    Register today here

    Read the article

  • Books are Dead! Long Live the Books!

    - by smisner
    We live in interesting times with regard to the availability of technical material. We have lots of free written material online in the form of vendor documentation, forums, blogs, and Twitter. And we have written material that we can buy in the form of books, magazines, and training materials. Online videos and training – some free and some not free – are also an option. All of these formats are useful for one need or another.

    As an author, I pay particular attention to the demand for books, and for now I see no reason to stop authoring books. I assure you that I don’t get rich from the effort, and fortunately that is not my motivation. As someone who likes to refer to books frequently, I am still a big believer in books and have evidence from book sales that there are others like me. If I can do my part to help others learn about the technologies I work with, I will continue to produce content in a variety of formats, including books. (You can view a list of all of my books on the Publications page of my site and my online training videos at Pluralsight.)

    As a consumer of technical information, I prefer books because a book typically can get into a topic much more deeply than a blog post, and can provide more context than vendor documentation. It comes with a table of contents and a (hopefully accurate) index that helps me zero in on a topic of interest, and of course I can use the Search feature in digital form. Some people suggest that technology books are outdated as soon as they get published. I guess it depends on where you are with technology. Not everyone is able to upgrade to the latest and greatest version at release. I do assume, however, that the SQL Server 7.0 titles in my library have little value for me now, but I’m certain that the minute I discard the book, I’m going to want it for some reason! Meanwhile, as electronic books overtake physical books in sales, my husband is grateful that I can continue to build my collection digitally rather than physically, as the books have a way of taking over significant square footage in our house!

    Blog posts, on the other hand, are useful for describing the scenarios that come up in real-life implementations that wouldn’t fit neatly into a book. For as many years as I have been working with the Microsoft BI stack, I still run into new problems that require creative thinking. Likewise, people who work with BI and other technologies that I use share what they learn through their blogs. Internet search engines help us find information in blogs that simply isn’t available anywhere else. Another great thing about blogs, also, is the connection to community and the dialog that can ensue between people with common interests.

    With the trend towards electronic formats for books, I imagine that we’ll see books continue to adapt to incorporate different forms of media and better ways to keep the information current. At the moment, I wish I had a better way to help readers with my last two Reporting Services books. In the case of the Microsoft® SQL Server™ 2005 Reporting Services Step by Step book, I have heard many cases of readers having problems with the sample database that shipped on CD – either the database was missing or it was corrupt. So I’ve provided a copy of the database on my site for download from http://datainspirations.com/uploads/rs2005sbsDW.zip.
Then for the Microsoft® SQL Server™ 2008 Reporting Services Step by Step book, we decided to avoid the database problem by using the AdventureWorks2008 samples that Microsoft published on Codeplex (although code samples are still available on CD). We had this silly idea that the URL for the download would remain constant, but it seems that expectation was ill-founded. Currently, the sample database is found at http://msftdbprodsamples.codeplex.com/releases/view/37109 but I have no idea how long that will remain valid. My latest books (#9 and #10 which are milestones I never anticipated), Building Integrated Business Intelligence Solutions with SQL Server 2008 R2 and Office 2010 (McGraw Hill, 2011) and Business Intelligence in Microsoft SharePoint 2010 (Microsoft Press, 2011), will not ship with a CD, but will provide all code samples for download at a site maintained by the respective publishers. I expect that the URLs for the downloads for the book will remain valid, but there are lots of references to other sites that can change or disappear over time. Does that mean authors shouldn’t make reference to such sites? Personally, I think the benefits to be gained from including links are greater than the risks of the links becoming invalid at some point. Do you think the time for technology books has come to an end? Is the delivery of books in electronic format enough to keep them alive? If technological barriers were no object, what would make a book more valuable to you than other formats through which you can obtain information?

    Read the article

  • Is BDD actually writable by non-programmers?

    - by MattiSG
    Behavior-Driven Development, with its emblematic “Given-When-Then” scenario syntax, has lately been quite hyped for its possible uses as a boundary object for software functionality assessment. I definitely agree that Gherkin, or whichever feature definition script you prefer, is a business-readable DSL, and already provides value as such. However, I disagree that it is writable by non-programmers (as does Martin Fowler). Does anyone have accounts of scenarios being written by non-programmers, then instrumented by developers? If there is indeed a consensus on the lack of writability, then would you see a problem with a tool that, instead of starting with the scenarios and instrumenting them, would generate business-readable scenarios from the actual tests?
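    For concreteness, this is the split the question describes, sketched with Python's behave runner; the feature text and step names are invented for illustration. The Gherkin scenario is the part a non-programmer would, in principle, write; the step definitions are the instrumentation a developer supplies, and behave matches the two at run time.

        # Hypothetical feature and steps, written for Python's behave runner.
        #
        # features/withdraw.feature  (the business-readable Gherkin part):
        #   Feature: Cash withdrawal
        #     Scenario: Withdrawing within the balance
        #       Given an account with a balance of 100
        #       When the customer withdraws 30
        #       Then the account balance is 70
        #
        # features/steps/withdraw_steps.py  (the developer-written instrumentation):
        from behave import given, when, then

        @given("an account with a balance of {balance:d}")
        def step_account(context, balance):
            context.balance = balance

        @when("the customer withdraws {amount:d}")
        def step_withdraw(context, amount):
            context.balance -= amount

        @then("the account balance is {expected:d}")
        def step_check(context, expected):
            assert context.balance == expected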

    Read the article

  • Unity 5.1 audio issues (no sound in back channels)

    - by N0xus
    I've been trying to bring surround sound audio into my project. I've set my computer up to run in 5.1, and when I play a 6-channel audio file through Windows Media Player (a test file that announces left speaker, right speaker, etc.) it works fine. However, when I run it through Unity, all I get is the front 3 channels. I've set the audio to 5.1 in Edit - Project Settings - Audio. I even set it in code with the following:

        void Start() { AudioSettings.speakerMode = AudioSpeakerMode.Mode5point1; }

    However, when I run a debug line of:

        print (AudioSettings.driverCaps);

    it tells me that Unity is only playing in stereo. Is there something I'm still not doing? I should also add I've run 10 different tests using the 3D audio pan and spread options; I've set both to fully off, halfway on, and full. Still the same results.

    Read the article

  • What are some easy techniques to scan books for new information?

    - by aditya menon
    I find it irresistible to keep purchasing cheap programming and technical e-books in fields such as Drupal, PHP, etc., and I also compulsively download free material, such as that from Microsoft's developer blog... The main problem with the large library I've built up is that many chapters in these books (especially the first few) are packed with information I already know, but with helpful tidbits hidden in between. The logical step would be to skip those chapters and read the ones I don't seem to know anything about, but I'm afraid I may lose out on really important information this way. And naturally it is tedious to have to read about variables, functions and objects all over again when you are trying to learn more about the Registry pattern, for example. It's hard to research this on the net, because my question itself seems vague and difficult to formulate into a single search query. I need advice from people: what do you do in this situation?

    Read the article

  • Why is using C++ libraries so complicated?

    - by Pius
    First of all, I want to note that I love C++ and I'm one of those people who think it is easier to code in C++ than in Java. Except for one tiny thing: libraries. In Java you can simply add a jar to the build path and you're done. In C++ you usually have to set multiple paths for the header files and the library itself. In some cases, you even have to use special build flags. I have mainly used Visual Studio, Code::Blocks, and no IDE at all; all three options are much the same when it comes to using external libraries. I wonder why no simpler alternative has been made for this, like a special .zip file that has everything you need in one place, so the IDE can do all the work of setting up the build flags for you. Is there any technical barrier to this?

    Read the article

  • Solution with multiple projects and (GitHub) single issue tracker and repository

    - by Luiz Damim
    I have a Visual Studio solution with multiple projects:
    Acme.Core
    Acme.Core.Tests
    Acme.UI.MvcSite1
    Acme.UI.MvcSite2
    Acme.UI.WinformsApp1
    Acme.UI.WinformsApp2
    ...
    The entire solution is checked in to a single GitHub (private) repo. Acme.Core contains our business logic, and all the UI projects are deployables. The UI projects have different requirements and features, but some features are implemented in more than one project. All issues are opened in a single issue tracker and classified using labels ([MvcSite1], [WinformsApp1], etc.), but I'm thinking it's starting to get messy. Is it OK to use a single repository and issue tracker to track multiple projects in one solution?

    Read the article
