Search Results

Search found 48063 results on 1923 pages for 'how to create dynamically'.


  • Porting Ruby/NCurses Rogue-Like to .NET and FlatRedBall

    - by ashes999
    I created an awesome rogue-like game in Ruby. For the GUI, I used NCurses. Since I'm using FlatRedBall as my engine of choice for Silverlight game development, I want to port this game over. What is the best way to do this efficiently, and what pitfalls should I expect? For example, Ruby is object-oriented, like C#, so I should be able to just convert (rewrite) classes one by one. However, I will run into issues like:
    - The NCurses API. I may need to create my own notion of a "Window", or else rewrite the GUI code. It's one class, but it's BIG.
    - Mix-ins. These are essentially a form of aspect-oriented development. There are a couple of solutions in .NET, like dynamic classes. What else?
    Also, I should mention that I want to create a C# application out of this. As much as possible, I'll dump reusable and helper code, algorithms, etc. into separate projects and generate reusable DLLs.
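
    One common C# stand-in for Ruby mix-ins is an interface plus extension methods: the interface marks the class, and the extension methods supply the shared behavior. A minimal sketch (all names here are illustrative, not from the original game):

        using System;

        // The interface plays the role of "including the module".
        interface IHasHealth
        {
            int Health { get; set; }
        }

        // The extension methods play the role of the mixed-in behavior.
        static class DamageMixin
        {
            public static void TakeDamage(this IHasHealth self, int amount)
            {
                self.Health = Math.Max(0, self.Health - amount);
            }
        }

        class Player : IHasHealth { public int Health { get; set; } }
        class Monster : IHasHealth { public int Health { get; set; } }

        class Demo
        {
            static void Main()
            {
                var player = new Player { Health = 10 };
                player.TakeDamage(3); // mixed-in behavior, no inheritance needed
                Console.WriteLine(player.Health); // prints 7
            }
        }

    The main trade-off versus Ruby modules is that extension methods cannot hold per-mixin state; any state has to live in properties declared by the interface.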

    Read the article

  • Your Experience Platform

    - by David Dorf
    Crosstalk once again exceeded my expectations, improving upon last year's conference in terms of venue, knowledge sharing, and entertainment.  It's great to see the Oracle Retail family continue to grow, especially outside the US.  I had a great time talking to retailers, analysts, press, and colleagues from around the world. Because the economy, demographics, technology, etc. are constantly changing, retailers must always be evolving their business to capture the next market.  But it takes guts to change something that appears to be working, and it takes a bit of luck to get the timing right.  To a large extent, innovation is about "guts and luck." To help retailers innovate, Oracle Retail provides all the necessary software to create Your Experience Platform.  There is no "Oracle Experience Platform" as each retailer needs something different to deliver on their brand promise.  We provide the actionable insight, optimized operations, and connected interactions, but it's still up to the retailer to make it theirs. One such retailer is Masters, a home improvement retailer in Australia formed through a partnership between Woolworths and Lowe's.  Woolworths is an established retailer in Australia, so they are already close to their customers and able to understand their needs.  In Australia 74% of dwellings are detached houses and the population continues to "move up" into bigger and bigger homes. Masters is using Oracle Retail's software to create their experience platform that will deliver on their brand promise, which includes everyday low prices, a wide range of products, smarter self-service, and an inviting store environment.  The Oracle Retail software provides the foundation that allows them to rapidly deliver on this promise -- Masters is engineered for success.

    Read the article

  • CodePlex Daily Summary for Thursday, June 21, 2012

    Popular Releases

    - uComponents: uComponents v3.1.1: Continuing on from 84817, we are proud to announce our 3.1.1 release! The following issues have been resolved: 14640, 14696, 14704, 14724. Please note: this release is not to be confused with the upcoming 80410 (which will support .NET 4.0).
    - MVVM Light Toolkit: V4RTM (binaries only) including Windows 8 RP: This package contains all the latest DLLs for MVVM Light V4 RTM, including the DLLs for Windows 8 Release Preview. An updated NuGet package is also available at http://nuget.org/packages/MvvmLightLibs. An installer with binaries, snippets and templates will follow ASAP.
    - Weapsy - ASP.NET MVC CMS: 1.0.0: Some changes to layout and CSS; changed version number to 1.0.0.0; solved Cache and Session items handler error in IIS 7; created the Modules, Plugins and Widgets areas; replaced CKEditor with TinyMCE; created the System Info page; minor changes.
    - Auto Proxy Configuration: Windows Proxy Setup V1.2: Bug fixes.
    - XDA ROM Hub: XDA ROM HUB v0.5: Added XRH Backup -- backup & restore data, system and cache. USE AT YOUR OWN RISK! USE ONLY IN RECOVERY!
    - AcDown Downloader Framework: AcDown v3.11.7: (Chinese release notes garbled in the source.) A C# downloader for Acfun, Bilibili, YouTube, SF and other sites; requires .NET Framework 2.0; supports 32- and 64-bit Windows XP/Vista/7/8.
    - Apex: Apex 1.4: Apex 1.4 provides a framework for rapid MVVM development. Download Apex 1.4 to get the core binaries, Visual Studio Extensions, project templates, samples and documentation. The 1.4 release provides a vast number of enhancements via the Apex Broker, an object that can be used to retrieve models, get the view for a view model and more, much like an IoC container. The new Zune-style application templates for WPF and Silverlight give a great starting point for makin... (truncated in source)
    - NShader - HLSL - GLSL - CG - Shader Syntax Highlighter AddIn for Visual Studio: NShader 1.3 - VS2010 + VS2012: A small maintenance release to support the new VS2012 as well as VS2010. This release also fixes the issue where "Comment Selection" included the first line after the selection. If the new NShader version doesn't highlight your shader, you can try to remove the registry entry HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\11.0\FontAndColors\Cache and remove all lines using "fx" or "hlsl" in the file C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\CommonExtensions\Micr... (truncated in source)
    - JSON Toolkit: JSON Toolkit 4.0: Up to 2.5x performance improvement in stringify operations; up to 1.7x performance improvement in parse operations; improved error messages when parsing invalid JSON strings; extended support to .NET 2.0, .NET 3.5, .NET 4.0, Silverlight 4, Windows Phone, Windows 8 Metro apps and Xbox; JSON namespace changed to ComputerBeacon.Json.
    - Xenta Framework - extensible enterprise n-tier application framework: Xenta Framework 1.8.0: System requirements: Windows 7, Vista, Server 2008 or Server 2008 R2; IIS 7.0 or above; .NET Framework 4.0 with the WCF Activation feature (HTTP activation, and non-HTTP activation for net.pipe/net.tcp WCF bindings); ASP.NET MVC 3.0; Microsoft SQL Server 2005, 2008 or 2008 R2; additional deployment configuration: started Windows Process Activation service... (truncated in source)
    - ASP.NET REST Services Framework: Release 1.3 - Standard version: The REST-services Framework v1.3 has important functional changes that allow complex data types to be used as service call parameters. These can be mapped to form or query string variables or the HTTP message body, which is especially useful when REST-style service URLs are used with the POST or PUT HTTP method. Beginning with v1.1 the REST-services Framework is compatible with the ASP.NET Routing model as well as the CRUD (Create, Read, Update, and Delete) principle. These two are often important when buildin... (truncated in source)
    - NanoMVVM: a lightweight WPF MVVM framework: v0.10 stable beta: Minor fixes to UI and code, added an error example to async commands, separated the project into various releases (mainly into logical wholes), removed Expression Blend satellite assemblies.
    - MFCMAPI: June 2012 Release: Build 15.0.0.1034. Full release notes at SGriffin's blog. If you just want to run MFCMAPI or MrMAPI, get the executables. If you want to debug them, get the symbol files and the source. The 64-bit builds will only work on a machine with Outlook 2010 64-bit installed; all other machines should use the 32-bit builds, regardless of the operating system.
    - MonoGame - Write Once, Play Everywhere: MonoGame 2.5.1: The MonoGame team are pleased to announce that MonoGame v2.5.1 has been released. This release contains important bug fixes and minor updates. Recent additions include project templates for iOS and MacOS. The MonoDevelop.MonoGame AddIn also works on Linux. The dependency on the third-party GamePad library has been removed to allow MonoGame to be included in the Debian/Ubuntu repositories. There has been a major bug fix to ensure textures are disposed of correctly, as well as some ... (truncated in source)
    - ????: ????2.0.2: (Release notes garbled in the source; recoverable fragments mention DJ features, .NET 4.5 (Windows 8) support, and Windows 8 compatibility.)
    - Azure Storage Explorer: Azure Storage Explorer 5 Preview 1 (6.17.2012): Version 5 is in development, and Preview 1 provides an early look at the new user interface and some of the new features. New in v5 Preview 1: a new UI, similar to the new Windows Azure HTML5 portal; support for configuring and viewing storage account logging; support for configuring and viewing storage account monitoring; uses the Windows Azure 1.7 SDK libraries; bug fixes.
    - Codename 'Chrometro': Developer Preview: Welcome to the Codename 'Chrometro' Developer Preview! This is the very first public preview of the app. Please note that this is a highly primitive build and the app is not even half of what it is meant to be. The Developer Preview sports: 1) an easy to use application setup; 2) the Assistant, which simplifies your task of customization; 3) the partially complete Metro UI; 4) a variety of settings; 5) a partially complete web browsing experience. To get started, download the Ins... (truncated in source)
    - KangaModeling: Kanga Modeling 1.0: This is the public release 1.0 of Kanga Modeling.
    - Cosmos (C# Open Source Managed Operating System): Release 92560: Prerequisites: Visual Studio 2010 (any version, including Express; Express users must also install the Visual Studio 2010 Integrated Shell runtime) and VMWare. Cosmos can run on real hardware as well as other virtualization environments, but the default debug setup is configured for VMWare Player (free) or Workstation, with the VMWare VIX API 1.11.
    - AutoUpdaterdotNET - Autoupdate for VB.NET and C# Developer: AutoUpdater.NET 1.1: New feature added that allows the user to select a remind-later interval.

    New Projects

    - .NET Heatmap: A simple project using C#, jQuery, and heatmap.js that allows you to create a heatmap for a web page using static data from a SQL database.
    - Advanced Data Server: Advanced Data Server (ADS) is a library that enables you to create powerful server applications with little code.
    - ARPAMISproject: An ambitious project that aims to provide an extensive business solution to schools, universities and similar academic institutions.
    - ArraySegments (by Stephen Cleary): Lightweight extension methods for ArraySegment<T>, particularly useful for byte arrays.
    - Auto Proxy Configuration: This tool sets the proxy server automatically according to the DNS domain.
    - Bongiozzo Photosite: A simple photo site written using ASP.NET MVC 4.0 over the Flickr API and the Galleria image gallery framework.
    - Cloud Media SharePoint Extension: With this extension you can easily add media from the cloud, such as YouTube or Vimeo. Metadata from Vimeo or YouTube is added as well.
    - Content Organizer Rule Manager SharePoint 2010: Create and manage content organizer rules faster for SharePoint 2010.
    - Crm Customization Manager: Crm Customization Manager (CCM) by N.JL helps Dynamics CRM system administrators easily import customisations and translations, with scheduling possibility.
    - demoHello: asp.net
    - Ghost Puzzle: Protect your files with Ghost Puzzle.
    - IonoWumpus: A simple Hunt the Wumpus implementation.
    - LargeSky Personal Project: A personal project, just for code place.
    - Maze Game: A game for children and computer beginners to practice mouse movement with fun.
    - NASM Develop IDE (The Open Source NASM Development Environment For Windows): A simple, lightweight, all-in-one application that helps developers build NASM applications on Windows without having to remember fancy commands.
    - Navigational: Gator brings navigation to projects built around life situations.
    - Real Folders for Visual Studio: A free plugin which makes Solution Folders map to real file system folders. With Real Folders you have the opportunity to organize your files in a simpler way than standard Visual Studio Solution Folders behave (completely uncommitted to any folder on your file system).
    - SaltFx: An N-Layered Domain Driven Design (DDD) framework for .NET development.
    - SharePoint Location-Based Weather Webpart: Uses UPS information to display local weather for the user via the Yahoo Weather service.
    - SharePoint Publishing: Aimed at supplying publishers with tools & design elements to provide a means of publishing through SharePoint.
    - SpellLight - Lightweight Silverlight Spell Checker Library: SpellLight is a lightweight Silverlight spell check library.
    - SUDOKU APP: None
    - Trainer: An application used for logging runs, nutrition, weight, and other goals.
    - weibospace: An iOS project.

    Read the article

  • How-To Geek Gets the Microsoft MVP Award, Thanks to You

    - by The Geek
    The How-To Geek has won a Microsoft MVP award for the second year in a row, and it’s all thanks to you, our great readers that keep the site going. Join us for some mutual back-patting and some terrible photography of all the award stuff. Of course, if you’re familiar with the MVP award you’ll probably know that it’s actually for a single person, but in my opinion the award belongs to the entire How-To Geek community, without which this site would be nothing.

    Read the article

  • Do I need to uninstall lxde before installing kde-standard?

    - by A Roy
    I have Ubuntu 12.04 (upgraded from 10.04), and since I disliked the default desktop, I installed LXDE (sudo apt-get install lxde). This was good, except that occasionally there would be trouble with Firefox (blinking on the panel), so that finally I had to close it, and then an error message from Ubuntu was issued. I had asked about it before, but there was no useful response, so now I want to move to another desktop which will hopefully not create the problem I have now. My question is: should I first uninstall LXDE and then install KDE (sudo apt-get install kde-standard), or is it enough to install KDE without uninstalling LXDE? In case it is necessary to uninstall, should I use the command sudo apt-get remove lxde, or is there a better command for it? You may also help me with the choice of desktop. I installed LXDE since it is simple and lightweight. I am assuming that KDE will not be as simple, but hopefully it will not create problems like the above. But I hate it if it takes too long to log in or to launch a program like Firefox, and there should not be icons fixed on the left part of the screen (I hate to keep icons on the desktop since they are distracting). Some of these issues were present with default Ubuntu 12.04. So is my choice of kde-standard appropriate, or are there better desktop alternatives?

    Read the article

  • CodePlex Daily Summary for Tuesday, December 28, 2010

    Popular Releases

    - MonitorWang: MonitorWang v1.0.5 (Growler): What's new? Added Growl Notification Finalisers - interceptor components that work exclusively with the Growl publisher. These allow you to modify the Growl notification just prior to it being sent by the publisher. You can inject custom logic to precisely control how the Growl notification will appear, including changing the Growl priority level and message text. I've created two Growl Notification Finalisers - one allows you to change the Growl notification priority based on ... (truncated in source)
    - Catel - WPF and Silverlight MVVM library: 1.0.0: And there it is, the final release of Catel, and it is no longer a beta version!
    - Multicore Task Framework: MTF 1.0.2: Release 1.0.2 of Multicore Task Framework.
    - EnhSim: EnhSim 2.2.7 ALPHA: This release supports WoW patch 4.03a at level 85. To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed (http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84). To use the GUI you must have the .NET 4.0 Framework installed (http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992). Mongoose has bee... (truncated in source)
    - LINQ to Twitter: LINQ to Twitter Beta v2.0.19: Mono 2.8, Silverlight, OAuth, 100% Twitter API coverage, streaming, extensibility via Raw Queries, and added documentation. Bug fixes.
    - Hammock for REST: Hammock v1.1.4: v1.1.4 changes: OAuth fixes for post content handling, encoding, and URL parameter doubling. v1.1.3 changes: added an event handler for use with retries; made improvements to OAuth for performance, plus additional fixes; added OAuth token refresh support; fixed a memory leak in content streaming and a regression issue with async GET call response handling. v1.1.2 changes: added OAuth Echo native support and static helper methods on OAuthCredentials; fixes for multi-part stream writing and recovery...
    - Rocket Framework (.Net 4.0): Rocket Framework for Windows V 1.0.0: The architecture has been reviewed and adjusted so that the Web and WPF versions of this framework can be introduced next. Rocket.Core is introduced; controller button functions revisited and updated; DB renewed to suit the implemented features; Create New button functionality changed; Add Question handling features.
    - FxCop Integrator for Visual Studio 2010: FxCop Integrator 1.1.0: New features: search violation information (FxCop Integrator provides a violation search feature; you can find specific violation information by a simple search expression) and analyze with an FxCop project file (you can customize code analysis behavior, e.g. analyze specific types only, use specific rules only, and so on). Improvement: improved the code analysis result list to show more information (added Project and File columns). Change... (truncated in source)
    - NoSimplerAccounting: NoSimplerAccounting 6.0: Fixed a bug in the expense category report.
    - NHibernate Mapping Generator: NHibernate Mapping Generator 2.0: Added support for Postgres (thanks to Angelo).
    - NewLife XCode: XCode v6.5.2010.1223: (Chinese release notes garbled in the source.) Components mentioned include NewLife.Core, NewLife.Net, XControl, XTemplate (a C# templating engine), XAgent, NewLife.CommonEntity, and the XCoder code-generation tool with its source.
    - MiniTwitter: 1.64: (Japanese release notes garbled in the source; they reference URL handling changes since 1.63.)
    - VivoSocial: VivoSocial 7.4.0: Please see changes: http://support.vivoware.com/project/ChangeLog.aspx?PROJID=48
    - Umbraco CMS: Umbraco 4.6 Beta - codename JUNO: The Umbraco 4.6 beta (codename JUNO) release contains many new features focusing on an improved installation experience, a number of robust developer features, and more than 89 bug fixes since the 4.5.2 release. Improved installer experience; updated starter kits (Simple, Blog, Personal, Business); beautiful, free, customizable skins included; skinning engine and skin customization (see the Skinning Documentation Kit); default dashboards on install with hide option; updated login t... (truncated in source)
    - SSH.NET Library: 2010.12.23: This release includes some bug fixes and a few new features. Fixes: allow retrieving big directory structures; ssh-dss algorithm fixed; populate sftp file attributes. New features: support for a passphrase when a private key is used; support added for the diffie-hellman-group14-sha1, diffie-hellman-group-exchange-sha256 and diffie-hellman-group-exchange-sha1 key exchange algorithms; allow providing multiple key files for authentication; support for the "keyboard-interactive" authentication method...
    - ASP.NET MVC SiteMap provider: MvcSiteMapProvider 2.3.0: Using NuGet? MvcSiteMapProvider is also listed in the NuGet feed. This will be the last release targeting ASP.NET MVC 2 and .NET 3.5; MvcSiteMapProvider 3.0.0 will target ASP.NET MVC 3 and .NET 4. The web.config setting skipAssemblyScanOn has been deprecated in favor of excludeAssembliesForScan and includeAssembliesForScan. ISiteMapNodeUrlResolver is now completely responsible for generating th... (truncated in source)
    - Media Companion: Media Companion 3.400: Extract the entire archive to a folder which has user access rights, e.g. desktop, documents etc. A manual is included to get you started.
    - Windows Media Player GNTP Plugin: WMP-GNTP v1.0.5: This is the installer for WMP-GNTP. Install it to get the plugin on your system.
    - Google Geo Kit: Static Google Map WinForm Control Nightly Build: 12/22/2010, MD5sum b8118c9970d6dc9480fe7c41f042537f. Added event OnGMapNotDefined: when anything goes wrong inside StaticGmap and a null stream is returned, this event will be fired.
    - Silverlight Sockets Sample: No binaries: Whole source code in a ZIP. Shame the 'Source Code' tab isn't working, so I'll just upload a ZIP.

    New Projects

    - AsfMojo: AsfMojo is an open source .NET ASF parsing library, providing support for parsing WMA audio and WMV video files. It offers classes to create streams from packet data within a media file, gather file statistics and extract audio segments or frame-accurate still frames.
    - asp.net ajax file manager: Features: full Ajax support, security, view image, create folder, delete file & folder, copy file & folder, move file & folder, multiple selection (Ctrl + Select), easy installation and configuration, open source. Compatible with IE7, IE8, FF, Safari and Chrome. http://www.filemanager.3ntar.net
    - BoxNet: BoxNet is an open source library which will give Windows Phone 7 developers the possibility to integrate with DropBox.
    - C# Sqlite For WP7: C# Sqlite port for Windows Phone 7 and possibly Silverlight 3 and 4. The core engine was slightly modified to be used with IsolatedStorage, and SqliteClient was ported using missing code from the Mono project in order to maximize usability and portability from the desktop.
    - Comparison of Managed Compression Algorithms: Visual Studio 2010 console test harness with samples of MiniLZO, QuickLz, SharpZipLib, iRolz and other managed compression classes. Tests compression of a 2MB Word file and shows timings. Useful in deciding what type of compression to use. Developers can easily add their own.
    - Cypher Bot 2011: Cypher Bot 2011 brings the word cipher to a whole new level. You can now easily open, save, print, and send ciphers. Make a short message or completely encrypt a document. The cipher is impossible to figure out unless you have the keyword and algorithm to solve it! Try it out!
    - Directed Graph for .NET: This project presents a simple directed graph implementation for the .NET framework using the C# language.
    - DynamicAccess: DynamicAccess is a library to aid connecting DLR languages such as IronPython and IronRuby to non-dynamic languages like managed C++. It also fills in some gaps in the current C# support of dynamic objects, such as member access by string and deletion of members or indexes.
    - Euro for Windows XP: A simple tool and sample to change the Estonian currency from Estonian krone (kr) to euro (€). Applies to all versions of Windows; builds against .NET 2.0 by default. The sample creates a custom locale and updates existing users through the Registry.
    - Excel PowerShell Console: An Excel AddIn to enable using PowerShell for Excel automation.
    - fantastic: fantastic
    - Gomarket Toolbar: GoMarket Firefox toolbar
    - js: js
    - Living Agile's Common Framework: This project is a collection of commonly used routines here at Living Agile. There are a lot of helpful extension methods, logging, configuration and threading utilities. All other Living Agile projects have a dependency on this project.
    - MapWindow 4: MapWindow4 is a free Windows GIS application built around an ActiveX control that can be embedded into many applications that don't support .NET, such as Excel, Access, Visual Basic 6, or other pre-.NET languages.
    - Mdelete API: Delete all files and directories in the Windows shell. Supports long paths (up to 32,000 characters) and network paths (e.g. \\server\share or \\127.0.0.1\share).
    - OP-Code SyntaxEditor: OP-Code SyntaxEditor is a Windows Forms control, similar to a multi-line TextBox, which syntax-highlights text and provides some features for code editing.
    - passion: passion
    - SharpHighlighter: SharpHighlighter is an extension for Visual Studio; a fairly simple code highlighter for C#. I made it for anyone who wants to download and learn how to create their own parser or syntax highlighter. This sample will highlight only C# classes and structs, but it's quite extensible.
    - Test Center Locator: Test Center Locator makes it easier to find the closest TOEFL or IELTS test center.

    Read the article

  • Enigmail - how to encrypt only part of the message?

    - by Lukasz Zaroda
    When I confirmed my OpenPGP key on Launchpad I got a mail from them that was only partially encrypted with my key (only a few paragraphs inside the message). Is it possible to encrypt only a chosen part of a message with Enigmail? Or what would be the easiest way to accomplish it? Added #1: I found a pretty convenient way of producing ASCII-armoured encrypted messages by using the Nautilus interface (useful for those who for some reason don't like working with the terminal). You need to install the Nautilus-Actions Configuration Tool, and add there a script with a name e.g. "Encrypt in ASCII" and parameters:

    path: gpg
    parameters: --batch -sear %x %f

    The trick is that now you can create a file whose extension is the name of your recipient, fill it with your message, right-click it in Nautilus, choose "Encrypt in ASCII", and you will have an encrypted ASCII file whose contents you can (probably) just copy into your message. But if anybody knows a more convenient solution, please share it. Added #1B: In the above case, if you care about the security of your messages, it's worth turning off the invisible backup files that gedit creates every time you create a new document, or just remembering to delete them.

    Read the article

  • Should mock objects for tests be created at a high or low level?

    - by Danack
    When creating unit tests, what is the best way to create mock objects that provide data to other objects? Should they be created at a 'high level', intercepting calls as soon as possible, or at a 'low level', so that as much of the real code as possible is still called? e.g. I'm writing a test for some code that requires a NoteMapper object that allows Notes to be loaded from the DB.

        class NoteMapper {
            function getNote($sqlQueryFactory, $noteID) {
                // Create an SQL query from $sqlQueryFactory
                // Run that SQL
                // if null
                //     return null
                // else
                //     return new Note($dataFromSQLQuery)
            }
        }

    I could mock this object at a high level by creating a mock NoteMapper object, so that there are no calls to the SQL at all, e.g.

        class MockNoteMapper {
            function getNote($sqlQueryFactory, $noteID) {
                // $mockData = {'Test Note title', "Test note text"}
                // return new Note($mockData);
            }
        }

    Or I could do it at a very low level, by creating a MockSQLQueryFactory that instead of actually querying the database just provides mock data back, and passing that to the current NoteMapper object. It seems that creating mocks at a high level would be easier in the short term, but that in the long term doing it at a low level would be more powerful and would possibly allow more automation of tests, e.g. by recording data in and out of a DB and then replaying that data for tests. Is there a recommended way of creating mocks? Are there any hard and fast rules about which are better, or should both be used where appropriate?
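
    To make the two options concrete, here is the same idea translated into C# for illustration (all type names are hypothetical): the high-level mock replaces the mapper wholesale, while the low-level fake sits underneath the real mapper so its logic still runs.

        using System.Collections.Generic;

        class Note
        {
            public string Title;
            public string Text;
            public Note(string title, string text) { Title = title; Text = text; }
        }

        // The seam the NoteMapper uses to reach the database.
        interface ISqlQueryFactory
        {
            IDictionary<string, string> Run(int noteId);
        }

        interface INoteMapper
        {
            Note GetNote(ISqlQueryFactory factory, int noteId);
        }

        // Real mapper: its mapping logic is what a low-level test exercises.
        class NoteMapper : INoteMapper
        {
            public Note GetNote(ISqlQueryFactory factory, int noteId)
            {
                var row = factory.Run(noteId);
                return row == null ? null : new Note(row["title"], row["text"]);
            }
        }

        // High-level mock: intercepts the call as early as possible, no SQL at all.
        class MockNoteMapper : INoteMapper
        {
            public Note GetNote(ISqlQueryFactory factory, int noteId)
            {
                return new Note("Test Note title", "Test note text");
            }
        }

        // Low-level fake: canned rows, so the real NoteMapper still runs.
        class FakeSqlQueryFactory : ISqlQueryFactory
        {
            public IDictionary<string, string> Run(int noteId)
            {
                return new Dictionary<string, string>
                {
                    { "title", "Test Note title" },
                    { "text", "Test note text" }
                };
            }
        }

    A test that calls new NoteMapper().GetNote(new FakeSqlQueryFactory(), 1) covers the mapping logic itself; a test that uses MockNoteMapper only covers the mapper's callers.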

    Read the article

  • Implementing automatic navigation mesh generation for a 2D top-down map?

    - by J2V
    I am currently in the middle of implementing A* pathfinding for enemies. In order to implement the actual A* logic, I need a navigation mesh for my map. I am working on a 2D top-down RPG map. The world is static, meaning there is no requirement for dynamic runtime mesh generation. My world objects are pixel based, not tile based, and have associated data with them such as scale, rotation, origin etc. I will obviously need some vertex data generated from my world objects; maybe generate polygons from color data? I could create a colormap of the objects for my whole map, but I have no idea how to begin creating nav mesh polygons. What would actual navigation mesh generation look like with this kind of available information? Can anyone maybe point to some great resources? I have looked into some 3D nav mesh tools, but they seem overly complex for my situation and also have a lot of their required data available from models. Thanks a lot in advance! I have been trying to get my head around it for some time now.
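
    For a static 2D map, one common starting point - sketched below in C#, and assuming you can first rasterize your colormap into a walkable/blocked grid - is to greedily merge walkable cells into rectangles; rectangles that share an edge then become the connected nodes the A* search runs over. This is just one possible approach:

        using System.Collections.Generic;
        using System.Drawing;

        static class NavMeshBuilder
        {
            // Greedy rectangle decomposition of the walkable area.
            public static List<Rectangle> BuildRects(bool[,] walkable)
            {
                int w = walkable.GetLength(0), h = walkable.GetLength(1);
                var used = new bool[w, h];
                var rects = new List<Rectangle>();
                for (int y = 0; y < h; y++)
                {
                    for (int x = 0; x < w; x++)
                    {
                        if (!walkable[x, y] || used[x, y]) continue;
                        // Grow as wide as possible on this row...
                        int width = 0;
                        while (x + width < w && walkable[x + width, y] && !used[x + width, y]) width++;
                        // ...then as tall as possible at that width.
                        int height = 1;
                        while (y + height < h && RowFree(walkable, used, x, y + height, width)) height++;
                        for (int dy = 0; dy < height; dy++)
                            for (int dx = 0; dx < width; dx++)
                                used[x + dx, y + dy] = true;
                        rects.Add(new Rectangle(x, y, width, height));
                    }
                }
                return rects;
            }

            static bool RowFree(bool[,] walkable, bool[,] used, int x, int y, int width)
            {
                for (int dx = 0; dx < width; dx++)
                    if (!walkable[x + dx, y] || used[x + dx, y]) return false;
                return true;
            }
        }

    Since the world is static, this decomposition can run offline at whatever resolution you rasterize to, and only the resulting rectangles and their adjacency graph need to ship with the map.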

    Read the article

  • Sales and Procurement Contracts 12.1.3++ Release Information

    - by LuciaC
    New functionality has been released for Sales and Procurement Contracts in a new patch: Contracts 12.1.3++: Patch 13877401: 12.1.3 Rollup for Oracle Contracts Core. The new functionality includes:
    - APIs for import of Contract Templates, Contract Expert rules, Questions and Constants: The three APIs are as follows: API for Templates, API for Rules, and API for Questions and Constants. These can be used both to create entities and to update existing templates and rules. The APIs will display error and warning messages which can be processed and analyzed by the customer.
    - Ability to apply multiple templates to a Sourcing, Procurement or Sales document: The buyer can select and add multiple templates to a quote, sales agreement document, sourcing or purchasing document.  All the clauses and deliverables from the new templates are synchronized with the document. The Contract Expert rules are from the original template. The buyer can also view the list of templates that are added to any sales or procurement document.
    - Ability to define multi-row variables: You can create user-defined manual variables that are tables containing one row per line or multiple rows. Contract Preview will print the variable values according to the layout defined for the variable. These variables are not available for Contract Expert Rules and Supplier.
    - Enhancement to suggested sections for clauses by Contract Expert: You can associate multiple default sections with a clause. A clause is associated with multiple values of any system variable, and for each such value a section name is associated in the Contracts Terms Library. When Contract Expert is run in the contract authoring flow, the clause is automatically placed in the associated section name.
    Plus many more new features. Read the following notes for details on all the new and changed functionality:
    - Oracle Procurement Contracts Release Notes, Release 12.1.3++ (Doc ID 1467140.1)
    - Oracle Sales Contracts Release Notes, Release 12.1.3++ (Doc ID 1467149.1)
    - Oracle E-Business Suite Releases 12.1 and 12.2 Release Content Documents (Doc ID 1302189.1)

    Read the article

  • Best practices: Ajax and server side scripting with stored procedures

    - by Luka Milani
    I need to rebuild an old huge website, probably porting everything to ASP.NET and jQuery, and I would like to ask for some suggestions and tips. Currently the website uses: Ajax (client side with prototype.js), ASP (VBScript server side), SQL Server 2005, and IIS 7 as the web server. This website uses hundreds of stored procedures, and the requests are made by an Ajax call to a single ASP page that contains a huge select case. Shortly, an example:

    JAVASCRIPT + PROTOTYPE:

        var data = {
            action: 'NEWS',
            callback: 'doNews',
            param1: $('text_example').value,
            ......: ..........
        };
        AjaxGet(data); // perform a call using another function + prototype

    SERVER SIDE ASP:

        <%
        ......
        select case request("Action")
            case "NEWS"
                With cmmDB
                    .ActiveConnection = Conn
                    .CommandText = "sp_NEWS_TO_CALL_for_example"
                    .CommandType = adCmdStoredProc
                    Set par0DB = .CreateParameter("Param1", adVarchar, adParamInput, 6)
                    Set par1DB = .CreateParameter(".....", adInteger, adParamInput)
                    ' ........ there can be more parameters
                    .Parameters.Append par0DB
                    .Parameters.Append par1DB
                    par0DB.Value = request("Param1")
                    par1DB.Value = request(".....")
                    set rs = cmmDB.execute
                    RecodsetToJSON rs, jsa ' create JSON response using a sub
                End With
        ....
        %>

    So as you can see I have an ASP page with a lot of CASEs, and this page answers all the Ajax requests on the site. My questions are: Instead of having many CASEs, is it possible to create dynamic VB code that parses the Ajax request and dynamically creates the call to the desired SP (also passing along the parameters supplied by JS)? What is the best approach to handle situations like this, using the advantages of .NET + prototype or jQuery? How do the big sites handle situations like this? Do they create one page per request? Thanks in advance for suggestions, directions and tips.
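
    For what it's worth, in ASP.NET the giant select case can collapse into a single generic handler: keep a whitelist of allowed actions (so clients cannot invoke arbitrary procedures) and let ADO.NET discover each procedure's parameters. A rough sketch, assuming System.Data.SqlClient and placeholder names throughout:

        using System.Collections.Generic;
        using System.Data;
        using System.Data.SqlClient;
        using System.Web;

        public class ActionHandler : IHttpHandler
        {
            // Whitelist: one entry per action, instead of one CASE per action.
            static readonly Dictionary<string, string> Actions =
                new Dictionary<string, string>
                {
                    { "NEWS", "sp_NEWS_TO_CALL_for_example" }
                    // ... add the remaining actions here
                };

            public void ProcessRequest(HttpContext context)
            {
                string spName;
                if (!Actions.TryGetValue(context.Request["Action"] ?? "", out spName))
                {
                    context.Response.StatusCode = 400; // unknown action
                    return;
                }
                using (var conn = new SqlConnection("...connection string..."))
                using (var cmd = new SqlCommand(spName, conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    conn.Open();
                    // Ask SQL Server for the procedure's parameters,
                    // then fill them from the request by name.
                    SqlCommandBuilder.DeriveParameters(cmd);
                    foreach (SqlParameter p in cmd.Parameters)
                    {
                        if (p.Direction != ParameterDirection.Input) continue;
                        object value = context.Request[p.ParameterName.TrimStart('@')];
                        p.Value = value ?? System.DBNull.Value;
                    }
                    var table = new DataTable();
                    new SqlDataAdapter(cmd).Fill(table);
                    context.Response.ContentType = "application/json";
                    context.Response.Write(ToJson(table)); // plug in your JSON serializer
                }
            }

            static string ToJson(DataTable table)
            {
                // Placeholder: use JavaScriptSerializer, JSON.NET, or your own sub.
                return "[]";
            }

            public bool IsReusable { get { return true; } }
        }

    DeriveParameters costs an extra round trip per call, so in practice you would cache the parameter set per procedure; the whitelist is what keeps this from becoming an open door to every stored procedure in the database.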

    Read the article

  • How can I strip down Ubuntu?

    - by Thomas Owens
    I'm trying to fix what I consider a bloated install of Ubuntu. When I install Ubuntu on a machine, I get things that I don't want - web browsers, office applications, media players, accessibility utilities, Ubuntu One, and so on. My goal is to create an install of Ubuntu that contains only the most minimal packages - the administrative tools and package manager, a GUI (my preference would be GNOME), a text editor, core drivers (video cards, network cards - wired and wireless, input devices), and anything else that I have to have to run a stable distribution. From there, I would like to pick and choose which packages I install to create my own customized system. Having played around with other distros like Arch and Slackware, I like how they provide a barebones install by default. However, I get trapped in "configuration hell" - right now, I tried moving away from Ubuntu to Arch, but after spending 6 hours with it, I still don't have a usable system. It's half configured and I don't have any usable software packages to enable me to work. Is there anything available that can help me? Either something like the openSUSE builder that lets you choose applications and packages for the CD, an advanced installation mode where I can choose the packages to install and which to ignore, or a guide on how to strip Ubuntu down to its bare bones? And I suppose a natural follow-up to this: once I have a stripped-down Ubuntu, will this affect updating at all? When Canonical releases the next version of Ubuntu, I don't want any bloatware reinstalled. And yes, most of the applications that come with Ubuntu, I simply don't use. Ever.

    Read the article

  • Write DAX queries in Report Builder #ssrs #dax #ssas #tabular

    - by Marco Russo (SQLBI)
    If you use Report Builder with Reporting Services, you can use DAX queries even if the editor for the Analysis Services provider does not support DAX syntax. In fact, the DMX editor that you can use in the Visual Studio editor of Reporting Services (see a previous post on that) is not available in Report Builder. However, as Sagar Salvi commented in this Microsoft Connect entry, you can use the DAX query text in the query of a Dataset by using the OLE DB provider instead of the Analysis Services one. I think it's a good idea to show the steps required. First, create a Data Source using the OLE DB connection type, and provide in the connection string the provider (Provider), the server name (Data Source) and the database name (Initial Catalog), such as:

    Provider=MSOLAP;Data Source=SERVERNAME\TABULAR;Initial Catalog=AdventureWorks Tabular Model SQL 2012

    Then, create a Dataset using the data source previously defined, select the Text query type, and write the DAX code in the Query pane. You can also use the Query Designer window, which doesn't provide any particular help in writing the DAX query, but at least can show a preview of the result of the query execution. I hope DAX will get better editors in the future… in the meantime, remember you can use DAX Studio to write and test your DAX queries, and DAX Formatter to improve their readability! If you want to learn the DAX query language, I suggest watching my video Data Analysis Expressions as a Query Language on Project Botticelli!
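
    The same OLE DB trick works from .NET code as well, which can be handy for testing a DAX query outside Report Builder before pasting it into the Dataset. A minimal C# sketch, reusing the connection string above (server and database names are the example's; the DAX query is just a placeholder):

        using System;
        using System.Data;
        using System.Data.OleDb;

        class DaxQueryDemo
        {
            static void Main()
            {
                const string connStr =
                    "Provider=MSOLAP;Data Source=SERVERNAME\\TABULAR;" +
                    "Initial Catalog=AdventureWorks Tabular Model SQL 2012";
                const string dax = "EVALUATE 'Date'"; // any DAX query returning a table

                using (var conn = new OleDbConnection(connStr))
                using (var cmd = new OleDbCommand(dax, conn))
                {
                    conn.Open();
                    var table = new DataTable();
                    new OleDbDataAdapter(cmd).Fill(table);
                    Console.WriteLine("{0} rows, {1} columns",
                        table.Rows.Count, table.Columns.Count);
                }
            }
        }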

    Read the article

  • Mailbox move issue from Exchange 2003 to Exchange 2010

    - by Ryan Roussel
    Today while moving mailboxes between Exchange 2003 and Exchange 2010, I hit an issue with a couple of mailboxes. These mailboxes all popped access denied errors, or more exactly: "Insufficient Access Rights to perform the operation." The cause was similar to the mail flow issue in that inheritable permissions were not turned on for the user object in Active Directory. This also presented its own unique problem: since the initial move request failed because of permissions, it had to be cleared before a new move request could be created. On top of that, the request did not show up in the EMC. I used the following process to clear the request, assign permissions, then create a new request:

    1. First you need to know the ExchangeGUID of the mailbox for the Remove-MoveRequest command. To quickly get the GUID for a mailbox, simply run Get-Mailbox for the user and inspect its ExchangeGUID property.

    2. Next we need to clear out the move request using PowerShell by running:

        [PS] c:\>Remove-MoveRequest -MoveRequestQueue "mailbox database 1030639620" -MailboxGuid 8525686f-d4d3-42b7-92f1-46d77ea841a3

    3. Then re-establish inheritable permissions. This can be done in AD Users and Computers: switch to View Advanced Features, then under the Security tab of the object, click Advanced and check "allow inheritable permissions of parent to propagate to this object".

    4. Once the inheritable permissions are restored, we need to create a new move request. NOTE: The EMC can also be used to initiate the move request once the permissions are corrected.

        [PS] c:\>New-MoveRequest -Identity jyoung -BadItemLimit 100 -TargetDatabase "mailbox database 1030639620"

    And that's it. The mailbox should move over smoothly with no access denied error.

    Read the article

  • Managed Service Architectures Part I

    - by barryoreilly
    Instead of thinking about service oriented architecture, a concept that is continually defined, redefined, abused and mistreated, perhaps it is time to drop the acronym and consider what we actually need to get the job done. 'Pure' SOA involves the modeling of an organisation's processes, the so-called 'top down' approach, followed by the implementation of these processes as services. Another approach, more commonly seen in the wild, is the bottom up approach. This usually involves services that simply start popping up in the organization, and SOA in this case is often just an attempt to rein in these services. Such projects, although described as SOA projects for a variety of reasons, have clearly little relation to process driven architecture. Much has been written about these two approaches, with many deciding that a hybrid of both methods is needed to succeed with SOA. These hybrid methods are a sensible compromise, but one gets the feeling that there is too much focus on 'succeeding with SOA'. Organisations who focus too much on bottom up development, or who waste too much time and money on top down approaches that don't produce results, are often recommended to attempt an 'agile' (Erl) or 'middle-out' (Microsoft) approach in order to succeed with SOA. The problem with recommending this approach is that, in most cases, succeeding with SOA isn't the aim of the project. If a project is started with the simple aim of 'succeeding with SOA' then the reasons for the project's existence probably need to be questioned. There are a number of things we can be sure of:
    - An organisation will have a number of disparate IT systems
    - Some of these systems will have redundant data and functionality
    - Integration will give considerable ROI
    - Integration will already be under way
    - Services will already exist in the organisation
    - These services will be inconsistent in their implementation and in their governance
    So there are three goals here:
    1. Alignment between the business and IT
    2. Integration of disparate systems
    3. Management of services
    2 and 3 are going to happen; in fact they must happen if any degree of return is expected from the IT department. Ignoring 1 is considered a typical mistake in SOA implementations, as it ignores the business implications. However, the business implication of this approach is the money saved in more efficient IT processes. 2 and 3 are ongoing, and they will continue happening even if a large project to produce a SOA metamodel is started. The result will then be an unstructured gaggle of services, and a metamodel that is already going out of date. So we get stuck in and rebuild our services so that they match the metamodel, with the far-reaching consequences that this will have on all our current LOB systems. Let's imagine that this actually works (how often do we rip and replace working software because it doesn't fit a certain pattern? Never - that's the point of integration). We will now be working with a metamodel that is out of date, and most likely incomplete if the organisation is large. Accepting that an object can have more than one model over time, with perhaps more than one model being valid at any given time, will help us realise the limitations of the top down model. It is entirely normal, and perhaps necessary, for an organisation to be able to view an entity from different perspectives.
So, instead of trying to constantly force these goals into a straight line, why not let them happen in parallel, and manage the changes in each layer. If company A has chosen to model their business processes and create a business architecture, there will be a reason behind this. Often the aim is to make the business more flexible and able to cope with change, through alignment between the business and the IT department. If company B's IT department recognizes the problem of wild services springing up everywhere, and decides to do something about it by designing a platform and processes for the introduction of services, is this not a valid approach? With the hybrid approach, it is recommended that company A begin deploying services as quickly as possible, based on models that are clearly incomplete and which will therefore change rapidly and often in the near future. Natural business evolution will also mean that the models can be guaranteed to change in the not so near future. To 'succeed with SOA', company B needs to go back to the drawing board and start modeling processes and objects. So, in effect, we are telling business analysts to start developing code based on a model they are unsure of, and telling programmers to ignore the obvious and growing problems in their IT department and start drawing lines and boxes. Could the problem be that there are two different problem domains? And the whole concept of SOA as it is being described by clever salespeople today creates an example of the oft-dreaded 'tight coupling' between these two domains? Could it be that we have taken two large problem areas, and bundled the solution together in order to create a magic bullet? And then convinced ourselves that the bullet actually exists? Company A wants to have a closer relationship between the business and its IT department, in order to become a more flexible organization. Company B wants to decrease the maintenance costs of its IT infrastructure. If both companies focus on succeeding with SOA, then they aren't focusing on their actual goals. If company A starts building services from incomplete models, without a gameplan, they will end up in the same situation as company B, with wild services. If company B focuses on modeling, they could easily end up with the same problems as company A. Now we have two companies, each of which had one problem a short while ago and now has two. This has happened because of a focus on 'succeeding with SOA', rather than solving the problem at hand. This is not to suggest that the two problem domains are unrelated; a strategy that encompasses both will obviously be good for the organization. But only if the organization realizes this and can develop such a strategy. This strategy cannot be bought in a box. Anyone who has worked with SOA for a while will be used to analyzing the solutions to a problem and judging the solution's level of coupling. If we have two applications that each perform separate functions, but need to communicate with each other, we create an integration layer between them, perhaps with a service, but we do all we can to reduce the dependency between the two systems. Using the same approach, we can separate the modeling (business architecture) and the service hosting (technical architecture). The business architecture describes the processes and business objects in the business domain. The technical architecture describes the hosting, management and implementation of services.
The glue that binds these together, the integration layer in our analogy, is the service contract, where the operations map the processes to their technical implementation, and the messages map business concepts to software objects in the implementation. If we reduce the coupling between these layers, we should be able to allow developers to develop services, and business analysts to develop models, without the changes rippling through from one side to the other. This would allow company A to carry on modeling, and company B to develop a service platform, each achieving their intended goal, without necessarily creating the problems seen in pure top down or bottom up approaches. Company B could then at a later date map their service infrastructure to a unified model, and company A could carry on modeling, insulating deployed services from changes in the ongoing modeling. How do we do this? The concept of service virtualization has been around for a while, and is instantly realizable in Microsoft's Managed Services Engine. Here we can create a layer of virtual services, which represent the business analyst's view, presenting uniform contracts to the outside world. These services can then transform and route messages to the actual service implementations. I like to think of the virtual services with their beautifully modeled interfaces as 'SOA services', and the implementations as simple integration 'adapter' services providing an interface to a technical implementation. The Managed Services Engine also provides policy-based control over services, regardless of where they are deployed, simplifying handling of security, logging, exception handling etc. This solves a big problem. The pressure to deliver services quickly is always there in projects. It is very important to quickly show value when implementing service architectures. There is also pressure to deliver quality, and you can't easily do both at the same time. This approach allows quick delivery with quality increasing over time, allowing modeling and service development to occur in parallel and independently of each other. The link between business modeling and service implementation is not one that is obvious to many organizations, and requires a certain maturity to realize and drive forward. It is also completely possible that a company can benefit from one without the other, and even if this approach is frowned upon today, there are many companies doing so and seeing ROI. Of course there are disadvantages to this. The biggest one is the transformations necessary between the virtual interfaces and the service implementations. Bad choices in developing the services in the service implementation could mean that it is impossible to map the modeled processes to the implementation without redevelopment of the service. In many cases the architect will not have a choice here anyway, as proprietary systems are often delivered with predeveloped services. The alternative is to wait until the model is finished and then build the services according to the model. However, if that approach worked we wouldn't be having this discussion! And even when it does work, natural business evolution will mean that the two concepts (model and implementation) will immediately start to drift away from each other, so coupling them tightly together, forever bound to the model that only applies at the time of the modeling work, will not really achieve a great deal. Architecture is all about trade-offs, and here a choice has to be made.
The choice is between something that will initially be of low quality but will work, or something that may well be impossible to achieve in most situations. In conclusion, top-down is a natural approach for business analysts, and bottom-up is a natural approach for developers. Instead of trying to force something on both that neither want, and which has not shown itself to be successful, why not let them get on with their jobs, and let an enterprise architect coordinate the processes?

    Read the article

  • Example of DOD design (on a generic Zombie game)

    - by Jeffrey
    I can't seem to find a nice explanation of Data Oriented Design for a generic zombie game (it's just an example, a pretty common one). Could you make an example of Data Oriented Design for creating a generic zombie class? Is the following good?

    Zombie list class:

        class ZombieList {
            GLuint vbo;                       // generic zombie vertex model
            std::vector<color> colors;        // objects' default colors
            std::vector<texture> textures;    // objects' textures
            std::vector<vector3D> positions;  // objects' positions
        public:
            unsigned int create(); // return object id
            void move(unsigned int objId, vector3D offset);
            void rotate(unsigned int objId, float angle);
            void setColor(unsigned int objId, color c);
            void setPosition(unsigned int objId, vector3D pos);
            void setTexture(unsigned int objId, texture t);
            ...
            void update(Player*); // move towards player, attack if near
        };

    Example:

        Player p;
        ZombieList zl;
        unsigned int first = zl.create();
        zl.setPosition(first, vector3D(50, 50));
        zl.setTexture(first, texture("zombie1.png"));
        ...
        while (running) { // main loop
            ...
            zl.update(&p);
            zl.draw(); // draw every zombie
        }

    Or would creating a generic World container be a good example? One that contains every action, from bite(zombieId, playerId) to moveTo(playerId, vector) to createPlayer() to shoot(playerId, vector) to face(radians)/face(vector), and contains:

        std::vector<zombie> zombies;
        std::vector<player> players;
        ...
        std::vector<mapchunk> chunks;
        ...
        std::vector<vbobufferid> player_run_animation;
        ...

    What's the proper way to organize a game with DOD?

    Read the article

  • HLSL - Creating Shadows in 2D

    - by richard
    The way that I create shadows is with the following technique: http://www.catalinzima.com/2010/07/my-technique-for-the-shader-based-dynamic-2d-shadows/ But I have questions about the HLSL. The way that I currently do it is: I have a black and white image, where black means 'object' and white means 'nothing'. I then distort the image like in the tutorial. I do this with a pixel shader, but instead of rendering to the screen, I render to a texture, back to my application. I then take this and create the shadows, and then send it back to the graphics card to undo the distortion after the shadow has been added - this comes back and I have a stencil of shadow. I can put this on top of the original image and send them back to the graphics card, which then puts them on the screen. To me this is a lot of back and forth. Is there a way I can avoid this? The problem that I am having is that I need to basically go through all positions in the texture 3 times, and use the new texture every time instead of the original one. I tried to read up on passes, but I don't think that I am heading in the right direction there. Help?
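
    If the engine exposes XNA 4.0-style render targets, the whole chain can stay on the GPU: render each pass into a RenderTarget2D and feed that target straight into the next pass as a texture, never pulling pixels back to the CPU. A rough sketch, assuming an XNA 4.0 API and placeholder effect names:

        using Microsoft.Xna.Framework;
        using Microsoft.Xna.Framework.Graphics;

        static class ShadowPasses
        {
            // One full-screen pass: draw `source` into `target` through `effect`.
            static Texture2D Pass(GraphicsDevice device, SpriteBatch batch,
                                  Effect effect, Texture2D source, RenderTarget2D target)
            {
                device.SetRenderTarget(target);
                device.Clear(Color.Transparent);
                batch.Begin(SpriteSortMode.Immediate, BlendState.Opaque,
                            null, null, null, effect);
                batch.Draw(source, Vector2.Zero, Color.White);
                batch.End();
                device.SetRenderTarget(null);
                return target; // a RenderTarget2D is also a Texture2D
            }

            // Distort -> build shadows -> undo distortion, all on the GPU.
            public static Texture2D BuildShadowStencil(
                GraphicsDevice device, SpriteBatch batch, Texture2D occluders,
                Effect distort, Effect shadow, Effect undistort,
                RenderTarget2D rtA, RenderTarget2D rtB)
            {
                var distorted = Pass(device, batch, distort, occluders, rtA);
                var shadowed  = Pass(device, batch, shadow, distorted, rtB);
                return Pass(device, batch, undistort, shadowed, rtA); // rtA is free again
            }
        }

    The only CPU-GPU traffic left is the draw calls themselves; the intermediate textures never leave video memory.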

    Read the article

  • What resources are there for creating a dedicated NES emulator box?

    - by normalocity
    Where do I start, and what communities should I get involved in, in order to achieve the following? Ideally, I'd like to have a box that does the following (it doesn't have to do this out of the box; I'm just looking to be able to achieve these goals through configs and the necessary dependencies):
    - Either bypasses login, or auto-logs in
    - Auto-starts FCEUX with options that will (a) automatically start a ROM of my choosing, and (b) go into full-screen mode. You can assume that before I get that far, I've already configured the input devices and video options.
    - I'd like to create (or install, if it exists) a full-screen app that takes a list of ROMs, allows me to select one with a gamepad/arcade stick, and lets me press a button to open that game
    - Be able to map a button on a gamepad/arcade stick to the "power off" or exit function of the emulator, such that it will take me back to the ROM selection screen
    I've already successfully installed FCEUX and tested it with an arcade stick I own, so I'm not looking for an emulator installation guide. I don't know if the ROM selector app exists already, but I'm a Java developer, and could probably create one (so long as it's not too difficult to support controllers - I was thinking of using Slick2D for this, a gaming library that I'm already pretty familiar with). The goal would be a dedicated box that I have connected to my TV. I power it on. It boots up and starts the ROM selection app, which passes the proper parameters to FCEUX (or another emulator that I might switch to at a later time), and I'm ready to go. Basically an NES emulator as a real, living room console. Also, as far as mapping a controller button to functions in the app: I've also played around with hardware, and it would be pretty trivial for me to modify a gamepad to trigger key presses. I just don't want to go to that length if it's not necessary.

    Read the article

  • Implications of automatically "open" third party domain aliasing to one of my subdomains

    - by Giovanni
    I have a domain, let's call it www.mydomain.com, where I have a portal with an active community of users. In this portal users cooperate, wiki-style, to build some "kind of software". These software applications can then be run by accessing "public.mydomain.com/softwarename". I then want to let my users run these applications from their own subdomains. I know I can do that by automatically modifying the .htaccess file. This is not a problem. I want to let these users create DNS aliases so that they can access one specific subdomain. So if a user "pippo" that owns "www.pippo.com" wants to run the software HelloWorld from his own subdomain, he has to:
    1. Register on my site
    2. Create his own subdomain on his own site, run.pippo.com
    3. From his DNS control panel, create a CNAME record "run.pippo.com" pointing to "public.mydomain.com"
    4. Type http://run.pippo.com/HelloWorld in a browser
    When the software (which physically runs on my server) is called, it first checks that the originating domain is a trusted one. I don't do any other kind of check that restricts software execution. From an SEO perspective, I care about Google indexing of www.mydomain.com but I don't care about indexing of public.mydomain.com. What are the possible security implications of doing this for my site? Is there a better way to do this, or software that already does this that I can use?
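
    The "trusted domain" check itself can be as simple as comparing the incoming Host header against the aliases users registered with you. A hypothetical ASP.NET sketch (in practice the host list would come from your user database):

        using System;
        using System.Collections.Generic;
        using System.Web;

        public class SoftwareRunner : IHttpHandler
        {
            // In practice, load and cache this from the registration database.
            static readonly HashSet<string> TrustedHosts =
                new HashSet<string>(StringComparer.OrdinalIgnoreCase)
                {
                    "run.pippo.com",
                    "public.mydomain.com"
                };

            public void ProcessRequest(HttpContext context)
            {
                // With a CNAME, the browser sends the alias as the Host header.
                string host = context.Request.Url.Host;
                if (!TrustedHosts.Contains(host))
                {
                    context.Response.StatusCode = 403; // not a registered alias
                    return;
                }
                // ... look up which user owns `host` and run the requested software ...
            }

            public bool IsReusable { get { return true; } }
        }

    Note that the Host header identifies which alias the visitor used, not who linked to you, so this gates which subdomains may serve the software rather than authenticating callers.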

    Read the article

  • Enabling support of EUS and Fusion Apps in OUD

    - by Sylvain Duloutre
    Since the 11gR2 release, OUD supports Enterprise User Security (EUS) for database authentication and also Fusion Apps; I plan to blog about that soon. Meanwhile, the R2 OUD graphical setup does not let you configure both EUS and Fusion Apps support at the same time. However, it can be done manually using the dsconfig command line. The simplest way to proceed is to select EUS in the setup tool, then manually add support for Fusion Apps with dsconfig using the commands below.

    1. Create an FA workflow element with eusWfe as its next element:

        dsconfig create-workflow-element \
            --set enabled:true \
            --set next-workflow-element:Eus0 \
            --type fa \
            --element-name faWfe

    2. Modify the workflow so that it starts from your FA workflow element instead of Eus:

        dsconfig set-workflow-prop \
            --workflow-name userRoot0 \
            --set workflow-element:faWfe

    Note: the configuration changes may differ slightly when multiple databases/suffixes are configured on OUD.

    Read the article

  • Oracle Global HR Cloud Implementation Training Can Help Meet Your Business Needs

    - by HCM-Oracle
    By Jim Vonick

    A key goal for the deployment of your Oracle Global HR Cloud applications is to accelerate the implementation and adoption of your applications, so that your business can start realizing all of the benefits this rich solution offers. Implementation team members need the skills and knowledge to ensure a smooth, rapid, and successful implementation of your applications. During set-up, you want to optimize the configuration to best meet your business needs. In order to do this you need to understand the foundation and configuration options of your applications, so that decisions made during set-up best align with your business. To that end, product-level implementation training is recommended for Oracle Global HR Cloud deployments.

    Training for Implementation Team Members and Consultants:

    - Fusion Applications: HCM Security: Learn how to implement security for Oracle Fusion HCM applications by creating and customizing roles. You'll learn how to create security profiles to restrict data access, provision roles to users, create and manage user accounts, and verify security setup.
    - Fusion Applications: HCM Global Human Resources: Learn how to set up your enterprise and workforce structures, how to perform functional tasks, and how to configure security for Global Human Resources data.
    - Fusion Applications: HCM Compensation: Learn how to implement, configure, and use Oracle Fusion Compensation to manage base pay, individual compensation, workforce compensation, and total compensation statements.
    - Fusion Applications: HCM Benefits: This course teaches you to implement, configure, and manage Oracle Fusion Benefits, including how to implement benefit plans and programs.
    - Fusion Applications: HCM Payroll Implementation (US): This course provides implementation training for payroll managers and payroll administrators. Learn how to process payroll to ensure accurate setup results.

    Learn more: See all Fusion HCM Training.

    Jim Vonick is a Senior Product Manager with Oracle University focusing on training for Oracle Applications and Industry Solutions.

    Read the article

  • Controlling what data populates STAR

    - by user10747017
    Beginning with the Primavera Reporting Database 2.2\P6 Analytics 1.2 release, the first release that supported the P6 Extended Schema, a new ability was added to filter which projects are included during an ETL run. In previous releases, all projects were included in an ETL run; by default, all projects with the publication option enabled are still included. Because the reporting needs for the P6 Extended Schema are different from those of STAR, you can define a filter that limits the data included in the STAR schema. For example, your STAR schema can be filtered to include only the projects in a specific portfolio, or all projects with a project code assignment of 'For Analytics'. Any criteria that can be defined in a WHERE clause and added to a view can be used to filter the projects included in the STAR schema. I highly suggest this approach when dealing with large databases: unnecessary projects can cause the Extract portion of the ETL process to take longer. A table in STAR called etl_projectlist is the key to which projects are targeted during the ETL process. To set up the filter, perform the following steps:

    1. Connect to your Primavera P6 Project Management database as Pxrptuser (the extended schema owner) and create a new view:

        create or replace view star_project_view as
        select PROJECTOBJECTID objectid
        from projectportfolio pp, projectprojectportfolio ppp
        where pp.objectid = ppp.PROJECTPORTFOLIOOBJECTID
        and pp.name = 'STAR Projects';

       The main field that MUST be selected in the view is the projectobjectid. Selecting any other field in its place will make the view invalid for this purpose. Any WHERE clause can be used, but projectobjectid is the key.

    2. In your STAR installation directory, go to the \res folder and edit the staretl.properties file. Here you define the view to be used; add the following line, or update it if it already exists:

        star.project.filter.ds1=star_project_view

    3. When you run the staretl.cmd or staretl.sh process, the database link to Pxrptuser is accessed and the view created in step 1 is used to populate the etl_projectlist table with the appropriate projectobjectids.

    Read the article

  • Excel VBA to CRUD Drupal nodes

    - by Kirk Hings
    We need to periodically migrate Excel report data into Drupal nodes. We looked at replicating some Excel functionality in Drupal with Slickgrid, but it wasn't up to snuff. The Excel reports people don't want to double-enter their data, but their data is important to have in this Drupal site. They have hundreds of Excel reports and update a row in each weekly. We want a button at the end of the row that fires a VBA macro to submit the data to Drupal, where a new node is created from the submitted info. (Yes, we are experienced with both Drupal and VBA; all users and the site are behind our firewall.) We need the new node's nid or URL returned so we can then create a link in Excel directly to that node.

    The site is Drupal 6, using the Services 3.x module. I tried the REST server module, but we can't get it to retrieve data without session authentication turned on, which we can't do from Excel (unless you can?). I also noticed that the 'data' it returned via a browser URL was the info for 14 or 20 nodes, not the one nid requested (example: http://mysite.com/services/rest/report/node/30161). When I attempt to create a simple node from VBA like this:

        Dim endpoint As String
        Dim body As String
        Dim objHTTP As Object

        endpoint = "http://mysite.com/services/rest/report/node"
        body = "node[type]=test&node[title]=testing123&node[field_test_one][0][value]=123"

        Set objHTTP = CreateObject("MSXML2.ServerXMLHTTP")
        With objHTTP
            ' Synchronous POST with the node fields sent as a form-encoded body
            .Open "POST", endpoint, False
            .setRequestHeader "Content-Type", "application/x-www-form-urlencoded"
            .send body
        End With

    I get HTTP Status: Unauthorized: Access denied for user 0 "anonymous" and HTTP Response: null. Everything I search for has examples in PHP or Java, nothing in VBA. I also tried switching to an XML-RPC server, but that's even more confusing. We would like JSON (we used application/json and set the formatter accordingly in the REST server settings), but will use anything that works. Ideas? Thanks in advance!

    Read the article

  • 2013 U.S. GAAP Financial Reporting Taxonomy Available for Public Review and Comment

    - by Theresa Hickman
    FASB recently released the proposed 2013 U.S. GAAP Reporting Taxonomy. Comments are due October 29, 2012, and the taxonomy is to be finalized and published in early 2013. The proposed 2013 U.S. GAAP taxonomy and instructions on how to submit comments are available at the FASB's XBRL page. In previous blog entries, I talked about how Oracle Hyperion Disclosure Management supports the latest taxonomy, enabling financial managers to easily comply with the latest filing requirements. The taxonomy is a list of computer-readable tags in XBRL that allows companies to annotate the voluminous financial data included in typical long-form financial statements and related footnote disclosures. The tags allow computers to automatically search for, assemble, and process data so it can be readily accessed and analyzed by investors, analysts, journalists, and regulators.

    You do not have to have Oracle Hyperion Financial Management, used for consolidating financial results, to generate XBRL. You just need Oracle Hyperion Disclosure Management to generate XBRL instance documents from financial applications such as Oracle E-Business Suite, Oracle PeopleSoft, Oracle JD Edwards EnterpriseOne, and Oracle Fusion General Ledger. To generate XBRL tags and complete SEC filings using your existing financial applications with Oracle Hyperion Disclosure Management, follow these steps:

    1. Download the XBRL taxonomy from the SEC or XBRL website into Hyperion Disclosure Management to create a company taxonomy.
    2. Publish financial statements from the general ledger to Microsoft Excel or Microsoft Word.
    3. Create the SEC filing in the Microsoft programs and perform the XBRL tag mapping in Oracle Hyperion Disclosure Management.
    4. Ensure that the SEC filing meets XBRL and SEC EDGAR Filer Manual validation requirements.
    5. Validate and submit the company taxonomy and XBRL instance document to the SEC.

    Get more details about Oracle Hyperion Disclosure Management.

    Read the article

  • What data to send when tracking clicks with Google Analytics events (and how)?

    - by user359650
    When tracking clicks on links, there are three items I'm interested in:

    1. Link location in the page, grabbed from the id of the closest parent (to see the influence of location on click-through).
    2. Link text (to see the influence of text on click-through).
    3. Link href attribute value (to see where people go when leaving my website).

    The problem when using Google Analytics to track those clicks is that events have only three available text fields, one of which is the category. If you use the category to store one of the above items, it will create a mess in your Event reporting, because you will have as many categories as item values. Therefore, if you assign a predefined value to the category (e.g. clicks), you're left with only two event fields (action, label) to store three items (location, text, href). That in itself isn't the end of the world, because you can concatenate two items into one event field, then use the reporting or the API to filter things out. Accordingly, what I plan on doing is this:

        category: clicks
        action: {location_on_page} ¦ {text}
        label: {href}

    where {__} are variable values related to the clicked links (a sketch of the tracking call follows below). With this I can easily create some reports directly via the GUI:

    - Downloads: include only events where the label ends with .pdf.
    - Click-outs to particular domains: include only events where the label contains the domain.

    For more complex tasks I need to export the data (or use the API). For example, to measure the influence of location on clicks: for each location in the design, count the number of events that have that location in the action, then corroborate with pageviews of the corresponding pages. Whilst this looks good, I'm wondering if there is a better approach, hence the following questions:

    Q1: Can you foresee any particular issues with this particular setup (e.g. things I won't be able to report on)?
    Q2: Can you think of other data that would be interesting to include in the event?
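
    A minimal sketch, in TypeScript, of what the tracking call above could look like with the classic ga.js asynchronous queue. The helper names are hypothetical; _gaq is the queue set up by the standard Google Analytics snippet, and _trackEvent takes category, action, and an optional label:

        declare const _gaq: unknown[][]; // provided by the GA tracking snippet

        // Walk up from the link to find the nearest ancestor with an id.
        function closestParentId(el: HTMLElement | null): string {
          while (el) {
            if (el.id) return el.id;
            el = el.parentElement;
          }
          return 'none';
        }

        // Fire the event using the category/action/label scheme described above.
        function trackLinkClick(link: HTMLAnchorElement): void {
          const location = closestParentId(link.parentElement);
          const text = (link.textContent || '').trim();
          _gaq.push(['_trackEvent', 'clicks', location + ' ¦ ' + text, link.href]);
        }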

    Read the article
