Search Results

Search found 13761 results on 551 pages for 'sma strategy meets action'.


  • Grails Navigation with Dynamic Values

    - by TripWired
    I'm using the Grails navigation plugin, and one of the things I want to do is have a menu item show a dynamic number in its title. For example: static navigation = [ group:'tabs', order:7, title:'Mail', action:'index', subItems: [ [action:'pending', order:1, title:'Pending'], [action:'active', order:10, title:'Active'], [action:'completed', order:11, title:'Completed'] ] ] I want to figure out a way to make the title show Mail(3) or Mail(2) based on the number of messages a user has. I know this will have to be set in a method or by a service of some sort; I'm just not sure how to get it to update.

    Read the article

  • Is this a valid pattern for raising events in C#?

    - by Will Vousden
    Update: For the benefit of anyone reading this, since .NET 4, the lock is unnecessary due to changes in the synchronization of auto-generated events, so I just use this now: public static void Raise<T>(this EventHandler<T> handler, object sender, T e) where T : EventArgs { if (handler != null) { handler(sender, e); } } And to raise it: SomeEvent.Raise(this, new FooEventArgs()); Having been reading one of Jon Skeet's articles on multithreading, I've tried to encapsulate the approach he advocates for raising an event in an extension method like so (with a similar generic version): public static void Raise(this EventHandler handler, object @lock, object sender, EventArgs e) { EventHandler handlerCopy; lock (@lock) { handlerCopy = handler; } if (handlerCopy != null) { handlerCopy(sender, e); } } This can then be called like so: protected virtual void OnSomeEvent(EventArgs e) { this.someEvent.Raise(this.eventLock, this, e); } Are there any problems with doing this? Also, I'm a little confused about the necessity of the lock in the first place. As I understand it, the delegate is copied in the example in the article to avoid the possibility of it changing (and becoming null) between the null check and the delegate call. However, I was under the impression that access/assignment of this kind is atomic, so why is the lock necessary? Update: With regard to Mark Simpson's comment below, I threw together a test: static class Program { private static Action foo; private static Action bar; private static Action test; static void Main(string[] args) { foo = () => Console.WriteLine("Foo"); bar = () => Console.WriteLine("Bar"); test += foo; test += bar; test.Test(); Console.ReadKey(true); } public static void Test(this Action action) { action(); test -= foo; Console.WriteLine(); action(); } } This outputs: Foo Bar Foo Bar This illustrates that the delegate parameter to the method (action) does not mirror the argument that was passed into it (test), which is kind of expected, I guess. My question is: will this affect the validity of the lock in the context of my Raise extension method? Update: Here is the code I'm now using. It's not quite as elegant as I'd have liked, but it seems to work: public static void Raise<T>(this object sender, ref EventHandler<T> handler, object eventLock, T e) where T : EventArgs { EventHandler<T> copy; lock (eventLock) { copy = handler; } if (copy != null) { copy(sender, e); } }
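    For readers who want to see the pattern end to end, here is a minimal, self-contained C# sketch of how the post-.NET 4 version of the extension method might be wired into a publishing class. The Publisher and ThresholdReachedEventArgs names are made up for illustration and do not come from the original question.

    using System;

    // Hypothetical event-args type used only for this illustration.
    public class ThresholdReachedEventArgs : EventArgs
    {
        public int Value { get; set; }
    }

    public static class EventExtensions
    {
        // Same idea as the questioner's updated Raise<T>: the handler parameter is a
        // copy of the delegate field, so a subscriber removed on another thread cannot
        // cause a NullReferenceException between the null check and the invocation.
        public static void Raise<T>(this EventHandler<T> handler, object sender, T e)
            where T : EventArgs
        {
            if (handler != null)
            {
                handler(sender, e);
            }
        }
    }

    public class Publisher
    {
        public event EventHandler<ThresholdReachedEventArgs> ThresholdReached;

        public void Check(int value)
        {
            if (value > 10)
            {
                // Reading the event field copies the current invocation list into the parameter.
                ThresholdReached.Raise(this, new ThresholdReachedEventArgs { Value = value });
            }
        }
    }

    public static class Demo
    {
        public static void Main()
        {
            var p = new Publisher();
            p.ThresholdReached += (s, e) => Console.WriteLine("Threshold reached: " + e.Value);
            p.Check(42); // prints "Threshold reached: 42"
        }
    }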

    Read the article

  • Database Table Schema and Aggregate Roots

    - by bretddog
    Hi, the application is single-user, 1-tier (1 PC), and the database is SqlCE. The DataService layer will be (I think): a repository returning domain objects and querying the database with LinqToSql (dbml). There are obviously a lot more columns; this is a simplified view. http://img573.imageshack.us/img573/3612/ss20110115171817w.png This is my first attempt at creating a multi-table database. I think the table schema makes sense, but I need some reassurance or criticism, because the table relations look quite scary, to be honest. I'm hoping you could look at the table schema and respond if there are clear signs of trouble or errors that you spot right away, and, if you have time, look at the Program Summary/Questions and see if the table layout makes sense for those points. Please be brutal, I will try to defend :) Program summary: a) A set of categories, each having a set of strategies (1:m) b) Each day a number of items will be produced, and each strategy MAY reference it. (So there can be 50 items, and a strategy may reference 23 of them) c) An item can be referenced by more than one strategy, so I think it's an m:m relation. d) Status values will be logged at fixed time-fractions through the day, for: .... each Strategy, each StrategyItem, each Item e) An action on an item may be executed by a strategy that references it. This is logged as ItemAction (could have called it StrategyItemAction). User Requests: b) - e) describe the main activity mode of the program: to work with only today's DayLog, for each category. The 2nd-priority activity is retrieval of history, which typically will be: from all categories, from day x to day y, get all StrategyDailyLog. Questions: First, does the overall layout look sound? I'm worried to see that there are so many relationships in all directions, connecting everything. Is this normal, or does it look like trouble? StrategyItem is made to represent an m:m relationship. Is it correct as I noted 1:m / 1:1 (marked red)? StrategyItemTimeLog and ItemTimeLog log values that both need to be retrieved together when retrieving a StrategyItem. The reason I separated them is that the first one is strategy-specific, and several strategies can reference the same item, so I thought not to duplicate those values that do not depend on the strategy but only on the item. Hence I also dragged out the LogTime, as it seems to be the only parameter to unite the logs. But this all looks quite disturbing with those 3 tables. Does it make sense at all? Or do you have a suggestion? Pink circles show my vague attempt at Aggregate Root Paths. I've been thinking in terms of "what entity is responsible for delete", though I'm unsure about the actual root. I think it's Category. Does it make sense in relation to the User Requests described above?

    Read the article

  • routing in asp.net mvc.

    - by andrew Sullivan
    I have a route: routes.MapRoute("BuildingProject", "BuildingProject/{action}/{id}", new { controller = "Home", action = "Index", id = "" }); I want it to behave like the default route, i.e. handle any URL that starts with BuildingProject, such as http://localhost:4030/BuildingProject/DeleteAll. I tried routes.MapRoute("BuildingProject", "BuildingProject/{action}/{id}", new { controller = "Home", action = "", id = "" }); It worked, but on typing localhost:4030/BuildingProject it does not fall back to its Index action; it shows an error instead. How do I do this?
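    One way this is commonly handled (sketched below as an assumption, not taken from the original post) is to keep action = "Index" as the default in the custom route and register it before the standard Default route, so that /BuildingProject alone falls back to Home/Index while /BuildingProject/DeleteAll still resolves to the DeleteAll action:

    using System.Web.Mvc;
    using System.Web.Routing;

    public static class RouteConfig
    {
        public static void RegisterRoutes(RouteCollection routes)
        {
            routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

            // Custom route first: /BuildingProject, /BuildingProject/DeleteAll and
            // /BuildingProject/Edit/5 all go to HomeController, defaulting to Index.
            routes.MapRoute(
                "BuildingProject",
                "BuildingProject/{action}/{id}",
                new { controller = "Home", action = "Index", id = UrlParameter.Optional });

            // Standard default route for everything else.
            routes.MapRoute(
                "Default",
                "{controller}/{action}/{id}",
                new { controller = "Home", action = "Index", id = UrlParameter.Optional });
        }
    }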

    Read the article

  • zend-framework navigation

    - by ulduz114
    I have this XML file for creating a navigation container. If I want to create a database to store these items and build the container from the database instead, how should I do it? <?xml version="1.0" encoding="utf-8"?> <config> <nav> <logout> <label>logout</label> <controller>authentication</controller> <action>logout</action> <resource>logout</resource> </logout> <login> <label>login</label> <controller>authentication</controller> <action>login</action> <resource>login</resource> </login> <test> <label>test</label> <uri>#</uri> <resource>test</resource> <pages> <list> <label>list</label> <controller>tset</controller> <action>listtest</action> </list> <archive> <label>archive</label> <controller>myarchive</controller> <action>archive</action> </archive> </pages> </test> </nav> </config> And this code in my Bootstrap: $navContainerConfig = new Zend_Config_Xml(APPLICATION_PATH . 'navigation.xml', 'nav'); $navContainer = new Zend_Navigation($navContainerConfig);

    Read the article

  • How Do I get City State Zip in MVC 3 URL Route without writing a controller for every state and actions for each city

    - by OpTech Marketing
    I need the URLs in my boss's application to look like http://domain.com/Texas/Dallas/72701, but I don't want to write a controller for every state and an action for every city. This is what I have so far: routes.MapRoute( "DrillDown", // Route name "{controller}/{action}/{ZipId}", // URL with parameters new { controller = "State", action = "City", ZipId = UrlParameter.Optional } // Parameter defaults ); Can someone help me write a route pattern that accepts State/City/Zip without destroying my ability to have regular controller/Action/Param routes? I've looked all over and can't find any direction.
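    As a sketch of one possible approach (my own assumption, not from the original post): a route with no literal segments such as "{state}/{city}/{zip}" will swallow normal controller/action URLs, so it usually needs a constraint (for example, a five-digit ZIP) and must be registered before the default route. The Location controller and Details action names below are hypothetical.

    using System.Web.Mvc;
    using System.Web.Routing;

    public static class RouteConfig
    {
        public static void RegisterRoutes(RouteCollection routes)
        {
            // Only matches when the last segment looks like a 5-digit ZIP code,
            // e.g. /Texas/Dallas/72701, so /Home/Index/3 still falls through
            // to the default route below.
            routes.MapRoute(
                "DrillDown",
                "{state}/{city}/{zip}",
                new { controller = "Location", action = "Details" },   // hypothetical defaults
                new { zip = @"\d{5}" });                                // route constraint

            routes.MapRoute(
                "Default",
                "{controller}/{action}/{id}",
                new { controller = "Home", action = "Index", id = UrlParameter.Optional });
        }
    }

    // One controller action handles every state and city.
    public class LocationController : Controller
    {
        public ActionResult Details(string state, string city, string zip)
        {
            ViewBag.Message = state + " / " + city + " / " + zip;
            return View();
        }
    }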

    Read the article

  • Is there a simpler way to redirect using a route while adding parameters in Kohana?

    - by Darryl Hein
    I find myself doing the following or similar quite often: Request::instance()->redirect(Route::get('route')->uri(array('action' => 'action'))); Or: Request::instance()->redirect(Route::get(Route::name(Request::instance()->route))->uri(array('action' => 'action'))); I'm wondering if there's any short, easier, simpler way of doing this. I love the Route functionality, but it makes for some long lines of PHP.

    Read the article

  • Find the right value in recursive array

    - by fire
    I have an array with multiple sub-arrays like this: Array ( [0] => Array ( [Page_ID] => 1 [Page_Parent_ID] => 0 [Page_Title] => Overview [Page_URL] => overview [Page_Type] => content [Page_Order] => 1 ) [1] => Array ( [0] => Array ( [Page_ID] => 2 [Page_Parent_ID] => 1 [Page_Title] => Team [Page_URL] => overview/team [Page_Type] => content [Page_Order] => 1 ) ) [2] => Array ( [Page_ID] => 3 [Page_Parent_ID] => 0 [Page_Title] => Funds [Page_URL] => funds [Page_Type] => content [Page_Order] => 2 ) [3] => Array ( [0] => Array ( [Page_ID] => 4 [Page_Parent_ID] => 3 [Page_Title] => Strategy [Page_URL] => funds/strategy [Page_Type] => content [Page_Order] => 1 ) [1] => Array ( [0] => Array ( [Page_ID] => 7 [Page_Parent_ID] => 4 [Page_Title] => A Class Fund [Page_URL] => funds/strategy/a-class-fund [Page_Type] => content [Page_Order] => 1 ) [1] => Array ( [0] => Array ( [Page_ID] => 10 [Page_Parent_ID] => 7 [Page_Title] => Information [Page_URL] => funds/strategy/a-class-fund/information [Page_Type] => content [Page_Order] => 1 ) [1] => Array ( [Page_ID] => 11 [Page_Parent_ID] => 7 [Page_Title] => Fund Data [Page_URL] => funds/strategy/a-class-fund/fund-data [Page_Type] => content [Page_Order] => 2 ) ) [2] => Array ( [Page_ID] => 8 [Page_Parent_ID] => 4 [Page_Title] => B Class Fund [Page_URL] => funds/strategy/b-class-fund [Page_Type] => content [Page_Order] => 2 ) I need a function to find the right Page_URL so if you know the $url is "funds/strategy/a-class-fund" I need to pass that to a function that returns a single array result (which would be the Page_ID = 7 array in this example). Having a bit of a stupid day, any help would be appreciated!

    Read the article

  • mysql circular dependency in foreign key constraints

    - by Flavius
    Given the schema: What I need is to have every user_identities.belongs_to reference a users.id. At the same time, every user has a primary_identity, as shown in the picture. However, when I try to add this reference with ON DELETE NO ACTION ON UPDATE NO ACTION, MySQL says #1452 - Cannot add or update a child row: a foreign key constraint fails (yap.#sql-a3b_1bf, CONSTRAINT #sql-a3b_1bf_ibfk_1 FOREIGN KEY (belongs_to) REFERENCES users (id) ON DELETE NO ACTION ON UPDATE NO ACTION) I suspect this is due to the circular dependency, but how could I solve it (and maintain referential integrity)?

    Read the article

  • What will be the OOP approach? (or YOUR approach?)

    - by hsmit
    I'm having difficulties with a general OOP & Java design question. There are various ways to let classes/objects communicate with each other. To give a simple example: I need object A to perform action X, and object A needs P, Q and R to perform this action. Should object A retrieve P, Q and R by itself (within action X), or should these values be passed in as parameters of action X?
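    The trade-off is usually described as the object locating its own collaborators versus having them supplied from outside (dependency injection). Below is a small C# sketch of the two styles, using made-up names (Registry, SelfSufficientA, ParameterizedA) purely for illustration; the question itself is about Java, but the idea is identical.

    using System;

    public class P { public override string ToString() { return "P"; } }
    public class Q { public override string ToString() { return "Q"; } }
    public class R { public override string ToString() { return "R"; } }

    // Hypothetical global lookup used by style 1.
    public static class Registry
    {
        public static P LookupP() { return new P(); }
        public static Q LookupQ() { return new Q(); }
        public static R LookupR() { return new R(); }
    }

    // Style 1: A fetches its own collaborators inside action X.
    // Simple to call, but A is coupled to Registry and hard to test in isolation.
    public class SelfSufficientA
    {
        public void X()
        {
            Console.WriteLine("X with {0}, {1}, {2}", Registry.LookupP(), Registry.LookupQ(), Registry.LookupR());
        }
    }

    // Style 2: the collaborators are parameters of X (dependency injection).
    // The caller decides where P, Q and R come from, so A stays independent and testable.
    public class ParameterizedA
    {
        public void X(P p, Q q, R r)
        {
            Console.WriteLine("X with {0}, {1}, {2}", p, q, r);
        }
    }

    public static class Demo
    {
        public static void Main()
        {
            new SelfSufficientA().X();
            new ParameterizedA().X(new P(), new Q(), new R());
        }
    }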

    Read the article

  • Comparing Design Patterns

    - by Lijo
    Hi, I am learning design patterns using C#. One of the challenges I am facing is that many of them look similar. Could you please help me distinguish the following pairs – basically, when to use one and why not the other? Bridge and Strategy; State and Strategy; Façade and Strategy; Composite and Strategy. I understand that there are lots of resources available on the web, but they do not address this specific question. [Note: I am looking for implementation examples and the rationale behind the selection, not mere explanations.] It would be great if you could take examples from any of the following: 1) E-Commerce 2) Payroll system 3) Banking 4) Retailing. Thanks for your understanding. Thanks, Lijo
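    To give one concrete data point of the kind the question asks for (my own illustrative example, not an authoritative catalogue): Strategy is typically chosen when the caller picks an interchangeable algorithm up front, whereas State switches behaviour internally as the object's own state changes. A minimal e-commerce-flavoured C# sketch of Strategy, with invented shipping-cost classes:

    using System;

    // Strategy: the shipping cost algorithm is chosen by the caller and can be
    // swapped without changing the checkout code. (A State implementation looks
    // structurally similar, but the object replaces its current state itself as
    // a result of its own transitions, rather than the caller choosing one.)
    public interface IShippingStrategy
    {
        decimal Calculate(decimal orderTotal);
    }

    public class FlatRateShipping : IShippingStrategy
    {
        public decimal Calculate(decimal orderTotal) { return 4.99m; }
    }

    public class FreeShippingOverFifty : IShippingStrategy
    {
        public decimal Calculate(decimal orderTotal) { return orderTotal >= 50m ? 0m : 4.99m; }
    }

    public class Checkout
    {
        private readonly IShippingStrategy _shipping;

        public Checkout(IShippingStrategy shipping) { _shipping = shipping; }

        public decimal Total(decimal orderTotal)
        {
            return orderTotal + _shipping.Calculate(orderTotal);
        }
    }

    public static class Demo
    {
        public static void Main()
        {
            Console.WriteLine(new Checkout(new FlatRateShipping()).Total(60m));        // 64.99
            Console.WriteLine(new Checkout(new FreeShippingOverFifty()).Total(60m));   // 60.00
        }
    }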

    Read the article

  • MapsActivity not being found

    - by Johnny Rottenweed
    I am trying to get a simple map displayed. This is what I have: package com.chance.squat; import com.chance.squat.R; import com.google.android.maps.MapActivity; import android.os.Bundle; public class Maps extends MapActivity { /** Called when the activity is first created. */ @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.maps); } @Override protected boolean isRouteDisplayed() { return false; } } <?xml version="1.0" encoding="utf-8"?> <com.google.android.maps.MapView xmlns:android="http://schemas.android.com/apk/res/android" android:id="@+id/mapview" android:layout_width="fill_parent" android:layout_height="fill_parent" android:clickable="true" android:apiKey="A2:D9:A5:1C:21:6F:D7:44:47:23:31:EC:1A:98:EF:36" /> <?xml version="1.0" encoding="utf-8"?> <manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.chance.squat" android:versionCode="1" android:versionName="1.0"> <application android:icon="@drawable/icon" android:label="@string/app_name" android:theme="@style/CustomTheme"> <uses-library android:name="com.google.android.maps"/> <activity android:name=".MyApp" android:label="@string/app_name"> <intent-filter> <action android:name="android.intent.action.MAIN" /> <category android:name="android.intent.category.LAUNCHER" /> </intent-filter> </activity> <activity android:name="com.chance.squat.Search" android:label="@string/app_name"> <intent-filter> <action android:name="android.intent.action.MAIN" /> </intent-filter> </activity> <activity android:name="com.chance.squat.Add" android:label="@string/app_name"> <intent-filter> <action android:name="android.intent.action.MAIN" /> </intent-filter> </activity> <activity android:name="com.chance.squat.About" android:label="@string/app_name"> <intent-filter> <action android:name="android.intent.action.MAIN" /> </intent-filter> </activity> </application> <uses-permission android:name="android.permission.INTERNET"/> </manifest> I also have downloaded the Google APIs for version 8 and have set to build against them. My problem is it doesn't seem to find import com.google.android.maps.MapActivity and I don't know why or what the next step is. Can anyone help?

    Read the article

  • Rhythmbox plugin: hot key not working

    - by Bunny Rabbit
    def activate(self, shell):
        self.shell = shell
        self.action = gtk.Action('foo', 'bar', 'baz', None)
        self.activate_id = self.action.connect('activate', self.call_bk_fn, self.shell)
        self.action_group = gtk.ActionGroup('hot_key_action_group')
        self.action_group.add_action_with_accel(self.action, "<control>E")
        uim = shell.get_ui_manager()
        uim.insert_action_group(self.action_group, 0)
        uim.ensure_update()

    def call_bk_fn(self, shell):
        print('hello world')

    I am using the above code in a plugin for Rhythmbox. I am trying to register the key combination Ctrl+E so that call_bk_fn gets called whenever it is pressed, but it's not working. Why is that?

    Read the article

  • Is it possible with AJAX to send a JSON array (array of JSON objects) and also include a separate parameter to receive in an MVC action method?

    - by david2342
    Is it possible with AJAX to send a JSON array (array of JSON objects) and also include a separate parameter to receive in an MVC action method? var n = { number: 1 }; $.ajax({ type: "POST", url: url, contentType: "application/json; charset=utf-8", dataType: "html", data: JSON.stringify({ jsonObjects: json, number: n }), success: function (response) { $('#body').html(response); } }); public ActionResult Create(List jsonObjects, int? number) jsonObjects is coming in as it is supposed to, but number is null.
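    A common way this gets resolved (sketched here as an assumption about the poster's setup, with a made-up OrderItem element type standing in for the generic parameter that was lost from List in the question) is to send the extra value as a bare number rather than an object, and let the default model binder populate both parameters, or bind the whole payload to a small view model:

    using System.Collections.Generic;
    using System.Web.Mvc;

    // Hypothetical element type; the original question's List<> type argument was stripped.
    public class OrderItem
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class OrdersController : Controller
    {
        // Option 1: two separate parameters. The JSON body would need to look like
        // { "jsonObjects": [...], "number": 1 } - i.e. number is a bare value,
        // not the { number: 1 } wrapper object used in the question.
        [HttpPost]
        public ActionResult Create(List<OrderItem> jsonObjects, int? number)
        {
            int count = jsonObjects == null ? 0 : jsonObjects.Count;
            return Content("items: " + count + ", number: " + number);
        }

        // Option 2: one view model that mirrors the whole JSON payload.
        [HttpPost]
        public ActionResult CreateFromModel(CreateOrderRequest request)
        {
            int count = request.JsonObjects == null ? 0 : request.JsonObjects.Count;
            return Content("items: " + count + ", number: " + request.Number);
        }
    }

    public class CreateOrderRequest
    {
        public List<OrderItem> JsonObjects { get; set; }
        public int? Number { get; set; }
    }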

    Read the article

  • Struts2 linking actions

    - by SonOfTheEARTh
    I am working on the login module of my Struts2 app. I have created the login page and a home page (which is rendered by Login.action). Now I want to add another feature, forgot password, which after performing its business logic must call Login.action so that the user does not have to log in explicitly. What should I code, and where, so that as soon as ForgotPassword.action finishes its work it passes control to Login.action?

    Read the article

  • 26 Days: Countdown to Oracle OpenWorld 2012

    - by Michael Snow
    Welcome to our countdown to Oracle OpenWorld! Oracle OpenWorld 2012 is just around the corner. In less than 26 days, San Francisco will be invaded by an expected 50,000 people from all over the world. Here on the Oracle WebCenter team, we’ve all been working to help make the experience a great one for all our WebCenter customers. For a sneak peek – we’ll be spending this week giving you a teaser of what to look forward to if you are joining us in San Francisco from September 30th through October 4th. We have Oracle WebCenter sessions covering all topics imaginable. Take a look and use the tools we provide to build out your schedule in advance and reserve your seats in your favorite sessions. That gives you plenty of time to plan for your week with us in San Francisco. If, unfortunately, your boss denied your request to attend, there are still some ways that you can join in the experience virtually On-Demand. This year we are expanding even more up north of Market Street and will be taking over Union Square as well. Check out this map of San Francisco to get a sense of how much of a footprint Oracle OpenWorld has grown to this year. With so much to see and so many sessions to learn from, it’s no wonder that people get excited. Add to that a good mix of fun and all of the possible WebCenter sessions you could attend, and you won’t want to sleep at all to take full advantage of such an opportunity. We’ll also have our annual WebCenter Customer Appreciation reception - stay tuned this week for some more info on registration to make sure you’ll be able to join us. If you’ve been following the America’s Cup at all and believe in EXTREME PERFORMANCE, you’ll definitely want to take a look at this video from last year’s OpenWorld Keynote.
Important OpenWorld Links: Attendee / Presenters Toolkit; Oracle Schedule Builder; WebCenter Sessions (listed in the catalog under Fusion Middleware as "Portals, Sites, Content, and Collaboration"); Oracle Music Festival - AMAZING Line up!!; Oracle Customer Appreciation Night - LOOK HERE!!; Oracle OpenWorld LIVE On-Demand.
Here are all the WebCenter sessions broken down by day for your viewing pleasure. Monday, October 1st CON8885 - Simplify CRM Engagement with Contextual Collaboration Are your sales teams disconnected and disengaged? Do you want a tool for easily connecting expertise across your organization and providing visibility into the complete sales process? Do you want a way to enhance and retain organization knowledge? Oracle Social Network is the answer. Attend this session to learn how to make CRM easy, effective, and efficient for use across virtual sales teams. 
Also learn how Oracle Social Network can drive sales force collaboration with natural conversations throughout the sales cycle, promote sales team productivity through purposeful social networking without the noise, and build cross-team knowledge by integrating conversations with CRM and other business applications. CON8268 - Oracle WebCenter Strategy: Engaging Your Customers. Empowering Your Business Oracle WebCenter is a user engagement platform for social business, connecting people and information. Attend this session to learn about the Oracle WebCenter strategy, and understand where Oracle is taking the platform to help companies engage customers, empower employees, and enable partners. Business success starts with ensuring that everyone is engaged with the right people and the right information and can access what they need through the channel of their choice—Web, mobile, or social. Are you giving customers, employees, and partners the best-possible experience? Come learn how you can! ¶ HOL10208 - Add Social Capabilities to Your Enterprise Applications Oracle Social Network enables you to add real-time collaboration capabilities into your enterprise applications, so that conversations can happen directly within your business systems. In this hands-on lab, you will try out the Oracle Social Network product to collaborate with other attendees, using real-time conversations with document sharing capabilities. Next you will embed social capabilities into a sample Web-based enterprise application, using embedded UI components. Experts will also write simple REST-based integrations, using the Oracle Social Network API to programmatically create social interactions. ¶ CON8893 - Improve Employee Productivity with Intuitive and Social Work Environments Social technologies have already transformed the ways customers, employees, partners, and suppliers communicate and stay informed. Forward-thinking organizations today need technologies and infrastructures to help them advance to the next level and integrate social activities with business applications to deliver a user experience that simplifies business processes and enterprise application engagement. Attend this session to hear from an innovative Oracle Social Network customer and learn how you can improve productivity with intuitive and social work environments and empower your employees with innovative social tools to enable contextual access to content and dynamic personalization of solutions. ¶ CON8270 - Oracle WebCenter Content Strategy and Vision Oracle WebCenter provides a strategic content infrastructure for managing documents, images, e-mails, and rich media files. With a single repository, organizations can address any content use case, such as accounts payable, HR onboarding, document management, compliance, records management, digital asset management, or Website management. In this session, learn about future plans for how Oracle WebCenter will address new use cases as well as new integrations with Oracle Fusion Middleware and Oracle Applications, leveraging your investments by making your users more productive and error-free. ¶ CON8269 - Oracle WebCenter Sites Strategy and Vision Oracle’s Web experience management solution, Oracle WebCenter Sites, enables organizations to use the online channel to drive customer acquisition and brand loyalty. It helps marketers and business users easily create and manage contextually relevant, social, interactive online experiences across multiple channels on a global scale. 
In this session, learn about future plans for how Oracle WebCenter Sites will provide you with the tools, capabilities, and integrations you need in order to continue to address your customers’ evolving requirements for engaging online experiences and keep moving your business forward. ¶ CON8896 - Living with SharePoint SharePoint is a popular platform, but it’s not always the best fit for Oracle customers. In this session, you’ll discover the technical and nontechnical limitations and pitfalls of SharePoint and learn about Oracle alternatives for collaboration, portals, enterprise and Web content management, social computing, and application integration. The presentation shows you how to integrate with SharePoint when business or IT requirements dictate and covers cloud-based (Office 365) and on-premises versions of SharePoint. Presented by a former Microsoft director of SharePoint product management and backed by independent customer research, this session will prepare you to answer the question “Why don’t we just use SharePoint for that?’ the next time it comes up in your organization. ¶ CON7843 - Content-Enabling Enterprise Processes with Oracle WebCenter Organizations today continually strive to automate business processes, reduce costs, and improve efficiency. Many business processes are content-intensive and unstructured, requiring ad hoc collaboration, and distributed in nature, requiring many approvals and generating huge volumes of paper. In this session, learn how Oracle and SYSTIME have partnered to help a customer content-enable its enterprise with Oracle WebCenter Content and Oracle WebCenter Imaging 11g and integrate them with Oracle Applications. ¶ CON6114 - Tape Robotics’ Newest Superhero: Now Fueled by Oracle Software For small, midsize, and rapidly growing businesses that want the most energy-efficient, scalable storage infrastructure to meet their rapidly growing data demands, Oracle’s most recent addition to its award-winning tape portfolio leverages several pieces of Oracle software. With Oracle Linux, Oracle WebLogic, and Oracle Fusion Middleware tools, the library achieves a higher level of usability than previous products while offering customers a familiar interface for management, plus ease of use. This session examines the competitive advantages of the tape library and how Oracle software raises customer satisfaction. Learn how the combination of Oracle engineered systems, Oracle Secure Backup, and Oracle’s StorageTek tape libraries provide end-to-end coverage of your data. ¶ CON9437 - Mobile Access Management With more than five billion mobile devices on the planet and an increasing number of users using their own devices to access corporate data and applications, securely extending identity management to mobile devices has become a hot topic. This session focuses on how to extend your existing identity management infrastructure and policies to securely and seamlessly enable mobile user access. CON7815 - Customer Experience Online in Cloud: Oracle WebCenter Sites, Oracle ATG Apps, Oracle Exalogic Oracle WebCenter Sites and Oracle’s ATG product line together can provide a compelling marketing and e-commerce experience. When you couple them with the extreme performance of Oracle Exalogic, you’ll see unmatched scalability that provides you with a true cloud-based solution. In this session, you’ll learn how running Oracle WebCenter Sites and ATG applications on Oracle Exalogic delivers both a private and a public cloud experience. 
Find out what it takes to get these systems working together and delivering engaging Web experiences. Even if you aren’t considering Oracle Exalogic today, the rich Web experience of Oracle WebCenter, paired with the depth of the ATG product line, can provide your business full support, from merchandising through sale completion. ¶ CON8271 - Oracle WebCenter Portal Strategy and Vision To innovate and keep a competitive edge, organizations need to leverage the power of agile and responsive Web applications. Oracle WebCenter Portal enables you to do just that, by delivering intuitive user experiences for enterprise applications to drive innovation with composite applications and mashups. Attend this session to learn firsthand from customers how Oracle WebCenter Portal extends the value of existing enterprise applications, business processes, and content; delivers a superior business user experience; and maximizes limited IT resources. ¶ CON8880 - The Connected Customer Experience Begins with the Online Channel There’s a lot of talk these days about how to connect the customer journey across various touchpoints—from Websites and e-commerce to call centers and in-store—to provide experiences that are more relevant and engaging and ultimately gain competitive edge. Doing it all at once isn’t a realistic objective, so where do you start? Come to this session, and hear about three steps you can take that can help you begin your journey toward delivering the connected customer experience. You’ll hear how Oracle now has an integrated digital marketing platform for your corporate Website, your e-commerce site, your self-service portal, and your marketing and loyalty campaigns, and you’ll learn what you can do today to begin executing on your customer experience initiatives. ¶ GEN11451 - General Session: Building Mobile Applications with Oracle Cloud With the prevalence of smart mobile devices, companies are facing an increased demand to provide access to data and applications from new channels. However, developing applications for mobile devices poses some unique challenges. Come to this session to learn how Oracle addresses these challenges, offering a simpler way to develop and deploy cross-device mobile applications. See how Oracle Cloud enables you to access applications, data, and services from mobile channels in an easier way.  CON8272 - Oracle Social Network Strategy and Vision One key way of increasing employee productivity is by bringing people, processes, and information together—providing new social capabilities to enable business users to quickly correspond and collaborate on business activities. Oracle WebCenter provides a user engagement platform with social and collaborative technologies to empower business users to focus on their key business processes, applications, and content in the context of their role and process. Attend this session to hear how the latest social capabilities in Oracle Social Network are enabling organizations to transform themselves into social businesses.  --- Tuesday, October 2nd HOL10194 - Enterprise Content Management Simplified: Oracle WebCenter Content’s Next-Generation UI Regardless of the nature of your business, unstructured content underpins many of its daily functions. Whether you are working with traditional presentations, spreadsheets, or text documents—or even with digital assets such as images and multimedia files—your content needs to be accessible and manageable in convenient and intuitive ways to make working with the content easier. 
Additionally, you need the ability to easily share documents with coworkers to facilitate a collaborative working environment. Come to this session to see how Oracle WebCenter Content’s next-generation user interface helps modern knowledge workers easily manage personal and enterprise documents in a collaborative environment.¶ CON8877 - Develop a Mobile Strategy with Oracle WebCenter: Engage Customers, Employees, and Partners Mobile technology has gone from nice-to-have to a cornerstone of user engagement. Mobile access enables users to have information available at their fingertips, enabling them to take action the moment they make a decision, interact in the moment of convenience, and take advantage of new service offerings in their preferred channels. All your employees have your mobile applications in their pocket; now what are you going to do? It is a critical step for companies to think through what their employees, customers, and partners really need on their devices. Attend this session to see how Oracle WebCenter enables you to better engage your customers, employees, and partners by providing a unified experience across multiple channels. ¶ CON9447 - Enabling Access for Hundreds of Millions of Users How do you grow your business by identifying, authenticating, authorizing, and federating users on the Web, leveraging social identity and the open source OAuth protocol? How do you scale your access management solution to support hundreds of millions of users? With social identity support out of the box, Oracle’s access management solution is also benchmarked for 250-million-user deployment according to real-world customer scenarios. In this session, you will learn about the social identity capability and the 250-million-user benchmark testing of Oracle Access Manager and Oracle Adaptive Access Manager running on Oracle Exalogic and Oracle Exadata. ¶ HOL10207 - Build an Intranet Portal with Oracle WebCenter In this hands-on lab, you’ll work with Oracle WebCenter Portal and Oracle WebCenter Content to build out an enterprise portal that maximizes the productivity of teams and individual contributors. Using browser-based tools, you’ll manage site resources such as page styles, templates, and navigation. You’ll edit content stored in Oracle WebCenter Content directly from your portal. You’ll also experience the latest features that promote collaboration, social networking, and personal productivity. ¶ CON2906 - Get Proactive: Best Practices for Maintaining Oracle Fusion Middleware You chose Oracle Fusion Middleware products to help your organization deliver superior business results. Now learn how to take full advantage of your software with all the great tools, resources, and product updates you’re entitled to through Oracle Support. In this session, Oracle product experts provide proven best practices to help you work more efficiently, plan and prepare for upgrades and patching more effectively, and manage risk. Topics include configuration management tools, remote diagnostics, My Oracle Support Community, and My Oracle Support Lifecycle Advisors. New users and Oracle Fusion Middleware experts alike are guaranteed to leave with fresh ideas and practical, easy-to-implement next steps. ¶ CON8878 - Oracle WebCenter’s Cloud Strategy: From Social and Platform Services to Mashups Cloud computing represents a paradigm shift in how we build applications, automate processes, collaborate, and share and in how we secure our enterprise. 
Additionally, as you adopt cloud-based services in your organization, it’s likely that you will still have many critical on-premises applications running. With these mixed environments, multiple user interfaces, different security, and multiple datasources and content sources, how do you start evolving your strategy to account for these challenges? Oracle WebCenter offers a complete array of technologies enabling you to solve these challenges and prepare you for the cloud. Attend this session to learn how you can use Oracle WebCenter in the cloud as well as create on-premises and cloud application mash-ups. ¶ CON8901 - Optimize Enterprise Business Processes with Oracle WebCenter and Oracle BPM Do you have business processes that span multiple applications? Are you grappling with how to have visibility across these business processes; how to manage content that is associated with these processes; and, most importantly, how to model and optimize these business processes? Attend this session to hear how Oracle WebCenter and Oracle Business Process Management provide a unique set of integrated solutions to provide a composite application dashboard across these business processes and offer a solution for content-centric business processes. ¶ CON8883 - Deliver Engaging Interfaces to Oracle Applications with Oracle WebCenter Critical business processes live within enterprise applications, and application users need to manage and execute these processes as effectively as possible. Oracle provides a comprehensive user engagement platform to increase user productivity and optimize overall processes within Oracle Applications—Oracle E-Business Suite and Oracle’s Siebel, PeopleSoft, and JD Edwards product families—and third-party applications. Attend this session to learn how you can integrate these applications with Oracle WebCenter to deliver composite application dashboards to your end users—whether they are your customers, partners, or employees—for enhanced usability and Web 2.0–enabled enterprise portals.¶ Wednesday, October 3rd CON8895 - Future-Ready Intranets: How Aramark Re-engineered the Application Landscape There are essential techniques and technologies you can use to deliver employee portals that garner higher productivity, improve business efficiency, and increase user engagement. Attend this session to learn how you can leverage Oracle WebCenter Portal as a user engagement platform for bringing together business process management, enterprise content management, and business intelligence into a highly relevant and integrated experience. Hear how Aramark has leveraged Oracle WebCenter Portal and Oracle WebCenter Content to deliver a unified workspace providing simpler navigation and processing, consolidation of tools, easy access to information, integrated search, and single sign-on. ¶ CON8886 - Content Consolidation: Save Money, Increase Efficiency, and Eliminate Silos Organizations are looking for ways to save money and be more efficient. With content in many different places, it’s difficult to know where to look for a document and whether the document is the most current version. With Oracle WebCenter, content can be consolidated into one best-of-breed repository that is secure, scalable, and integrated with your business processes and applications. Users can find the content they need, where they need it, and ensure that it is the right content. 
This session covers content challenges that affect your business; content consolidation that can lead to savings in storage and administration costs and can lower risks; and how companies are realizing savings. ¶ CON8911 - Improve Online Experiences for Customers and Partners with Self-Service Portals Are you able to provide your customers and partners an easy-to-use online self-service experience? Are you processing high-volume transactions and struggling with call center bottlenecks or back-end systems that won’t integrate, causing order delays and customer frustration? Are you looking to target content such as product and service offerings to your end users? This session shares approaches to providing targeted delivery as well as strategies and best practices for transforming your business by providing an intuitive user experience for your customers and partners. ¶ CON6156 - Top 10 Ways to Integrate Oracle WebCenter Content This session covers 10 common ways to integrate Oracle WebCenter Content with other enterprise applications and middleware. It discusses out-of-the-box modules that provide expanded features in Oracle WebCenter Content—such as enterprise search, SOA, and BPEL—as well as developer tools you can use to create custom integrations. The presentation also gives guidance on which integration option may work best in your environment. ¶ HOL10207 - Build an Intranet Portal with Oracle WebCenter In this hands-on lab, you’ll work with Oracle WebCenter Portal and Oracle WebCenter Content to build out an enterprise portal that maximizes the productivity of teams and individual contributors. Using browser-based tools, you’ll manage site resources such as page styles, templates, and navigation. You’ll edit content stored in Oracle WebCenter Content directly from your portal. You’ll also experience the latest features that promote collaboration, social networking, and personal productivity. ¶ CON7817 - Migration to Oracle WebCenter Imaging 11g Customers today continually strive to automate business processes, reduce costs, and improve efficiency. The accounts payable process—which is often distributed in nature, requires many approvals, and generates huge volumes of paper invoices—is automated by many customers. In this session, learn how Oracle and SYSTIME have partnered to help a customer migrate its existing Oracle Imaging and Process Management Release 7.6 to the latest Oracle WebCenter Imaging 11g and integrate it with Oracle’s JD Edwards family of products. ¶ CON8910 - How to Engage Customers Across Web, Mobile, and Social Channels Whether on desktops at the office, on tablets at home, or on mobile phones when on the go, today’s customers are always connected. To engage today’s customers, you need to make the online customer experience connected and consistent across a host of devices and multiple channels, including Web, mobile, and social networks. Managing this multichannel environment can result in lots of headaches without the right tools. Attend this session to learn how Oracle WebCenter Sites solves the challenge of multichannel customer engagement. ¶ HOL10206 - Oracle WebCenter Sites 11g: Transforming the Content Contributor Experience Oracle WebCenter Sites 11g makes it easy for marketers and business users to contribute to and manage Websites with the new visual, contextual, and intuitive Web authoring interface. In this hands-on lab, you will create and manage content for a sports-themed Website, using many of the new and enhanced features of the 11g release. 
¶ CON8900 - Building Next-Generation Portals: An Interactive Customer Panel Discussion Social and collaborative technologies have changed how people interact, learn, and collaborate, and providing a modern, social Web presence is imperative to remain competitive in today’s market. Can your business benefit from a more collaborative and interactive portal environment for employees, customers, and partners? Attend this session to hear from Oracle WebCenter Portal customers as they share their strategies and best practices for providing users with a modern experience that adapts to their needs and includes personalized access to content in context. The panel also addresses how customers have benefited from creating next-generation portals by migrating from older portal technologies to Oracle WebCenter Portal. ¶ CON9625 - Taking Control of Oracle WebCenter Security Organizations are increasingly looking to extend their Oracle WebCenter portal for social business, to serve external users and provide seamless access to the right information. In particular, many organizations are extending Oracle WebCenter in a business-to-business scenario requiring secure identification and authorization of business partners and their users. This session focuses on how customers are leveraging, securing, and providing access control to Oracle WebCenter portal and mobile solutions. You will learn best practices and hear real-world examples of how to provide flexible and granular access control for Oracle WebCenter deployments, using Oracle Platform Security Services and Oracle Access Management Suite product offerings. ¶ CON8891 - Extending Social into Enterprise Applications and Business Processes Oracle Social Network is an extensible social platform that enables contextual collaboration within enterprise applications and business processes, providing relevant data from across various enterprise systems in one place. Attend this session to see how an Oracle Social Network customer is integrating multiple applications—such as CRM, HCM, and business processes—into Oracle Social Network and Oracle WebCenter to enable individuals and teams to solve complex cross-organizational business problems more effectively by utilizing the social enterprise. ¶ Thursday, October 4th CON8899 - Becoming a Social Business: Stories from the Front Lines of Change What does it really mean to be a social business? How can you change our organization to embrace social approaches? What pitfalls do you need to avoid? In this lively panel discussion, customer and industry thought leaders in social business explore these topics and more as they share their stories of the good, the bad, and the ugly that can happen when embracing social methods and technologies to improve business success. Using moderated questions and open Q&A from the audience, the panel discusses vital topics such as the critical factors for success, the major issues to avoid, how to gain senior executive support for social efforts, how to handle undesired behavior, and how to measure business impact. It takes a thought-provoking look at becoming a social business from the inside. ¶ CON6851 - Oracle WebCenter and Oracle Business Intelligence Enterprise Edition to Create Vendor Portals Large manufacturers of grocery items routinely find themselves depending on the inventory management expertise of their wholesalers and distributors. 
Inventory costs can be managed more efficiently by the manufacturers if they have better insight into the inventory levels of items carried by their distributors. This creates a unique opportunity for distributors and wholesalers to leverage this knowledge into a revenue-generating subscription service. Oracle Business Intelligence Enterprise Edition and Oracle WebCenter Portal play a key part in enabling creation of business-managed business intelligence portals for vendors. This session discusses one customer that implemented this by leveraging Oracle WebCenter and Oracle Business Intelligence Enterprise Edition. ¶ CON8879 - Provide a Personalized and Consistent Customer Experience in Your Websites and Portals Your customers engage with your company online in different ways throughout their journey—from prospecting by acquiring information on your corporate Website to transacting through self-service applications on your customer portal—and then the cycle begins again when they look for new products and services. Ensuring that the customer experience is consistent and personalized across online properties—from branding and content to interactions and transactions—can be a daunting task. Oracle WebCenter enables you to speak and interact with your customers with one voice across your Websites and portals by providing an integrated platform for delivery of self-service and engagement that unifies and personalizes the online experience. Learn more in this session. ¶ CON8898 - Land Mines, Potholes, and Dirt Roads: Navigating the Way to ECM Nirvana Ten years ago, people were predicting that by this time in history, we’d be some kind of utopian paperless society. As we all know, we’re not there yet, but are we getting closer? What is keeping companies from driving down the road to enterprise content management bliss? Most people understand that using ECM as a central platform enables organizations to expedite document-centric processes, but most business processes in organizations are still heavily paper-based. Many of these processes could be automated and improved with an ECM platform infrastructure. In this panel discussion, you’ll hear from Oracle WebCenter customers that have already solved some of these challenges as they share their strategies for success and roads to avoid along your journey. ¶ CON8908 - Oracle WebCenter Portal: Creating and Using Content Presenter Templates Oracle WebCenter Portal applications use task flows to display and integrate content stored in the Oracle WebCenter Content server. Among the most flexible task flows is Content Presenter, which renders various types of content on an Oracle WebCenter Portal page. Although Oracle WebCenter Portal comes with a set of predefined Content Presenter templates, developers can create their own templates for specific rendering needs. This session shows the lifecycle of developing Content Presenter task flows, including how to create, package, import, modify at runtime, and use such templates. In addition to simple examples with Oracle Application Development Framework (Oracle ADF) UI elements to render the content, it shows how to use other UI technologies, CSS files, and JavaScript libraries. 
¶ CON8897 - Using Web Experience Management to Drive Online Marketing Success Every year, the online channel becomes more imperative for driving organizational top-line revenue, but for many companies, mastering how to best market their products and services in a fast-evolving online world with high customer expectations for personalized experiences can be a complex proposition. Come to this panel discussion, and hear directly from online marketers how they are succeeding today by using Web experience management to drive marketing success, using capabilities such as targeting and optimization, user-generated content, mobile site publishing, and site visitor personalization to deliver engaging online experiences. ¶ CON8892 - Oracle’s Journey to Social Business Social business is a revolution, one that is causing rapidly accelerating change in how companies and customers engage with one another and how employees work together. Oracle’s goal in becoming a social business is to create a socially connected organization in which working collaboratively across geographical locations, lines of business, and management chains is second nature, enabling innovative solutions to business challenges. We can achieve this by connecting the right people, finding the right content, communicating with the right people, collaborating at the right time, and building the right communities in the right context—all ready in the CLOUD. Attend this session to see how Oracle is transforming itself into a social business. ¶  ------------ If you've read all the way to the end here - we are REALLY looking forward to seeing you in San Francisco.

    Read the article

  • disk-to-disk backup without costly backup redundancy?

    - by AaronLS
    A good backup strategy involves a combination of 1) disconnected backups/snapshots that will not be affected by bugs, viruses, and/or security breaches 2) geographically distributed backups to protect against local disasters 3) testing backups to ensure that they can be restored as needed Generally I take an onsite backup daily, and an offsite backup weekly, and do test restores periodically. In the rare circumstance that I need to restore files, I do some from the local backup. Should a catastrophic event destroy the servers and local backups, then the offsite weekly tape backup would be used to restore the files. I don't need multiple offsite backups with redundancy. I ALREADY HAVE REDUNDANCY THROUGH THE USE OF BOTH LOCAL AND REMOTE BACKUPS. I have recovery blocks and par files with the backups, so I already have protection against a small percentage of corrupt bits. I perform test restores to ensure the backups function properly. Should the remote backups experience a dataloss, I can replace them with one of the local backups. There are historical offsite backups as well, so if a dataloss was not noticed for a few weeks(such as a bug/security breach/virus), the data could be restored from an older backup. By doing this, the only scenario that poses a risk to complete data loss would be one where both the local, remote, and servers all experienced a data loss in the same time period. I'm willing to risk that happening since the odds of that trifecta negligibly small, and the data isn't THAT valuable to me. So I hope I have emphasized that I don't need redundancy in my offsite backups because I have covered all the bases. I know this exact technique is employed by numerous businesses. Of course there are some that take multiple offsite backups, because the data is so incredibly valuable that they don't even want to risk that trifecta disaster, but in the majority of cases the trifecta disaster is an accepted risk. I HAD TO COVER ALL THIS BECAUSE SOME PEOPLE DON'T READ!!! I think I have justified my backup strategy and the majority of businesses who use offsite tape backups do not have any additional redundancy beyond what is mentioned above(recovery blocks, par files, historical snapshots). Now I would like to eliminate the use of tapes for offsite backups, and instead use a backup service. Most however are extremely costly for $/gb/month storage. I don't mind paying for transfer bandwidth, but the cost of storage is way to high. All of them advertise that they maintain backups of the data, and I imagine they use RAID as well. Obviously if you were using them to host servers this would all be necessary, but for my scenario, I am simply replacing my offsite backups with such a service. So there is no need for RAID, and absolutely no value in another layer of backups of backups. My one and only question: "Are there online data-storage/backup services that do not use redundancy or offer backups(backups of my backups) as part of their packages, and thus are more reasonably priced?" NOT my question: "Is this a flawed strategy?" I don't care if you think this is a good strategy or not. I know it pretty standard. Very few people make an extra copy of their offsite backups. They already have local backups that they can use to replace the remote backups if something catastrophic happens at the remote site. Please limit your responses to the question posed. 
Sorry if I seem a little abrasive, but I had some trolls in my last post who read neither my requirements nor my question and tried to go off answering a totally different question. I made it pretty clear, but didn't try to justify my strategy, because I didn't ask whether my strategy was justifiable. So I apologize that this was lengthy, as it really didn't need to be, but there are so many people here who try to sidetrack questions by responding without addressing the question at hand.

    Read the article

  • How to allow bind in app armor?

    - by WitchCraft
    Question: I did setup bind9 as described here: http://ubuntuforums.org/showthread.php?p=12149576#post12149576 Now I have a little problem with apparmor: If I switch it off, it works. If apparmor runs, it doesn't work, and I get the following dmesg output: [ 23.809767] type=1400 audit(1344097913.519:11): apparmor="STATUS" operation="profile_replace" name="/sbin/dhclient" pid=1540 comm="apparmor_parser" [ 23.811537] type=1400 audit(1344097913.519:12): apparmor="STATUS" operation="profile_replace" name="/usr/lib/NetworkManager/nm-dhcp-client.action" pid=1540 comm="apparmor_parser" [ 23.812514] type=1400 audit(1344097913.523:13): apparmor="STATUS" operation="profile_replace" name="/usr/lib/connman/scripts/dhclient-script" pid=1540 comm="apparmor_parser" [ 23.821999] type=1400 audit(1344097913.531:14): apparmor="STATUS" operation="profile_load" name="/usr/sbin/mysqld" pid=1544 comm="apparmor_parser" [ 23.845085] type=1400 audit(1344097913.555:15): apparmor="STATUS" operation="profile_load" name="/usr/sbin/libvirtd" pid=1543 comm="apparmor_parser" [ 23.849051] type=1400 audit(1344097913.559:16): apparmor="STATUS" operation="profile_load" name="/usr/sbin/named" pid=1545 comm="apparmor_parser" [ 23.849509] type=1400 audit(1344097913.559:17): apparmor="STATUS" operation="profile_load" name="/usr/lib/libvirt/virt-aa-helper" pid=1542 comm="apparmor_parser" [ 23.851597] type=1400 audit(1344097913.559:18): apparmor="STATUS" operation="profile_load" name="/usr/sbin/tcpdump" pid=1547 comm="apparmor_parser" [ 24.415193] type=1400 audit(1344097914.123:19): apparmor="STATUS" operation="profile_replace" name="/usr/sbin/mysqld" pid=1625 comm="apparmor_parser" [ 24.738631] ip_tables: (C) 2000-2006 Netfilter Core Team [ 25.005242] nf_conntrack version 0.5.0 (16384 buckets, 65536 max) [ 25.187939] ADDRCONF(NETDEV_UP): virbr0: link is not ready [ 26.004282] Ebtables v2.0 registered [ 26.068783] ip6_tables: (C) 2000-2006 Netfilter Core Team [ 28.158848] postgres (1900): /proc/1900/oom_adj is deprecated, please use /proc/1900/oom_score_adj instead. [ 29.840079] xenbr0: no IPv6 routers present [ 31.502916] type=1400 audit(1344097919.088:20): apparmor="DENIED" operation="mknod" parent=1984 profile="/usr/sbin/named" name="/var/log/query.log" pid=1989 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 34.336141] xenbr0: port 1(eth0) entering forwarding state [ 38.424359] Event-channel device installed. [ 38.853077] XENBUS: Unable to read cpu state [ 38.854215] XENBUS: Unable to read cpu state [ 38.855231] XENBUS: Unable to read cpu state [ 38.858891] XENBUS: Unable to read cpu state [ 47.411497] device vif1.0 entered promiscuous mode [ 47.429245] ADDRCONF(NETDEV_UP): vif1.0: link is not ready [ 49.366219] virbr0: port 1(vif1.0) entering disabled state [ 49.366705] virbr0: port 1(vif1.0) entering disabled state [ 49.368873] virbr0: mixed no checksumming and other settings. 
[ 97.273028] type=1400 audit(1344097984.861:21): apparmor="DENIED" operation="mknod" parent=3076 profile="/usr/sbin/named" name="/var/log/query.log" pid=3078 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 277.790627] type=1400 audit(1344098165.377:22): apparmor="DENIED" operation="mknod" parent=3384 profile="/usr/sbin/named" name="/var/log/query.log" pid=3389 comm="named" requested_mask="c" denied_mask="c" fsuid=0 ouid=0 [ 287.812986] type=1400 audit(1344098175.401:23): apparmor="DENIED" operation="mknod" parent=3325 profile="/usr/sbin/named" name="/root/tmp-gjnX0c0dDa" pid=3400 comm="named" requested_mask="c" denied_mask="c" fsuid=0 ouid=0 [ 287.818466] type=1400 audit(1344098175.405:24): apparmor="DENIED" operation="mknod" parent=3325 profile="/usr/sbin/named" name="/root/tmp-CpOtH52qU5" pid=3400 comm="named" requested_mask="c" denied_mask="c" fsuid=0 ouid=0 [ 323.166228] type=1400 audit(1344098210.753:25): apparmor="DENIED" operation="mknod" parent=3422 profile="/usr/sbin/named" name="/var/log/query.log" pid=3427 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 386.512586] type=1400 audit(1344098274.101:26): apparmor="DENIED" operation="mknod" parent=3456 profile="/usr/sbin/named" name="/var/log/query.log" pid=3459 comm="named" requested_mask="c" denied_mask="c" fsuid=0 ouid=0 [ 808.549049] type=1400 audit(1344098696.137:27): apparmor="DENIED" operation="mknod" parent=3872 profile="/usr/sbin/named" name="/var/log/query.log" pid=3877 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 894.671081] type=1400 audit(1344098782.257:28): apparmor="DENIED" operation="mknod" parent=3922 profile="/usr/sbin/named" name="/var/log/query.log" pid=3927 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 968.514669] type=1400 audit(1344098856.101:29): apparmor="DENIED" operation="mknod" parent=3978 profile="/usr/sbin/named" name="/var/log/query.log" pid=3983 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1021.814582] type=1400 audit(1344098909.401:30): apparmor="DENIED" operation="mknod" parent=4010 profile="/usr/sbin/named" name="/var/log/query.log" pid=4012 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1063.856633] type=1400 audit(1344098951.445:31): apparmor="DENIED" operation="mknod" parent=4041 profile="/usr/sbin/named" name="/var/log/query.log" pid=4043 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1085.404001] type=1400 audit(1344098972.989:32): apparmor="DENIED" operation="mknod" parent=4072 profile="/usr/sbin/named" name="/var/log/query.log" pid=4077 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1108.207402] type=1400 audit(1344098995.793:33): apparmor="DENIED" operation="mknod" parent=4102 profile="/usr/sbin/named" name="/var/log/query.log" pid=4107 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1156.947189] type=1400 audit(1344099044.533:34): apparmor="DENIED" operation="mknod" parent=4134 profile="/usr/sbin/named" name="/var/log/query.log" pid=4136 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1166.768005] type=1400 audit(1344099054.353:35): apparmor="DENIED" operation="mknod" parent=4150 profile="/usr/sbin/named" name="/var/log/query.log" pid=4155 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1168.873385] type=1400 audit(1344099056.461:36): apparmor="DENIED" operation="mknod" parent=4162 profile="/usr/sbin/named" name="/var/log/query.log" pid=4167 
comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1181.558946] type=1400 audit(1344099069.145:37): apparmor="DENIED" operation="mknod" parent=4177 profile="/usr/sbin/named" name="/var/log/query.log" pid=4182 comm="named" requested_mask="c" denied_mask="c" fsuid=0 ouid=0 [ 1199.349265] type=1400 audit(1344099086.937:38): apparmor="DENIED" operation="mknod" parent=4191 profile="/usr/sbin/named" name="/var/log/query.log" pid=4196 comm="named" requested_mask="c" denied_mask="c" fsuid=0 ouid=0 [ 1296.805604] type=1400 audit(1344099184.393:39): apparmor="DENIED" operation="mknod" parent=4232 profile="/usr/sbin/named" name="/var/log/query.log" pid=4237 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1317.730568] type=1400 audit(1344099205.317:40): apparmor="DENIED" operation="mknod" parent=3325 profile="/usr/sbin/named" name="/tmp-nuBes0IXwi" pid=4251 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1317.730744] type=1400 audit(1344099205.317:41): apparmor="DENIED" operation="mknod" parent=3325 profile="/usr/sbin/named" name="/tmp-ZDJA06ZOkU" pid=4252 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1365.072687] type=1400 audit(1344099252.661:42): apparmor="DENIED" operation="mknod" parent=3325 profile="/usr/sbin/named" name="/tmp-EnsuYUrGOC" pid=4290 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1365.074520] type=1400 audit(1344099252.661:43): apparmor="DENIED" operation="mknod" parent=3325 profile="/usr/sbin/named" name="/tmp-LVCnpWOStP" pid=4287 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1380.336984] type=1400 audit(1344099267.925:44): apparmor="DENIED" operation="mknod" parent=4617 profile="/usr/sbin/named" name="/var/log/query.log" pid=4622 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1437.924534] type=1400 audit(1344099325.513:45): apparmor="DENIED" operation="mknod" parent=3325 profile="/usr/sbin/named" name="/tmp-Uyf1dHIZUU" pid=4648 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1437.924626] type=1400 audit(1344099325.513:46): apparmor="DENIED" operation="mknod" parent=3325 profile="/usr/sbin/named" name="/tmp-OABXWclII3" pid=4647 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1526.334959] type=1400 audit(1344099413.921:47): apparmor="DENIED" operation="mknod" parent=4749 profile="/usr/sbin/named" name="/var/log/query.log" pid=4754 comm="named" requested_mask="c" denied_mask="c" fsuid=0 ouid=0 [ 1601.292548] type=1400 audit(1344099488.881:48): apparmor="DENIED" operation="mknod" parent=4835 profile="/usr/sbin/named" name="/var/log/query.log" pid=4840 comm="named" requested_mask="c" denied_mask="c" fsuid=0 ouid=0 [ 1639.543733] type=1400 audit(1344099527.129:49): apparmor="DENIED" operation="mknod" parent=4905 profile="/usr/sbin/named" name="/var/log/query.log" pid=4907 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1916.381179] type=1400 audit(1344099803.969:50): apparmor="DENIED" operation="mknod" parent=4959 profile="/usr/sbin/named" name="/var/log/query.log" pid=4961 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 1940.816898] type=1400 audit(1344099828.405:51): apparmor="DENIED" operation="mknod" parent=4991 profile="/usr/sbin/named" name="/var/log/query.log" pid=4996 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 2043.010898] type=1400 audit(1344099930.597:52): apparmor="DENIED" operation="mknod" parent=5048 
profile="/usr/sbin/named" name="/var/log/query.log" pid=5053 comm="named" requested_mask="c" denied_mask="c" fsuid=107 ouid=107 [ 2084.956230] type=1400 audit(1344099972.545:53): apparmor="DENIED" operation="mknod" parent=3325 profile="/usr/sbin/named" name="/var/log/tmp-XYgr33RqUt" pid=5069 comm="named" requested_mask="c" denied_mask="c" fsuid=0 ouid=0 [ 2084.959120] type=1400 audit(1344099972.545:54): apparmor="DENIED" operation="mknod" parent=3325 profile="/usr/sbin/named" name="/var/log/tmp-vO24RHwL14" pid=5066 comm="named" requested_mask="c" denied_mask="c" fsuid=0 ouid=0 [ 2088.169500] type=1400 audit(1344099975.757:55): apparmor="DENIED" operation="mknod" parent=5076 profile="/usr/sbin/named" name="/var/log/query.log" pid=5078 comm="named" requested_mask="c" denied_mask="c" fsuid=0 ouid=0 [ 2165.625096] type=1400 audit(1344100053.213:56): apparmor="STATUS" operation="profile_remove" name="/sbin/dhclient" pid=5124 comm="apparmor" [ 2165.625401] type=1400 audit(1344100053.213:57): apparmor="STATUS" operation="profile_remove" name="/usr/lib/NetworkManager/nm-dhcp-client.action" pid=5124 comm="apparmor" [ 2165.625608] type=1400 audit(1344100053.213:58): apparmor="STATUS" operation="profile_remove" name="/usr/lib/connman/scripts/dhclient-script" pid=5124 comm="apparmor" [ 2165.625782] type=1400 audit(1344100053.213:59): apparmor="STATUS" operation="profile_remove" name="/usr/lib/libvirt/virt-aa-helper" pid=5124 comm="apparmor" [ 2165.625931] type=1400 audit(1344100053.213:60): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/libvirtd" pid=5124 comm="apparmor" [ 2165.626057] type=1400 audit(1344100053.213:61): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/mysqld" pid=5124 comm="apparmor" [ 2165.626181] type=1400 audit(1344100053.213:62): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/named" pid=5124 comm="apparmor" [ 2165.626319] type=1400 audit(1344100053.213:63): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/tcpdump" pid=5124 comm="apparmor" [ 3709.583927] type=1400 audit(1344101597.169:64): apparmor="STATUS" operation="profile_load" name="/usr/sbin/libvirtd" pid=7484 comm="apparmor_parser" [ 3709.839895] type=1400 audit(1344101597.425:65): apparmor="STATUS" operation="profile_load" name="/usr/sbin/mysqld" pid=7485 comm="apparmor_parser" [ 3710.008892] type=1400 audit(1344101597.597:66): apparmor="STATUS" operation="profile_load" name="/usr/lib/libvirt/virt-aa-helper" pid=7483 comm="apparmor_parser" [ 3710.545232] type=1400 audit(1344101598.133:67): apparmor="STATUS" operation="profile_load" name="/usr/sbin/named" pid=7486 comm="apparmor_parser" [ 3710.655600] type=1400 audit(1344101598.241:68): apparmor="STATUS" operation="profile_load" name="/sbin/dhclient" pid=7481 comm="apparmor_parser" [ 3710.656013] type=1400 audit(1344101598.241:69): apparmor="STATUS" operation="profile_load" name="/usr/lib/NetworkManager/nm-dhcp-client.action" pid=7481 comm="apparmor_parser" [ 3710.656786] type=1400 audit(1344101598.245:70): apparmor="STATUS" operation="profile_load" name="/usr/lib/connman/scripts/dhclient-script" pid=7481 comm="apparmor_parser" [ 3710.832624] type=1400 audit(1344101598.421:71): apparmor="STATUS" operation="profile_load" name="/usr/sbin/tcpdump" pid=7488 comm="apparmor_parser" [ 3717.573123] type=1400 audit(1344101605.161:72): apparmor="DENIED" operation="open" parent=7505 profile="/usr/sbin/named" name="/var/log/query.log" pid=7510 comm="named" requested_mask="ac" denied_mask="ac" fsuid=107 ouid=0 [ 3743.667808] type=1400 
audit(1344101631.253:73): apparmor="STATUS" operation="profile_remove" name="/sbin/dhclient" pid=7552 comm="apparmor" [ 3743.668338] type=1400 audit(1344101631.257:74): apparmor="STATUS" operation="profile_remove" name="/usr/lib/NetworkManager/nm-dhcp-client.action" pid=7552 comm="apparmor" [ 3743.668625] type=1400 audit(1344101631.257:75): apparmor="STATUS" operation="profile_remove" name="/usr/lib/connman/scripts/dhclient-script" pid=7552 comm="apparmor" [ 3743.668834] type=1400 audit(1344101631.257:76): apparmor="STATUS" operation="profile_remove" name="/usr/lib/libvirt/virt-aa-helper" pid=7552 comm="apparmor" [ 3743.668991] type=1400 audit(1344101631.257:77): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/libvirtd" pid=7552 comm="apparmor" [ 3743.669127] type=1400 audit(1344101631.257:78): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/mysqld" pid=7552 comm="apparmor" [ 3743.669282] type=1400 audit(1344101631.257:79): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/named" pid=7552 comm="apparmor" [ 3743.669520] type=1400 audit(1344101631.257:80): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/tcpdump" pid=7552 comm="apparmor" [ 3873.572336] type=1400 audit(1344101761.161:81): apparmor="STATUS" operation="profile_load" name="/usr/sbin/libvirtd" pid=7722 comm="apparmor_parser" [ 3873.826209] type=1400 audit(1344101761.413:82): apparmor="STATUS" operation="profile_load" name="/usr/sbin/mysqld" pid=7723 comm="apparmor_parser" [ 3873.988181] type=1400 audit(1344101761.577:83): apparmor="STATUS" operation="profile_load" name="/usr/lib/libvirt/virt-aa-helper" pid=7721 comm="apparmor_parser" [ 3874.520305] type=1400 audit(1344101762.109:84): apparmor="STATUS" operation="profile_load" name="/sbin/dhclient" pid=7719 comm="apparmor_parser" [ 3874.520736] type=1400 audit(1344101762.109:85): apparmor="STATUS" operation="profile_load" name="/usr/lib/NetworkManager/nm-dhcp-client.action" pid=7719 comm="apparmor_parser" [ 3874.521000] type=1400 audit(1344101762.109:86): apparmor="STATUS" operation="profile_load" name="/usr/lib/connman/scripts/dhclient-script" pid=7719 comm="apparmor_parser" [ 3874.528878] type=1400 audit(1344101762.117:87): apparmor="STATUS" operation="profile_load" name="/usr/sbin/named" pid=7724 comm="apparmor_parser" [ 3874.930712] type=1400 audit(1344101762.517:88): apparmor="STATUS" operation="profile_load" name="/usr/sbin/tcpdump" pid=7726 comm="apparmor_parser" [ 3971.744599] type=1400 audit(1344101859.333:89): apparmor="STATUS" operation="profile_replace" name="/usr/sbin/libvirtd" pid=7899 comm="apparmor_parser" [ 3972.009857] type=1400 audit(1344101859.597:90): apparmor="STATUS" operation="profile_replace" name="/usr/sbin/mysqld" pid=7900 comm="apparmor_parser" [ 3972.165297] type=1400 audit(1344101859.753:91): apparmor="STATUS" operation="profile_replace" name="/usr/lib/libvirt/virt-aa-helper" pid=7898 comm="apparmor_parser" [ 3972.587766] type=1400 audit(1344101860.173:92): apparmor="STATUS" operation="profile_replace" name="/usr/sbin/named" pid=7901 comm="apparmor_parser" [ 3972.847189] type=1400 audit(1344101860.433:93): apparmor="STATUS" operation="profile_replace" name="/sbin/dhclient" pid=7896 comm="apparmor_parser" [ 3972.847705] type=1400 audit(1344101860.433:94): apparmor="STATUS" operation="profile_replace" name="/usr/lib/NetworkManager/nm-dhcp-client.action" pid=7896 comm="apparmor_parser" [ 3972.848150] type=1400 audit(1344101860.433:95): apparmor="STATUS" operation="profile_replace" 
name="/usr/lib/connman/scripts/dhclient-script" pid=7896 comm="apparmor_parser" [ 3973.147889] type=1400 audit(1344101860.733:96): apparmor="STATUS" operation="profile_replace" name="/usr/sbin/tcpdump" pid=7903 comm="apparmor_parser" [ 3988.863999] type=1400 audit(1344101876.449:97): apparmor="DENIED" operation="open" parent=7939 profile="/usr/sbin/named" name="/var/log/query.log" pid=7944 comm="named" requested_mask="ac" denied_mask="ac" fsuid=107 ouid=0 [ 4025.826132] type=1400 audit(1344101913.413:98): apparmor="STATUS" operation="profile_remove" name="/sbin/dhclient" pid=7975 comm="apparmor" [ 4025.826627] type=1400 audit(1344101913.413:99): apparmor="STATUS" operation="profile_remove" name="/usr/lib/NetworkManager/nm-dhcp-client.action" pid=7975 comm="apparmor" [ 4025.826861] type=1400 audit(1344101913.413:100): apparmor="STATUS" operation="profile_remove" name="/usr/lib/connman/scripts/dhclient-script" pid=7975 comm="apparmor" [ 4025.827059] type=1400 audit(1344101913.413:101): apparmor="STATUS" operation="profile_remove" name="/usr/lib/libvirt/virt-aa-helper" pid=7975 comm="apparmor" [ 4025.827214] type=1400 audit(1344101913.413:102): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/libvirtd" pid=7975 comm="apparmor" [ 4025.827352] type=1400 audit(1344101913.413:103): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/mysqld" pid=7975 comm="apparmor" [ 4025.827485] type=1400 audit(1344101913.413:104): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/named" pid=7975 comm="apparmor" [ 4025.827624] type=1400 audit(1344101913.413:105): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/tcpdump" pid=7975 comm="apparmor" [ 4027.862198] type=1400 audit(1344101915.449:106): apparmor="STATUS" operation="profile_load" name="/usr/sbin/libvirtd" pid=8090 comm="apparmor_parser" [ 4039.500920] audit_printk_skb: 21 callbacks suppressed [ 4039.500932] type=1400 audit(1344101927.089:114): apparmor="STATUS" operation="profile_remove" name="/sbin/dhclient" pid=8114 comm="apparmor" [ 4039.501413] type=1400 audit(1344101927.089:115): apparmor="STATUS" operation="profile_remove" name="/usr/lib/NetworkManager/nm-dhcp-client.action" pid=8114 comm="apparmor" [ 4039.501672] type=1400 audit(1344101927.089:116): apparmor="STATUS" operation="profile_remove" name="/usr/lib/connman/scripts/dhclient-script" pid=8114 comm="apparmor" [ 4039.501861] type=1400 audit(1344101927.089:117): apparmor="STATUS" operation="profile_remove" name="/usr/lib/libvirt/virt-aa-helper" pid=8114 comm="apparmor" [ 4039.502033] type=1400 audit(1344101927.089:118): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/libvirtd" pid=8114 comm="apparmor" [ 4039.502170] type=1400 audit(1344101927.089:119): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/mysqld" pid=8114 comm="apparmor" [ 4039.502305] type=1400 audit(1344101927.089:120): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/named" pid=8114 comm="apparmor" [ 4039.502442] type=1400 audit(1344101927.089:121): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/tcpdump" pid=8114 comm="apparmor" [ 4041.425405] type=1400 audit(1344101929.013:122): apparmor="STATUS" operation="profile_load" name="/usr/lib/libvirt/virt-aa-helper" pid=8240 comm="apparmor_parser" [ 4041.425952] type=1400 audit(1344101929.013:123): apparmor="STATUS" operation="profile_load" name="/sbin/dhclient" pid=8238 comm="apparmor_parser" [ 4058.910390] audit_printk_skb: 18 callbacks suppressed [ 4058.910401] type=1400 
audit(1344101946.497:130): apparmor="STATUS" operation="profile_remove" name="/sbin/dhclient" pid=8264 comm="apparmor" [ 4058.910757] type=1400 audit(1344101946.497:131): apparmor="STATUS" operation="profile_remove" name="/usr/lib/NetworkManager/nm-dhcp-client.action" pid=8264 comm="apparmor" [ 4058.910969] type=1400 audit(1344101946.497:132): apparmor="STATUS" operation="profile_remove" name="/usr/lib/connman/scripts/dhclient-script" pid=8264 comm="apparmor" [ 4058.911185] type=1400 audit(1344101946.497:133): apparmor="STATUS" operation="profile_remove" name="/usr/lib/libvirt/virt-aa-helper" pid=8264 comm="apparmor" [ 4058.911335] type=1400 audit(1344101946.497:134): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/libvirtd" pid=8264 comm="apparmor" [ 4058.911595] type=1400 audit(1344101946.497:135): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/mysqld" pid=8264 comm="apparmor" [ 4058.911856] type=1400 audit(1344101946.497:136): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/named" pid=8264 comm="apparmor" [ 4058.912001] type=1400 audit(1344101946.497:137): apparmor="STATUS" operation="profile_remove" name="/usr/sbin/tcpdump" pid=8264 comm="apparmor" [ 4060.266700] type=1400 audit(1344101947.853:138): apparmor="STATUS" operation="profile_load" name="/sbin/dhclient" pid=8391 comm="apparmor_parser" [ 4060.268356] type=1400 audit(1344101947.857:139): apparmor="STATUS" operation="profile_load" name="/usr/lib/NetworkManager/nm-dhcp-client.action" pid=8391 comm="apparmor_parser" [ 5909.432749] audit_printk_skb: 18 callbacks suppressed [ 5909.432759] type=1400 audit(1344103797.021:146): apparmor="DENIED" operation="open" parent=8800 profile="/usr/sbin/named" name="/var/log/query.log" pid=8805 comm="named" requested_mask="ac" denied_mask="ac" fsuid=107 ouid=0 root@zotac:~# What can I do that it still works and I don't have to disable apparmor ?
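    Reading the denials, every one of them is named being refused permission to create /var/log/query.log, so presumably the profile for /usr/sbin/named needs an explicit rule for that file. Something along these lines is what I suspect is missing (an untested sketch; the path comes from the log output and the local override file is the usual Ubuntu convention):
    
    # /etc/apparmor.d/local/usr.sbin.named -- local additions to the named profile
    /var/log/query.log rw,
    
    # then reload the profile so the change takes effect
    sudo apparmor_parser -r /etc/apparmor.d/usr.sbin.named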

    Read the article

  • Linq 2 SQL using base class and WCF

    - by Gena Verdel
    Hi all. I have the following problem: I'm using L2S for generating entity classes. All these classes share the same property ID which is autonumber. So I figured to put this property to base class and extend all entity classes from the base one. In order to be able to read the value I'm using the override modifier on this property in each and every entity class. Up to now it's live and kicking. Then I decided to introduce another tier - services using WCF approach. I've modified the Serialization mode to Unidirectional (and added the IsReference=true attribute to enable two directions), also added [DataContract] attribute to the BaseObject class. WCF is able to transport the whole object but one property , which is ID. Applying [DataMember] attribute on ID property at the base class resulted in nothing. Am I missing something? Is what I'm trying to achieve possible at all? [DataContract()] abstract public class BaseObject : IIccObject public virtual long ID { get; set; } [Table(Name="dbo.Blocks")] [DataContract(IsReference=true)] public partial class Block : INotifyPropertyChanging, INotifyPropertyChanged { private static PropertyChangingEventArgs emptyChangingEventArgs = new PropertyChangingEventArgs(String.Empty); private long _ID; private int _StatusID; private string _Name; private bool _IsWithControlPoints; private long _DivisionID; private string _SHAPE; private EntitySet<BlockByWorkstation> _BlockByWorkstations; private EntitySet<PlanningPointAppropriation> _PlanningPointAppropriations; private EntitySet<Neighbor> _Neighbors; private EntitySet<Neighbor> _Neighbors1; private EntitySet<Task> _Tasks; private EntitySet<PlanningPointByBlock> _PlanningPointByBlocks; private EntityRef<Division> _Division; private bool serializing; #region Extensibility Method Definitions partial void OnLoaded(); partial void OnValidate(System.Data.Linq.ChangeAction action); partial void OnCreated(); partial void OnIDChanging(long value); partial void OnIDChanged(); partial void OnStatusIDChanging(int value); partial void OnStatusIDChanged(); partial void OnNameChanging(string value); partial void OnNameChanged(); partial void OnIsWithControlPointsChanging(bool value); partial void OnIsWithControlPointsChanged(); partial void OnDivisionIDChanging(long value); partial void OnDivisionIDChanged(); partial void OnSHAPEChanging(string value); partial void OnSHAPEChanged(); #endregion public Block() { this.Initialize(); } [Column(Storage="_ID", AutoSync=AutoSync.OnInsert, DbType="BigInt NOT NULL IDENTITY", IsPrimaryKey=true, IsDbGenerated=true)] [DataMember(Order=1)] public override long ID { get { return this._ID; } set { if ((this._ID != value)) { this.OnIDChanging(value); this.SendPropertyChanging(); this._ID = value; this.SendPropertyChanged("ID"); this.OnIDChanged(); } } } [Column(Storage="_StatusID", DbType="Int NOT NULL")] [DataMember(Order=2)] public int StatusID { get { return this._StatusID; } set { if ((this._StatusID != value)) { this.OnStatusIDChanging(value); this.SendPropertyChanging(); this._StatusID = value; this.SendPropertyChanged("StatusID"); this.OnStatusIDChanged(); } } } [Column(Storage="_Name", DbType="NVarChar(255)")] [DataMember(Order=3)] public string Name { get { return this._Name; } set { if ((this._Name != value)) { this.OnNameChanging(value); this.SendPropertyChanging(); this._Name = value; this.SendPropertyChanged("Name"); this.OnNameChanged(); } } } [Column(Storage="_IsWithControlPoints", DbType="Bit NOT NULL")] [DataMember(Order=4)] public bool IsWithControlPoints { get { 
return this._IsWithControlPoints; } set { if ((this._IsWithControlPoints != value)) { this.OnIsWithControlPointsChanging(value); this.SendPropertyChanging(); this._IsWithControlPoints = value; this.SendPropertyChanged("IsWithControlPoints"); this.OnIsWithControlPointsChanged(); } } } [Column(Storage="_DivisionID", DbType="BigInt NOT NULL")] [DataMember(Order=5)] public long DivisionID { get { return this._DivisionID; } set { if ((this._DivisionID != value)) { if (this._Division.HasLoadedOrAssignedValue) { throw new System.Data.Linq.ForeignKeyReferenceAlreadyHasValueException(); } this.OnDivisionIDChanging(value); this.SendPropertyChanging(); this._DivisionID = value; this.SendPropertyChanged("DivisionID"); this.OnDivisionIDChanged(); } } } [Column(Storage="_SHAPE", DbType="Text", UpdateCheck=UpdateCheck.Never)] [DataMember(Order=6)] public string SHAPE { get { return this._SHAPE; } set { if ((this._SHAPE != value)) { this.OnSHAPEChanging(value); this.SendPropertyChanging(); this._SHAPE = value; this.SendPropertyChanged("SHAPE"); this.OnSHAPEChanged(); } } } [Association(Name="Block_BlockByWorkstation", Storage="_BlockByWorkstations", ThisKey="ID", OtherKey="BlockID")] [DataMember(Order=7, EmitDefaultValue=false)] public EntitySet<BlockByWorkstation> BlockByWorkstations { get { if ((this.serializing && (this._BlockByWorkstations.HasLoadedOrAssignedValues == false))) { return null; } return this._BlockByWorkstations; } set { this._BlockByWorkstations.Assign(value); } } [Association(Name="Block_PlanningPointAppropriation", Storage="_PlanningPointAppropriations", ThisKey="ID", OtherKey="MasterBlockID")] [DataMember(Order=8, EmitDefaultValue=false)] public EntitySet<PlanningPointAppropriation> PlanningPointAppropriations { get { if ((this.serializing && (this._PlanningPointAppropriations.HasLoadedOrAssignedValues == false))) { return null; } return this._PlanningPointAppropriations; } set { this._PlanningPointAppropriations.Assign(value); } } [Association(Name="Block_Neighbor", Storage="_Neighbors", ThisKey="ID", OtherKey="FirstBlockID")] [DataMember(Order=9, EmitDefaultValue=false)] public EntitySet<Neighbor> Neighbors { get { if ((this.serializing && (this._Neighbors.HasLoadedOrAssignedValues == false))) { return null; } return this._Neighbors; } set { this._Neighbors.Assign(value); } } [Association(Name="Block_Neighbor1", Storage="_Neighbors1", ThisKey="ID", OtherKey="SecondBlockID")] [DataMember(Order=10, EmitDefaultValue=false)] public EntitySet<Neighbor> Neighbors1 { get { if ((this.serializing && (this._Neighbors1.HasLoadedOrAssignedValues == false))) { return null; } return this._Neighbors1; } set { this._Neighbors1.Assign(value); } } [Association(Name="Block_Task", Storage="_Tasks", ThisKey="ID", OtherKey="BlockID")] [DataMember(Order=11, EmitDefaultValue=false)] public EntitySet<Task> Tasks { get { if ((this.serializing && (this._Tasks.HasLoadedOrAssignedValues == false))) { return null; } return this._Tasks; } set { this._Tasks.Assign(value); } } [Association(Name="Block_PlanningPointByBlock", Storage="_PlanningPointByBlocks", ThisKey="ID", OtherKey="BlockID")] [DataMember(Order=12, EmitDefaultValue=false)] public EntitySet<PlanningPointByBlock> PlanningPointByBlocks { get { if ((this.serializing && (this._PlanningPointByBlocks.HasLoadedOrAssignedValues == false))) { return null; } return this._PlanningPointByBlocks; } set { this._PlanningPointByBlocks.Assign(value); } } [Association(Name="Division_Block", Storage="_Division", ThisKey="DivisionID", OtherKey="ID", IsForeignKey=true, 
DeleteOnNull=true, DeleteRule="CASCADE")] public Division Division { get { return this._Division.Entity; } set { Division previousValue = this._Division.Entity; if (((previousValue != value) || (this._Division.HasLoadedOrAssignedValue == false))) { this.SendPropertyChanging(); if ((previousValue != null)) { this._Division.Entity = null; previousValue.Blocks.Remove(this); } this._Division.Entity = value; if ((value != null)) { value.Blocks.Add(this); this._DivisionID = value.ID; } else { this._DivisionID = default(long); } this.SendPropertyChanged("Division"); } } } public event PropertyChangingEventHandler PropertyChanging; public event PropertyChangedEventHandler PropertyChanged; protected virtual void SendPropertyChanging() { if ((this.PropertyChanging != null)) { this.PropertyChanging(this, emptyChangingEventArgs); } } protected virtual void SendPropertyChanged(String propertyName) { if ((this.PropertyChanged != null)) { this.PropertyChanged(this, new PropertyChangedEventArgs(propertyName)); } } private void attach_BlockByWorkstations(BlockByWorkstation entity) { this.SendPropertyChanging(); entity.Block = this; } private void detach_BlockByWorkstations(BlockByWorkstation entity) { this.SendPropertyChanging(); entity.Block = null; } private void attach_PlanningPointAppropriations(PlanningPointAppropriation entity) { this.SendPropertyChanging(); entity.Block = this; } private void detach_PlanningPointAppropriations(PlanningPointAppropriation entity) { this.SendPropertyChanging(); entity.Block = null; } private void attach_Neighbors(Neighbor entity) { this.SendPropertyChanging(); entity.FirstBlock = this; } private void detach_Neighbors(Neighbor entity) { this.SendPropertyChanging(); entity.FirstBlock = null; } private void attach_Neighbors1(Neighbor entity) { this.SendPropertyChanging(); entity.SecondBlock = this; } private void detach_Neighbors1(Neighbor entity) { this.SendPropertyChanging(); entity.SecondBlock = null; } private void attach_Tasks(Task entity) { this.SendPropertyChanging(); entity.Block = this; } private void detach_Tasks(Task entity) { this.SendPropertyChanging(); entity.Block = null; } private void attach_PlanningPointByBlocks(PlanningPointByBlock entity) { this.SendPropertyChanging(); entity.Block = this; } private void detach_PlanningPointByBlocks(PlanningPointByBlock entity) { this.SendPropertyChanging(); entity.Block = null; } private void Initialize() { this._BlockByWorkstations = new EntitySet<BlockByWorkstation>(new Action<BlockByWorkstation>(this.attach_BlockByWorkstations), new Action<BlockByWorkstation>(this.detach_BlockByWorkstations)); this._PlanningPointAppropriations = new EntitySet<PlanningPointAppropriation>(new Action<PlanningPointAppropriation>(this.attach_PlanningPointAppropriations), new Action<PlanningPointAppropriation>(this.detach_PlanningPointAppropriations)); this._Neighbors = new EntitySet<Neighbor>(new Action<Neighbor>(this.attach_Neighbors), new Action<Neighbor>(this.detach_Neighbors)); this._Neighbors1 = new EntitySet<Neighbor>(new Action<Neighbor>(this.attach_Neighbors1), new Action<Neighbor>(this.detach_Neighbors1)); this._Tasks = new EntitySet<Task>(new Action<Task>(this.attach_Tasks), new Action<Task>(this.detach_Tasks)); this._PlanningPointByBlocks = new EntitySet<PlanningPointByBlock>(new Action<PlanningPointByBlock>(this.attach_PlanningPointByBlocks), new Action<PlanningPointByBlock>(this.detach_PlanningPointByBlocks)); this._Division = default(EntityRef<Division>); OnCreated(); } [OnDeserializing()] 
[System.ComponentModel.EditorBrowsableAttribute(EditorBrowsableState.Never)] public void OnDeserializing(StreamingContext context) { this.Initialize(); } [OnSerializing()] [System.ComponentModel.EditorBrowsableAttribute(EditorBrowsableState.Never)] public void OnSerializing(StreamingContext context) { this.serializing = true; } [OnSerialized()] [System.ComponentModel.EditorBrowsableAttribute(EditorBrowsableState.Never)] public void OnSerialized(StreamingContext context) { this.serializing = false; } }

    Read the article

  • Installing UCMA 3.0 and Creating a Communications Server "14" Trusted Application Pool

    A lot of setup and administration tasks have gotten a lot easier in Communications Server 14; one of them is building an application server to develop and run your UCMA 3.0 applications on. In this post, I'll walk you through installing the UCMA 3.0 Core SDK and creating a Trusted Application Pool on the server, thus adding it to the Communications Server 14 topology and allowing you to host and run UCMA 3.0 applications on it. Note: These instructions will change slightly as the bits get updated for the eventual Beta release; I will update this post as soon as I get a chance to run this setup on a more recent build. I'm doing the install on a simple Communications Server 14 topology consisting of the following Windows Server 2008 R2 Hyper-V images: DC (Domain Controller), ExchangeUM (Exchange Server 2010), CS-SE (Microsoft Communications Server 2010 Standard Edition), and TS (development machine). I'll walk through setting up UCMA 3.0 on the TS VM, which is a fully patched Windows Server 2008 R2 machine that is joined to the Fabrikam domain. I'm also running Visual Studio 2010 on this VM because I intend to use it as a development machine. In a future post, I'll walk through installing just the UCMA 3.0 runtime to build a true production UCMA application server. I'm making a couple of assumptions here: you have an existing CS 2010 site and cluster configured (we'll look at this in a future post), you're starting with a fully patched Windows Server 2008 R2 machine, and the machine is joined to your domain. This walkthrough was done in my Fabrikam VM environment but can easily be modified for your own environment. Installing the UCMA 3.0 SDK Let's start by installing the UCMA 3.0 SDK. Run UcmaSdkWebDownload.msi to kick off the SDK installer package extract process. The installer package is extracted to C: >> Program Files >> Microsoft UCMA 3.0 >> SDK Installer Package. Browse there and run setup.exe. Click Install to install the UCMA 3.0 Core SDK and Workflow SDK. Install Communications Server Core Components UCMA 3.0 introduces a new concept called Auto-provisioning, which is most easily explained from the developer point of view. Remember what your app.config looked like in UCMA 2.0? You had to store the application GRUU, the trusted contact SIP Uri, the port for your application, and the name of the certificate authority. That's all gone with auto-provisioning: all you need in your app.config is your ApplicationId, e.g. urn:application:MyApplication. How does CS 2010 do this? All of the application's configuration data is associated with the application's id. UCMA also queries a replicated copy of the Central Management Database to retrieve the application's configuration data and also the configuration data for any endpoints. In this step, we'll run Bootstrapper.exe to install the CS Core components; this checks for the following components and installs them if they are not already present: VcRedist, Sqlexpress, Sqlnativeclient, Sqlbackcompat, Ucmaredist, OcsCore.msi. Open a command window at C: >> Program Files >> Microsoft Communications Server 2010 >> Deployment and run the following command: Bootstrapper.exe /BootstrapReplica /MinCache /SourceDirectory:"%ProgramFiles%\Microsoft UCMA 3.0\SDK Installer Package\Prereq\BootstrapperCache" Create a New Trusted Application Pool The next step is to create a new trusted application pool for the new server.
Fire up the Communications Server Management Shell from Start >> Microsoft Communications Server 2010 >> Communications Server Management Shell and enter the following PowerShell command: New-CsTrustedApplicationPool -Identity <FQDN of Server> -Registrar <FQDN of CS Server> -Site <CS Site Name> Verify that the new server was added to the CS topology by running the following PowerShell command: (Get-CsTopology -AsXml).ToString() > Topology.xml This created a file called Topology.xml in the directory that you ran the command from.  Open the file and find the Clusters section and look for a node for the new server. The Cluster Fqdn is the name of your server, and note the name of the Site that this Cluster is a part of. <Cluster Fqdn="appsrv.fabrikam.com" RequiresReplication="true" RequiresSetup="true"> <ClusterId SiteId="UcMarketing2" Number="5" /> <Machine OrdinalInCluster="1" Fqdn="appsrv.fabrikam.com"> <NetInterface InterfaceSide="Primary" InterfaceNumber="1" IPAddress="0.0.0.0" /> </Machine> </Cluster> Configure CS Management Store Replication At this point, we have the CS Core components installed and the server configured as a trusted application pool.  We now need to set up replication so that the Central Management Store replicates down to the new server. From the Communications Server Management Shell, run the following PowerShell command to enable the Replica service on the new server: Enable-CSReplica The Replica service is enabled, but hasn't done anything yet. This can be verified by running the following PowerShell command to check the replication status for the various servers in the topology: Get-CSManagementStoreReplicationStatus You can see in the screenshot below that the UpToDate property of the new server is still False Run the following PowerShell command to force the replication to run: Invoke-CSManagementStoreReplicationStatus Run Get-CSManagementStoreReplicationStatus again to verify that the new service is now up to date Request and Set a New Certificate The last step in the process is to request a new certificate from the certificate authority on the domain and assign it to the new server. From the Communications Server Management Shell, run the following PowerShell command to request a new certificate: Request-CSCertificate -Action new -Type default -CA <Domain Controller FQDN>\<Certificate Authority> Setting the -Verbose switch on the cmdlet creates an Xml file with its output. Open the Xml file and copy the thumbprint of the generated certificate. 
    <?xml version="1.0" encoding="utf-8"?> <Action Name="Request-CsCertificate" Time="20100512T212258"> <Action Name="Request-CsCertificate" Time="20100512T212258"> <Info Title="Connection" Time="20100512T212258">Data Source=(local)\rtclocal;Initial Catalog=xds;Integrated Security=True</Info> <Action Time="20100512T212258"> <Info Title="Certificate use" Time="20100512T212258">urn:certref:default</Info> <Info Title="Subject distinguished name" Time="20100512T212258">CN="appsrv2.fabrikam.com"</Info> <Info Time="20100512T212259">The certificate request is submitted to the Certification Authority dc.fabrikam.com\FabrikamCA.</Info> <Info Time="20100512T212259">The certificate was issued.</Info> <Info Time="20100512T212259">The certificate was imported with thumbprint AFC3C46E459C1A39AD06247676F3555826DBF705.</Info> <Complete Time="20100512T212259" /> </Action> <Info Title="command status" Time="20100512T212259">Command execution processing completed</Info> <Action Name="DeploymentXdsCmdlet.SaveCachedItems" Time="20100512T212259"> <Info Time="20100512T212259">0 updates</Info> <Complete Time="20100512T212259" /> </Action> <Info Title="command status" Time="20100512T212259">Command has completed</Info> </Action> </Action> Run the following PowerShell command to set the certificate: Set-CsCertificate -Type Default -Thumbprint <Thumbprint> Wrapping Up You now have a new UCMA 3.0 application server in your Communications Server 2010 server topology.  You can provision trusted applications and trusted application endpoints on the new server using the Communications Server 2010 Management Shell.  We'll take a look at how to do that in another post.
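    As a rough preview of that follow-up post, provisioning against the new pool is also done from the Management Shell; a sketch of what the commands could look like follows (the application id, port, and SIP address below are made-up values, and cmdlet parameters may differ slightly between builds):
    
    New-CsTrustedApplication -ApplicationId MyUcmaApp -TrustedApplicationPoolFqdn appsrv.fabrikam.com -Port 6000
    New-CsTrustedApplicationEndpoint -ApplicationId urn:application:MyUcmaApp -TrustedApplicationPoolFqdn appsrv.fabrikam.com -SipAddress sip:myucmaapp@fabrikam.com
    Enable-CsTopology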

    Read the article

  • ASP.NET MVC ‘Extendable-hooks’ – ControllerActionInvoker class

    - by nmarun
    There’s a class ControllerActionInvoker in ASP.NET MVC. This can be used as one of an hook-points to allow customization of your application. Watching Brad Wilsons’ Advanced MP3 from MVC Conf inspired me to write about this class. What MSDN says: “Represents a class that is responsible for invoking the action methods of a controller.” Well if MSDN says it, I think I can instill a fair amount of confidence into what the class does. But just to get to the details, I also looked into the source code for MVC. Seems like the base class Controller is where an IActionInvoker is initialized: 1: protected virtual IActionInvoker CreateActionInvoker() { 2: return new ControllerActionInvoker(); 3: } In the ControllerActionInvoker (the O-O-B behavior), there are different ‘versions’ of InvokeActionMethod() method that actually call the action method in question and return an instance of type ActionResult. 1: protected virtual ActionResult InvokeActionMethod(ControllerContext controllerContext, ActionDescriptor actionDescriptor, IDictionary<string, object> parameters) { 2: object returnValue = actionDescriptor.Execute(controllerContext, parameters); 3: ActionResult result = CreateActionResult(controllerContext, actionDescriptor, returnValue); 4: return result; 5: } I guess that’s enough on the ‘behind-the-screens’ of this class. Let’s see how we can use this class to hook-up extensions. Say I have a requirement that the user should be able to get different renderings of the same output, like html, xml, json, csv and so on. The user will type-in the output format in the url and should the get result accordingly. For example: http://site.com/RenderAs/ – renders the default way (the razor view) http://site.com/RenderAs/xml http://site.com/RenderAs/csv … and so on where RenderAs is my controller. There are many ways of doing this and I’m using a custom ControllerActionInvoker class (even though this might not be the best way to accomplish this). For this, my one and only route in the Global.asax.cs is: 1: routes.MapRoute("RenderAsRoute", "RenderAs/{outputType}", 2: new {controller = "RenderAs", action = "Index", outputType = ""}); Here the controller name is ‘RenderAsController’ and the action that’ll get called (always) is the Index action. The outputType parameter will map to the type of output requested by the user (xml, csv…). I intend to display a list of food items for this example. 1: public class Item 2: { 3: public int Id { get; set; } 4: public string Name { get; set; } 5: public Cuisine Cuisine { get; set; } 6: } 7:  8: public class Cuisine 9: { 10: public int CuisineId { get; set; } 11: public string Name { get; set; } 12: } Coming to my ‘RenderAsController’ class. I generate an IList<Item> to represent my model. 1: private static IList<Item> GetItems() 2: { 3: Cuisine cuisine = new Cuisine { CuisineId = 1, Name = "Italian" }; 4: Item item = new Item { Id = 1, Name = "Lasagna", Cuisine = cuisine }; 5: IList<Item> items = new List<Item> { item }; 6: item = new Item {Id = 2, Name = "Pasta", Cuisine = cuisine}; 7: items.Add(item); 8: //... 9: return items; 10: } My action method looks like 1: public IList<Item> Index(string outputType) 2: { 3: return GetItems(); 4: } There are two things that stand out in this action method. The first and the most obvious one being that the return type is not of type ActionResult (or one of its derivatives). Instead I’m passing the type of the model itself (IList<Item> in this case). 
We’ll convert this to some type of an ActionResult in our custom controller action invoker class later. The second thing (a little subtle) is that I’m not doing anything with the outputType value that is passed on to this action method. This value will be in the RouteData dictionary and we’ll use this in our custom invoker class as well. It’s time to hook up our invoker class. First, I’ll override the Initialize() method of my RenderAsController class. 1: protected override void Initialize(RequestContext requestContext) 2: { 3: base.Initialize(requestContext); 4: string outputType = string.Empty; 5:  6: // read the outputType from the RouteData dictionary 7: if (requestContext.RouteData.Values["outputType"] != null) 8: { 9: outputType = requestContext.RouteData.Values["outputType"].ToString(); 10: } 11:  12: // my custom invoker class 13: ActionInvoker = new ContentRendererActionInvoker(outputType); 14: } Coming to the main part of the discussion – the ContentRendererActionInvoker class: 1: public class ContentRendererActionInvoker : ControllerActionInvoker 2: { 3: private readonly string _outputType; 4:  5: public ContentRendererActionInvoker(string outputType) 6: { 7: _outputType = outputType.ToLower(); 8: } 9: //... 10: } So the outputType value that was read from the RouteData, which was passed in from the url, is being set here in  a private field. Moving to the crux of this article, I now override the CreateActionResult method. 1: protected override ActionResult CreateActionResult(ControllerContext controllerContext, ActionDescriptor actionDescriptor, object actionReturnValue) 2: { 3: if (actionReturnValue == null) 4: return new EmptyResult(); 5:  6: ActionResult result = actionReturnValue as ActionResult; 7: if (result != null) 8: return result; 9:  10: // This is where the magic happens 11: // Depending on the value in the _outputType field, 12: // return an appropriate ActionResult 13: switch (_outputType) 14: { 15: case "json": 16: { 17: JavaScriptSerializer serializer = new JavaScriptSerializer(); 18: string json = serializer.Serialize(actionReturnValue); 19: return new ContentResult { Content = json, ContentType = "application/json" }; 20: } 21: case "xml": 22: { 23: XmlSerializer serializer = new XmlSerializer(actionReturnValue.GetType()); 24: using (StringWriter writer = new StringWriter()) 25: { 26: serializer.Serialize(writer, actionReturnValue); 27: return new ContentResult { Content = writer.ToString(), ContentType = "text/xml" }; 28: } 29: } 30: case "csv": 31: controllerContext.HttpContext.Response.AddHeader("Content-Disposition", "attachment; filename=items.csv"); 32: return new ContentResult 33: { 34: Content = ToCsv(actionReturnValue as IList<Item>), 35: ContentType = "application/ms-excel" 36: }; 37: case "pdf": 38: string filePath = controllerContext.HttpContext.Server.MapPath("~/items.pdf"); 39: controllerContext.HttpContext.Response.AddHeader("content-disposition", 40: "attachment; filename=items.pdf"); 41: ToPdf(actionReturnValue as IList<Item>, filePath); 42: return new FileContentResult(StreamFile(filePath), "application/pdf"); 43:  44: default: 45: controllerContext.Controller.ViewData.Model = actionReturnValue; 46: return new ViewResult 47: { 48: TempData = controllerContext.Controller.TempData, 49: ViewData = controllerContext.Controller.ViewData 50: }; 51: } 52: } A big method there! The hook I was talking about kinda above actually is here. This is where different kinds / formats of output get returned based on the output type requested in the url. 
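    One helper the listing above calls but does not show is ToCsv, used in the csv branch (ToPdf is similar, but goes through iTextSharp). A minimal sketch of what such a helper could look like, assuming the Item/Cuisine model defined earlier and a using for System.Text:
    
    // Hypothetical helper - not part of the original listing.
    // Flattens the example Item list into a simple comma-separated string.
    private static string ToCsv(IList<Item> items)
    {
        StringBuilder csv = new StringBuilder();
        csv.AppendLine("Id,Name,Cuisine");
        foreach (Item item in items)
        {
            csv.AppendLine(String.Format("{0},{1},{2}", item.Id, item.Name, item.Cuisine.Name));
        }
        return csv.ToString();
    }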
    When the _outputType is not set (string.Empty as set in the Global.asax.cs file), the razor view gets rendered (lines 45-50). This is the default behavior in most MVC applications, wherein a view (webform/razor) gets rendered on the browser. As you see here, this gets returned as a ViewResult. But then, for an outputType of json/xml/csv, a ContentResult gets returned, while for pdf, a FileContentResult is returned. Here is how the different kinds of output look. This is how we can leverage this feature of ASP.NET MVC to develop a better application. I’ve used the iTextSharp library to convert to a pdf format. Mike gives quite a bit of detail regarding this library here. You can download the sample code here. (You’ll get an option to download once you open the link). Verdict: Hot chocolate: $3; Reebok shoes: $50; Your first car: $3000; Being able to extend a web application: Priceless.

    Read the article

  • ASP.NET MVC Validation Complete

    - by Ricardo Peres
    OK, so let’s talk about validation. Most people are probably familiar with the out of the box validation attributes that MVC knows about, from the System.ComponentModel.DataAnnotations namespace, such as EnumDataTypeAttribute, RequiredAttribute, StringLengthAttribute, RangeAttribute, RegularExpressionAttribute and CompareAttribute from the System.Web.Mvc namespace. All of these validators inherit from ValidationAttribute and perform server as well as client-side validation. In order to use them, you must include the JavaScript files MicrosoftMvcValidation.js, jquery.validate.js or jquery.validate.unobtrusive.js, depending on whether you want to use Microsoft’s own library or jQuery. No significant difference exists, but jQuery is more extensible. You can also create your own attribute by inheriting from ValidationAttribute, but, if you want to have client-side behavior, you must also implement IClientValidatable (all of the out of the box validation attributes implement it) and supply your own JavaScript validation function that mimics its server-side counterpart. Of course, you must reference the JavaScript file where the declaration function is. Let’s see an example, validating even numbers. First, the validation attribute: 1: [Serializable] 2: [AttributeUsage(AttributeTargets.Property, AllowMultiple = false, Inherited = true)] 3: public class IsEvenAttribute : ValidationAttribute, IClientValidatable 4: { 5: protected override ValidationResult IsValid(Object value, ValidationContext validationContext) 6: { 7: Int32 v = Convert.ToInt32(value); 8:  9: if (v % 2 == 0) 10: { 11: return (ValidationResult.Success); 12: } 13: else 14: { 15: return (new ValidationResult("Value is not even")); 16: } 17: } 18:  19: #region IClientValidatable Members 20:  21: public IEnumerable<ModelClientValidationRule> GetClientValidationRules(ModelMetadata metadata, ControllerContext context) 22: { 23: yield return (new ModelClientValidationRule() { ValidationType = "iseven", ErrorMessage = "Value is not even" }); 24: } 25:  26: #endregion 27: } The iseven validation function is declared like this in JavaScript, using jQuery validation: 1: jQuery.validator.addMethod('iseven', function (value, element, params) 2: { 3: return ((parseInt(value) % 2) == 0); 4: }); 5:  6: jQuery.validator.unobtrusive.adapters.add('iseven', [], function (options) 7: { 8: options.rules['iseven'] = options.params; 9: options.messages['iseven'] = options.message; 10: }); Do keep in mind that this is a simple example; for instance, we are not using parameters, which may be required for some more advanced scenarios. As a side note, if you implement a custom validator that also requires a JavaScript function, you’ll probably want them together. One way to achieve this is by including the JavaScript file as an embedded resource in the same assembly where the custom attribute is declared.
You do this by having its Build Action set as Embedded Resource inside Visual Studio: Then you have to declare an attribute at assembly level, perhaps in the AssemblyInfo.cs file: 1: [assembly: WebResource("SomeNamespace.IsEven.js", "text/javascript")] In your views, if you want to include a JavaScript file from an embedded resource you can use this code: 1: public static class UrlExtensions 2: { 3: private static readonly MethodInfo getResourceUrlMethod = typeof(AssemblyResourceLoader).GetMethod("GetWebResourceUrlInternal", BindingFlags.NonPublic | BindingFlags.Static); 4:  5: public static IHtmlString Resource<TType>(this UrlHelper url, String resourceName) 6: { 7: return (Resource(url, typeof(TType).Assembly.FullName, resourceName)); 8: } 9:  10: public static IHtmlString Resource(this UrlHelper url, String assemblyName, String resourceName) 11: { 12: String resourceUrl = getResourceUrlMethod.Invoke(null, new Object[] { Assembly.Load(assemblyName), resourceName, false, false, null }).ToString(); 13: return (new HtmlString(resourceUrl)); 14: } 15: } And on the view: 1: <script src="<%: this.Url.Resource("SomeAssembly", "SomeNamespace.IsEven.js") %>" type="text/javascript"></script> Then there’s the CustomValidationAttribute. It allows externalizing your validation logic to another class, so you have to tell which type and method to use. The method can be static as well as instance, if it is instance, the class cannot be abstract and must have a public parameterless constructor. It can be applied to a property as well as a class. It does not, however, support client-side validation. Let’s see an example declaration: 1: [CustomValidation(typeof(ProductValidator), "OnValidateName")] 2: public String Name 3: { 4: get; 5: set; 6: } The validation method needs this signature: 1: public static ValidationResult OnValidateName(String name) 2: { 3: if ((String.IsNullOrWhiteSpace(name) == false) && (name.Length <= 50)) 4: { 5: return (ValidationResult.Success); 6: } 7: else 8: { 9: return (new ValidationResult(String.Format("The name has an invalid value: {0}", name), new String[] { "Name" })); 10: } 11: } Note that it can be either static or instance and it must return a ValidationResult-derived class. ValidationResult.Success is null, so any non-null value is considered a validation error. The single method argument must match the property type to which the attribute is attached to or the class, in case it is applied to a class: 1: [CustomValidation(typeof(ProductValidator), "OnValidateProduct")] 2: public class Product 3: { 4: } The signature must thus be: 1: public static ValidationResult OnValidateProduct(Product product) 2: { 3: } Continuing with attribute-based validation, another possibility is RemoteAttribute. This allows specifying a controller and an action method just for performing the validation of a property or set of properties. This works in a client-side AJAX way and it can be very useful. Let’s see an example, starting with the attribute declaration and proceeding to the action method implementation: 1: [Remote("Validate", "Validation")] 2: public String Username 3: { 4: get; 5: set; 6: } The controller action method must contain an argument that can be bound to the property: 1: public ActionResult Validate(String username) 2: { 3: return (this.Json(true, JsonRequestBehavior.AllowGet)); 4: } If in your result JSON object you include a string instead of the true value, it will consider it as an error, and the validation will fail. 
    This string will be displayed as the error message, if you have included it in your view. You can also use the remote validation approach for validating your entire entity, by including all of its properties as included fields in the attribute and having an action method that receives an entity instead of a single property: 1: [Remote("Validate", "Validation", AdditionalFields = "Price")] 2: public String Name 3: { 4: get; 5: set; 6: } 7:  8: public Decimal Price 9: { 10: get; 11: set; 12: } The action method will then be: 1: public ActionResult Validate(Product product) 2: { 3: return (this.Json("Product is not valid", JsonRequestBehavior.AllowGet)); 4: } Only the property to which the attribute is applied and the additional properties referenced by the AdditionalFields will be populated in the entity instance received by the validation method. The same rule previously stated applies: if you return anything other than true, it will be used as the validation error message for the entity. The remote validation is triggered automatically, but you can also call it explicitly. In the next example, I am triggering full entity validation; see the call to serialize(): 1: function validate() 2: { 3: var form = $('form'); 4: var data = form.serialize(); 5: var url = '<%: this.Url.Action("Validate", "Validation") %>'; 6:  7: var result = $.ajax 8: ( 9: { 10: type: 'POST', 11: url: url, 12: data: data, 13: async: false 14: } 15: ).responseText; 16:  17: if (result) 18: { 19: //error 20: } 21: } Finally, by implementing IValidatableObject, you can implement your validation logic on the object itself, that is, you make it self-validatable. This will only work server-side, that is, the ModelState.IsValid property will be set to false on the controller’s action method if the validation is unsuccessful. Let’s see how to implement it: 1: public class Product : IValidatableObject 2: { 3: public String Name 4: { 5: get; 6: set; 7: } 8:  9: public Decimal Price 10: { 11: get; 12: set; 13: } 14:  15: #region IValidatableObject Members 16: 17: public IEnumerable<ValidationResult> Validate(ValidationContext validationContext) 18: { 19: if ((String.IsNullOrWhiteSpace(this.Name) == true) || (this.Name.Length > 50)) 20: { 21: yield return (new ValidationResult(String.Format("The name has an invalid value: {0}", this.Name), new String[] { "Name" })); 22: } 23: 24: if ((this.Price <= 0) || (this.Price > 100)) 25: { 26: yield return (new ValidationResult(String.Format("The price has an invalid value: {0}", this.Price), new String[] { "Price" })); 27: } 28: } 29: 30: #endregion 31: } The errors returned will be matched against the model properties through the MemberNames property of the ValidationResult class and will be displayed in their proper labels, if present on the view. On the controller action method you can check for model validity by looking at ModelState.IsValid and you can get actual error messages and related properties by examining all of the entries in the ModelState dictionary: 1: Dictionary<String, String> errors = new Dictionary<String, String>(); 2:  3: foreach (KeyValuePair<String, ModelState> keyValue in this.ModelState) 4: { 5: String key = keyValue.Key; 6: ModelState modelState = keyValue.Value; 7:  8: foreach (ModelError error in modelState.Errors) 9: { 10: errors[key] = error.ErrorMessage; 11: } 12: } And these are the ways to perform validation in ASP.NET MVC. Don’t forget to use them!
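    As a quick recap of the attribute-based approach, putting the custom IsEvenAttribute from the start of the post to work is no different from using the built-in attributes; a small illustrative sketch (the Order class and its property are made up):
    
    // Hypothetical model - the custom attribute sits alongside a built-in one.
    public class Order
    {
        [Required]
        [IsEven]
        public Int32 Quantity { get; set; }
    }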

    Read the article

  • C#: Handling Notifications: inheritance, events, or delegates?

    - by James Michael Hare
    Often times as developers we have to design a class where we get notification when certain things happen. In older object-oriented code this would often be implemented by overriding methods -- with events, delegates, and interfaces, however, we have far more elegant options. So, when should you use each of these methods and what are their strengths and weaknesses? Now, for the purposes of this article when I say notification, I'm just talking about ways for a class to let a user know that something has occurred. This can be through any programmatic means such as inheritance, events, delegates, etc. So let's build some context. I'm sitting here thinking about a provider neutral messaging layer for the place I work, and I got to the point where I needed to design the message subscriber which will receive messages from the message bus. Basically, what we want is to be able to create a message listener and have it be called whenever a new message arrives. Now, back before the flood we would have done this via inheritance and an abstract class: 1:  2: // using inheritance - omitting argument null checks and halt logic 3: public abstract class MessageListener 4: { 5: private ISubscriber _subscriber; 6: private bool _isHalted = false; 7: private Thread _messageThread; 8:  9: // assign the subscriber and start the messaging loop 10: public MessageListener(ISubscriber subscriber) 11: { 12: _subscriber = subscriber; 13: _messageThread = new Thread(MessageLoop); 14: _messageThread.Start(); 15: } 16:  17: // user will override this to process their messages 18: protected abstract void OnMessageReceived(Message msg); 19:  20: // handle the looping in the thread 21: private void MessageLoop() 22: { 23: while(!_isHalted) 24: { 25: // as long as processing, wait 1 second for message 26: Message msg = _subscriber.Receive(TimeSpan.FromSeconds(1)); 27: if(msg != null) 28: { 29: OnMessageReceived(msg); 30: } 31: } 32: } 33: ... 34: } It seems so odd to write this kind of code now. Does it feel odd to you? Maybe it's just because I've gotten so used to delegation that I really don't like the feel of this. To me it is akin to saying that if I want to drive my car I need to derive a new instance of it just to put myself in the driver's seat. And yet, unquestionably, five years ago I would have probably written the code as you see above. To me, inheritance is a flawed approach for notifications due to several reasons: Inheritance is one of the HIGHEST forms of coupling. You can't seal the listener class because it depends on sub-classing to work. Because C# does not allow multiple-inheritance, I've spent my one inheritance implementing this class. Every time you need to listen to a bus, you have to derive a class which leads to lots of trivial sub-classes. The act of consuming a message should be a separate responsibility than the act of listening for a message (SRP). Inheritance is such a strong statement (this IS-A that) that it should only be used in building type hierarchies and not for overriding use-specific behaviors and notifications. Chances are, if a class needs to be inherited to be used, it most likely is not designed as well as it could be in today's modern programming languages. So lets look at the other tools available to us for getting notified instead. Here's a few other choices to consider. Have the listener expose a MessageReceived event. Have the listener accept a new IMessageHandler interface instance. Have the listener accept an Action<Message> delegate. 
All three of those options are really just different forms of delegation. Now, .NET events are a bit heavier than the other types of delegates in terms of run-time execution, but they are a great way to allow others using your class to subscribe to your notifications:

    // using an event - omitting argument null checks and halt logic
    public sealed class MessageListener
    {
        private ISubscriber _subscriber;
        private bool _isHalted = false;
        private Thread _messageThread;

        // assign the subscriber and start the messaging loop
        public MessageListener(ISubscriber subscriber)
        {
            _subscriber = subscriber;
            _messageThread = new Thread(MessageLoop);
            _messageThread.Start();
        }

        // users subscribe to this event to process their messages
        public event Action<Message> MessageReceived;

        // handle the looping in the thread
        private void MessageLoop()
        {
            while (!_isHalted)
            {
                // as long as processing, wait 1 second for a message
                Message msg = _subscriber.Receive(TimeSpan.FromSeconds(1));
                if (msg != null && MessageReceived != null)
                {
                    MessageReceived(msg);
                }
            }
        }
    }

Note that now we can seal the class to avoid changes, and the user just needs to provide a message handling method:

    theListener.MessageReceived += CustomReceiveMethod;

However, personally I don't think events hold up as well in this case, because events are largely optional. What is the point of a listener if you create one with no event subscribers? So in my mind, use events when handling the notification is optional.

So how about delegation via an interface? I personally like this method quite a bit. Basically it does something similar to the inheritance method mentioned first, but better, because it makes it easy to split the part of the class that doesn't change (the base listener behavior) from the part that does change (the user-specified action after receiving a message). So assume we had an interface like:

    public interface IMessageHandler
    {
        void OnMessageReceived(Message receivedMessage);
    }

Our listener would look like this:

    // using delegation via interface - omitting argument null checks and halt logic
    public sealed class MessageListener
    {
        private ISubscriber _subscriber;
        private IMessageHandler _handler;
        private bool _isHalted = false;
        private Thread _messageThread;

        // assign the subscriber and start the messaging loop
        public MessageListener(ISubscriber subscriber, IMessageHandler handler)
        {
            _subscriber = subscriber;
            _handler = handler;
            _messageThread = new Thread(MessageLoop);
            _messageThread.Start();
        }

        // handle the looping in the thread
        private void MessageLoop()
        {
            while (!_isHalted)
            {
                // as long as processing, wait 1 second for a message
                Message msg = _subscriber.Receive(TimeSpan.FromSeconds(1));
                if (msg != null)
                {
                    _handler.OnMessageReceived(msg);
                }
            }
        }
    }

Callers use it by creating a class that implements IMessageHandler and passing an instance into the constructor of the listener. I like that this alleviates the issues of inheritance and, unlike events, essentially forces you to provide a handler at construction. This is good, but personally I think we can go one step further. While I like this better than events or inheritance, it still forces you to implement a specific method name. What if that name collides?
Furthermore, if you have lots of these you end up either with large classes implementing many such interfaces just to supply single methods, or with lots of tiny helper classes. Also, if one class wanted to handle messages from two different subscribers differently, it couldn't, because the interface method can't be overloaded per subscriber. This brings me to using delegates directly. In general, every time I think about creating an interface for something, and that interface contains only one method, I start thinking a delegate is a better approach. That said, delegates don't accomplish everything an interface can. Obviously, having the interface allows you to refer to the classes that implement it, which can be very handy. In this case, though, really all you want is a method to handle the messages. So let's look at a method delegate:

    // using delegation via delegate - omitting argument null checks and halt logic
    public sealed class MessageListener
    {
        private ISubscriber _subscriber;
        private Action<Message> _handler;
        private bool _isHalted = false;
        private Thread _messageThread;

        // assign the subscriber and start the messaging loop
        public MessageListener(ISubscriber subscriber, Action<Message> handler)
        {
            _subscriber = subscriber;
            _handler = handler;
            _messageThread = new Thread(MessageLoop);
            _messageThread.Start();
        }

        // handle the looping in the thread
        private void MessageLoop()
        {
            while (!_isHalted)
            {
                // as long as processing, wait 1 second for a message
                Message msg = _subscriber.Receive(TimeSpan.FromSeconds(1));
                if (msg != null)
                {
                    _handler(msg);
                }
            }
        }
    }

Here the MessageListener takes an Action<Message>. For those of you unfamiliar with the pre-defined delegate types in .NET, that is any method with the signature void SomeMethodName(Message). The great thing about delegates is that they give you a lot of power: you can supply an anonymous delegate, a lambda, or any other method, as long as it satisfies the Action<Message> signature. This way you don't need to define an arbitrary helper class or give the method a specific name.

Incidentally, we can combine the interface and delegate approaches to allow maximum flexibility. Doing this, the user can either pass in a delegate or supply a handler interface:

    // using delegation - give users a choice of interface or delegate
    public sealed class MessageListener
    {
        private ISubscriber _subscriber;
        private Action<Message> _handler;
        private bool _isHalted = false;
        private Thread _messageThread;

        // assign the subscriber and start the messaging loop
        public MessageListener(ISubscriber subscriber, Action<Message> handler)
        {
            _subscriber = subscriber;
            _handler = handler;
            _messageThread = new Thread(MessageLoop);
            _messageThread.Start();
        }

        // passes the interface method as a delegate using a method group
        public MessageListener(ISubscriber subscriber, IMessageHandler handler)
            : this(subscriber, handler.OnMessageReceived)
        {
        }

        // handle the looping in the thread
        private void MessageLoop()
        {
            while (!_isHalted)
            {
                // as long as processing, wait 1 second for a message
                Message msg = _subscriber.Receive(TimeSpan.FromSeconds(1));
                if (msg != null)
                {
                    _handler(msg);
                }
            }
        }
    }

This is the method I tend to prefer, because it lets the user of the class choose whichever style works best for them.
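As a quick illustration of that flexibility, a caller can wire the combined listener up either with a lambda or with a handler object. This is only a sketch under the assumptions of the code above; ListenerWiring and LoggingHandler are made-up names, not part of the original article.

    // Hypothetical wiring code; assumes the MessageListener, ISubscriber, Message
    // and IMessageHandler types shown earlier in this article.
    public static class ListenerWiring
    {
        public static void Start(ISubscriber subscriber)
        {
            // Option 1: pass a lambda (anything matching Action<Message>) directly.
            var quickListener = new MessageListener(subscriber,
                msg => Console.WriteLine("Received: " + msg));

            // Option 2: pass an IMessageHandler; the second constructor forwards
            // its OnMessageReceived method as the delegate via a method group.
            var loggingListener = new MessageListener(subscriber, new LoggingHandler());
        }
    }

    // A made-up handler used by Option 2 above.
    public sealed class LoggingHandler : IMessageHandler
    {
        public void OnMessageReceived(Message receivedMessage)
        {
            Console.WriteLine("Logging: " + receivedMessage);
        }
    }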
You may be curious about the actual performance of these different methods:

    Enter iterations:
    1000000

    Inheritance took 4 ms.
    Events took 7 ms.
    Interface delegation took 4 ms.
    Lambda delegate took 5 ms.

Before you get too caught up in the numbers, however, keep in mind that this is the total time over 1,000,000 iterations. Since they all come in under 10 ms, each call boils down to fractions of a microsecond per iteration, so any of them is a fine choice performance-wise. (A sketch of a simple timing harness appears after the guidelines below.) As such, the choice really comes down to what you're trying to do. Here are my guidelines:

- Inheritance should be used only when defining a collection of related types with implementation-specific behaviors; it should not be used as a hook for users to add their own functionality.
- Events should be used when subscription is optional or multi-cast delivery is desired.
- Interface delegation should be used when you wish to refer to implementing classes by the interface type, or when the type requires several methods to be implemented.
- Delegate method delegation should be used when you only need to provide one method and do not need to refer to implementers by an interface name.
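If you want to reproduce rough numbers like the ones above, a simple Stopwatch-based harness is enough. The sketch below is my own illustration rather than the measurement code from the original post; the notify delegate stands in for whichever of the four approaches you are timing.

    using System;
    using System.Diagnostics;

    public static class NotificationBenchmark
    {
        // Times one approach by invoking the supplied notification once per iteration.
        public static void Measure(string name, int iterations, Action notify)
        {
            var watch = Stopwatch.StartNew();
            for (int i = 0; i < iterations; i++)
            {
                notify();
            }
            watch.Stop();
            Console.WriteLine("{0} took {1} ms.", name, watch.ElapsedMilliseconds);
        }
    }

    // Example usage, with a placeholder delegate invocation:
    // NotificationBenchmark.Measure("Lambda delegate", 1000000, () => handler(message));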

    Read the article

  • Enhanced REST Support in Oracle Service Bus 11gR1

    - by jeff.x.davies
In a previous entry on REST and Oracle Service Bus (see http://blogs.oracle.com/jeffdavies/2009/06/restful_services_with_oracle_s_1.html) I encoded the REST query parameters as part of the relative URL. For example, consider the following URI:

    http://localhost:7001/SimpleREST/Products/id=1234

Now, technically there is nothing wrong with this approach. However, it is generally more common to encode the search parameters into the query string. Take a look at the following URI, which shows this principle:

    http://localhost:7001/SimpleREST/Products?id=1234

At first blush this appears to be a trivial change, but this approach is more intuitive, especially if you are passing in multiple parameters. For example:

    http://localhost:7001/SimpleREST/Products?cat=electronics&subcat=television&mfg=sony

The above URI is obviously used to retrieve a list of televisions made by Sony. In prior versions of OSB (before 11gR1PS3), parsing the query string of a URI was more difficult than in the current release. In 11gR1PS3 it is now much easier to parse query strings, which in turn makes developing REST services in OSB even easier. In this blog entry, we will re-implement the REST-ful Products service using query strings for passing parameter information.

Let's begin with the implementation of the Products REST service. This service is implemented in the Products.proxy file of the project. Start with the overall structure of the service, as shown in the following screenshot. This is a common pattern for REST services in the Oracle Service Bus: you implement different flows for each of the HTTP verbs that you want your service to support.

Let's take a look at how the GET verb is implemented. This is the path that is taken if you were to point your browser to:

    http://localhost:7001/SimpleREST/Products?id=1234

There is an Assign action in the request pipeline that shows how to extract a query parameter. Here is the expression that is used to extract the id parameter:

    $inbound/ctx:transport/ctx:request/http:query-parameters/http:parameter[@name="id"]/@value

The Assign action stores the value in an OSB variable named id. Using this type of XPath statement you can query for any parameter by name, without regard to its order in the parameter list. The Log statement is there simply to provide some debugging info in the OSB server console. The response pipeline contains a Replace action that constructs the response document for our REST service. Most of the response data is static, but the ID field that is returned is set based upon the query parameter that was passed into the REST proxy.

Testing the REST service with a browser is very simple: just point it to the URL shown earlier. However, the browser is really only good for testing simple GET services. The OSB Test Console provides a much more robust environment for testing REST services, no matter which HTTP verb is used. Let's see how to use the Test Console to test this GET service. Open the OSB web console (http://localhost:7001/sbconsole) and log in as the administrator. Click on the Test Console icon (the little "bug") next to the Products proxy service in the SimpleREST project. This will bring up the Test Console browser window. Unlike SOAP services, we don't need to do much work in the request document because all of our request information will be encoded into the URI of the service itself. Below the Request Document section of the Test Console is the Transport section.
Expand that section and modify the query-parameters and http-method fields as shown in the next screenshot. By default, the query-parameters field will already have the enclosing tp:query-parameters tags defined; you just need to add a tp:parameter tag for each parameter you want to pass into the service. For our purposes with this particular call, you'd set the query-parameters field as follows:

    <tp:query-parameters>
        <tp:parameter name="id" value="1234" />
    </tp:query-parameters>

Now you are ready to push the Execute button to see the results of the call.

That covers the process for parsing query parameters using OSB. However, what if you have an OSB proxy service that needs to consume a REST-ful service? How do you tell OSB to pass the query parameters to the external service? In the sample code you will see a second proxy service called CallREST. It invokes the Products proxy service in exactly the same way it would invoke any REST service. Our CallREST proxy service is defined as a SOAP service. This helps to demonstrate OSB's ability to mediate between service consumers and service providers, decreasing the level of coupling between them.

If you examine the message flow for the CallREST proxy service, you'll see that it uses an Operational branch to isolate the processing logic for each operation defined by the SOAP service. We will focus on the getProductDetail branch, which calls the Products REST service using the HTTP GET verb. Expand the getProduct pipeline and the stage node that it contains. There is a single Assign statement that simply extracts the productID from the SOAP request and stores it in a local OSB variable. Nothing surprising here. The real work (and the real learning) occurs in the Route node below the pipeline.

The first thing to learn is that you need to use a Route node when calling REST services, not a Service Callout or a Publish action. That's because only the Routing action has access to the $outbound variable, especially when invoking a business service. The Routing action contains three Insert actions. The first Insert action shows how to specify the HTTP verb as a GET. The second Insert action simply inserts the query-parameters XML node into the request; this element does not exist in the request by default, so we need to add it manually. Now that we have the element defined in our outbound request, we can fill it with the parameters that we want to send to the REST service. In the following screenshot you can see how we define the id parameter based on the productID value we extracted earlier from the SOAP request document. That expression will look for the parameter that has the name id and set its value.

That's all there is to it. You now know how to take full advantage of the query parameter parsing capability of the Oracle Service Bus 11gR1PS3. Download the sample source code here: rest2_sbconfig.jar

Ubuntu and the OSB Test Console

You will get an error when you try to use the Test Console with the Oracle Service Bus on Ubuntu (and likely a number of other Linux distros as well). The error (shown below) will state that the Test Console service is not running. The fix for this problem is quite simple. Open the WebLogic Server administration console (usually running at http://localhost:7001/console). In the Domain Structure window on the left side of the console, select the Servers entry under the Environment heading, then select the Admin Server entry in the main window of the console. By default, you should be viewing the Configuration tab and the General sub-tab in the main window. Look for the Listen Address field.
By default it is blank, which means the server listens on all interfaces. For some reason Ubuntu doesn't like this, so enter a value like localhost, or the specific IP address or DNS name of your server (usually it's just localhost in development environments). Save your changes and restart the server. Your Test Console will now work correctly.
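Finally, stepping outside OSB for a moment: if you just want to exercise the query-string style of the Products service from ordinary client code, a plain HTTP GET with the parameters in the query string is all it takes. The C# sketch below is purely illustrative; the URI and the ProductsClient class are assumptions based on the examples in this entry, not part of the downloadable sample.

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    // Minimal, hypothetical client for the Products REST service described above.
    public static class ProductsClient
    {
        private static readonly HttpClient Client = new HttpClient();

        public static async Task<string> GetProductAsync(string id)
        {
            // The parameter travels in the query string, not in the relative URL path.
            string uri = "http://localhost:7001/SimpleREST/Products?id=" + Uri.EscapeDataString(id);
            HttpResponseMessage response = await Client.GetAsync(uri);
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync(); // the XML built by the proxy's Replace action
        }
    }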

    Read the article

  • Windows Azure Use Case: Fast Acquisitions

    - by BuckWoody
This is one in a series of posts on when and where to use a distributed architecture design in your organization's computing needs. You can find the main post here: http://blogs.msdn.com/b/buckwoody/archive/2011/01/18/windows-azure-and-sql-azure-use-cases.aspx

Description: Many organizations absorb, take over, or merge with other organizations. In these cases, one of the most difficult parts of the process is merging or changing the IT systems that the employees use to do their work, process payments, and even get paid. Normally this means that the two companies have disparate systems, and several approaches can be used to bring the two organizations' technology together. An organization may choose to retain both systems and manage them separately. The advantage here is speed, and keeping the profit/loss sheets separate. Another choice is to "sunset" (stop using) one organization's system, cutting over to the other either immediately or at a later date. Although a popular choice, one of the most difficult methods is to extract data and processes from one system and import them into the other. Employees on the transitioning system have to be trained on the new one, the data must be examined and cleansed, and there is inevitable disruption when this happens. Still another option is to integrate the systems. This may prove to be as much work as a transitional strategy, but may have less impact on the users or the balance sheet.

Implementation: A distributed computing paradigm can be a good strategic solution for most of these strategies. Retaining both systems is made simpler by allowing the users at the second organization immediate access to the new system, because security accounts can be created quickly inside an application; there is no need to set up a VPN or any connection other than to the Internet. Having the users stop using one system and start with the other is also simple in Windows Azure for the same reason. Extracting data to Azure holds the same limitations as an on-premises system, and may even be more problematic because of the large data transfers that might be required. In a distributed environment you pay for the data transfer, so a mixed migration strategy is not recommended. However, if the data is migrated slowly over time with a defined cutover, this can be an effective strategy. If done properly, an integration strategy works very well in a distributed computing environment like Windows Azure. If the Azure code is architected as a series of services, then endpoints can expose those services into and out of not only the Azure platform, but internal systems as well. This is a form of the Hybrid Application use-case documented here.

References:
Designing for Cloud Optimized Architecture: http://blogs.msdn.com/b/dachou/archive/2011/01/23/designing-for-cloud-optimized-architecture.aspx
5 Enterprise steps for adopting a Platform as a Service: http://blogs.msdn.com/b/davidmcg/archive/2010/12/02/5-enterprise-steps-for-adopting-a-platform-as-a-service.aspx?wa=wsignin1.0

    Read the article
