Search Results

Search found 1393 results on 56 pages for 'employee'.


  • Windows Azure: Major Updates for Mobile Backend Development

    - by ScottGu
    This week we released some great updates to Windows Azure that make it significantly easier to develop mobile applications that use the cloud. These new capabilities include:

    - Mobile Services: Custom API support
    - Mobile Services: Git Source Control support
    - Mobile Services: Node.js NPM Module support
    - Mobile Services: A .NET API via NuGet
    - Mobile Services and Web Sites: Free 20MB SQL Database Option
    - Notification Hubs: Android Broadcast Push Notification Support

    All of these improvements are now available to use immediately (note: some are still in preview). Below are more details about them.

    Mobile Services: Custom APIs, Git Source Control, and NuGet

    Windows Azure Mobile Services provides the ability to easily stand up a mobile backend that can be used to support your Windows 8, Windows Phone, iOS, Android and HTML5 client applications. Starting with the first preview, we supported the ability to easily extend your data backend logic with server-side scripting that executes as part of client-side CRUD operations against your cloud-backed data tables.

    With today's update we are extending this support even further and introducing the ability for you to also create and expose Custom APIs from your Mobile Service backend, and easily publish them to your mobile clients without having to associate them with a data table. This capability enables a whole set of new scenarios, including: working with data sources other than SQL Databases (for example, Table Services or MongoDB), brokering calls to 3rd-party APIs, integrating with Windows Azure Queues or Service Bus, working with custom non-JSON payloads (e.g. Windows Periodic Notifications), routing client requests to services back on-premises (e.g. with the new Windows Azure BizTalk Services), or simply implementing functionality that doesn't correspond to a database operation. The custom APIs can be written in server-side JavaScript (using Node.js) and can use Node's NPM packages. We will also be adding support for custom APIs written using .NET in the future.

    Creating a Custom API

    Adding a custom API to an existing Mobile Service is super easy. Using the Windows Azure Management Portal, simply click the new "API" tab within your Mobile Service, and then click the "Create a Custom API" button. Give the API whatever name you want to expose, and then choose the security permissions you'd like to apply to the HTTP methods you expose within it. You can easily lock down the HTTP verbs of your Custom API to be available to anyone, only those who have a valid application key, only authenticated users, or administrators. Mobile Services will then enforce these permissions without you having to write any code.

    When you click the OK button you'll see the new API show up in the API list. Selecting it will enable you to edit the default script that contains some placeholder functionality. Today's release enables Custom APIs to be written using Node.js (we will support writing Custom APIs in .NET as well in a future release), and the Custom API programming model follows the Node.js convention for modules, which is to export functions to handle HTTP requests. The default script exposes functionality for an HTTP POST request; to support a GET, simply change the export statement accordingly.
    Below is an example of some code for reading and returning data from Windows Azure Table Storage using the Azure Node API. After saving the changes, you can call this API from any Mobile Service client application (including Windows 8, Windows Phone, iOS, Android or HTML5 with CORS). Here is how you could invoke the API asynchronously from a Windows Store application using .NET and the new InvokeApiAsync method, and data-bind the results to a control within your XAML:

        private async void RefreshTodoItems()
        {
            var results = await App.MobileService.InvokeApiAsync<List<TodoItem>>("todos", HttpMethod.Get, parameters: null);
            ListItems.ItemsSource = new ObservableCollection<TodoItem>(results);
        }

    Integrating authentication and authorization with Custom APIs is really easy with Mobile Services. Just like data requests, custom API requests enjoy the same built-in authentication and authorization support (including integration with Microsoft ID, Google, Facebook and Twitter authentication providers), and you can easily integrate your Custom API code with other Mobile Service capabilities like push notifications, logging, SQL, etc. Check out our new tutorials to learn more about how to use the new Custom API support, and start adding Custom APIs to your app today.
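    The snippet above covers the read path. As a quick sketch of the write side – assuming the same .NET client's generic InvokeApiAsync overload that accepts a request body, and a TodoItem class with an Id property (both assumptions for illustration; check the client SDK reference for the exact signatures) – a POST to the same custom API could look like this:

        // Hedged sketch: POST a TodoItem to the "todos" custom API and read back
        // the created item. Assumes an InvokeApiAsync<TRequest, TResponse>(apiName, body)
        // overload on the Mobile Services .NET client and an Id property on TodoItem.
        private async Task AddTodoItemAsync(TodoItem item)
        {
            TodoItem created = await App.MobileService.InvokeApiAsync<TodoItem, TodoItem>("todos", item);
            System.Diagnostics.Debug.WriteLine("Created todo item with id: " + created.Id);
        }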
    Mobile Services: Git Source Control Support

    Today's Mobile Services update also enables source control integration with Git. The new source control support provides a Git repository as part of your Mobile Service, and it includes all of your existing Mobile Service scripts and permissions. You can clone that Git repository on your local machine, make changes to any of your scripts, and then easily deploy the mobile service to production using Git. This enables a really great developer workflow that works on any developer machine (Windows, Mac and Linux).

    To use the new support, navigate to the dashboard for your mobile service and select the "Set up source control" link. If this is your first time enabling Git within Windows Azure, you will be prompted to enter the credentials you want to use to access the repository. Once you configure this, you can switch to the Configure tab of your Mobile Service and you will see a Git URL for your repository. You can use this URL to clone the repository locally from your favorite command line:

        > git clone https://scottgutodo.scm.azure-mobile.net/ScottGuToDo.git

    The repository contains a service folder with several subfolders. Custom API scripts and associated permissions appear under the api folder as .js and .json files respectively (the .json files persist a JSON representation of the security settings for your endpoints). Similarly, table scripts and table permissions appear as .js and .json files, but since table scripts are separate per CRUD operation, they follow the naming convention of <tablename>.<operationname>.js. Finally, scheduled job scripts appear in the scheduler folder, and the shared folder is provided as a convenient location for you to store code shared by multiple scripts and a few miscellaneous things such as the APNS feedback script.

    Let's modify the table script todos.js so that we have slightly better error handling when an exception occurs as we query our Table service:

    todos.js:

        tableService.queryEntities(query, function(error, todoItems) {
            if (error) {
                console.error("Error querying table: " + error);
                response.send(500);
            } else {
                response.send(200, todoItems);
            }
        });

    Save these changes, and then back at the command line commit the changes and push them to the Mobile Service:

        > git add .
        > git commit -m "better error handling in todos.js"
        > git push

    Once deployment of the changes is complete, they take effect immediately, and you will also see the changes reflected in the portal. With the new Source Control feature, we're making it really easy for you to edit your mobile service locally and push changes in an atomic fashion, without sacrificing the ease of use of the Windows Azure Portal.

    Mobile Services: NPM Module Support

    The new Mobile Services source control support also allows you to add any Node.js module you need in your scripts, beyond the fixed set provided by Mobile Services. For example, you can easily switch to using Mongo instead of Windows Azure Table Storage in our example above. Set up MongoDB by either purchasing a MongoLab subscription (which provides MongoDB as a Service) via the Windows Azure Store or setting it up yourself on a Virtual Machine (either Windows or Linux). Then go to the service folder of your local Git repository and run the following command:

        > npm install mongoose

    This adds the Mongoose module to your Mobile Service scripts. After that you can reference the Mongoose module in your custom API scripts to access your Mongo database:

        var mongoose = require('mongoose');
        var schema = mongoose.Schema({ text: String, completed: Boolean });

        exports.get = function (request, response) {
            mongoose.connect('<your Mongo connection string>');
            TodoItemModel = mongoose.model('todoitem', schema);
            TodoItemModel.find(function (err, items) {
                if (err) {
                    console.log('error: ' + err);
                    return response.send(500);
                }
                response.send(200, items);
            });
        };

    Don't forget to push your changes to your mobile service once you are done:

        > git add .
        > git commit -m "Switched to use MongoLab"
        > git push

    Now our Mobile Service app is using MongoDB! Note: with today's update, usage of custom Node.js modules is limited to Custom API scripts only. We will enable it in all scripts (including data and custom CRON tasks) shortly.

    New Mobile Services NuGet package, including .NET 4.5 support

    A few months ago we announced a new pre-release version of the Mobile Services client SDK based on portable class libraries (PCL). Today, we are excited to announce that this library is now a stable .NET client SDK for Mobile Services and is no longer a pre-release package. Today's update includes full support for Windows Store, Windows Phone 7.x, and .NET 4.5, which allows developers to use Mobile Services from ASP.NET or WPF applications. You can install and use this package today via NuGet.

    Mobile Services and Web Sites: Free 20MB Database

    Starting today, every customer of Windows Azure gets one free 20MB database to use for 12 months (for both dev/test and production) with Web Sites and Mobile Services.
    When creating a Mobile Service or a Web Site, simply choose the new "Create a new Free 20MB database" option to take advantage of it. You can use this free SQL Database together with the 10 free Web Sites and 10 free Mobile Services you get with your Windows Azure subscription, or from any other Windows Azure VM or Cloud Service.

    Notification Hubs: Android Broadcast Push Notification Support

    Earlier this year, we introduced a new capability in Windows Azure for sending broadcast push notifications at high scale: Notification Hubs. In the initial preview of Notification Hubs you could use this support with both iOS and Windows devices. Today we're excited to announce new Notification Hubs support for sending push notifications to Android devices as well.

    Push notifications are a vital component of mobile applications. They are critical not only in consumer apps, where they are used to increase app engagement and usage, but also in enterprise apps, where up-to-date information increases employee responsiveness to business events. You can use Notification Hubs to send push notifications to devices from any type of app (a Mobile Service, Web Site, Cloud Service or Virtual Machine).

    Notification Hubs provide you with the following capabilities:

    - Cross-platform Push Notifications Support. Notification Hubs provide a common API to send push notifications to iOS, Android, and Windows Store devices at once. Your app can send notifications in platform-specific formats or in a platform-independent way.
    - Efficient Multicast. Notification Hubs are optimized to enable push notification broadcast to thousands or millions of devices with low latency. Your server back-end can fire one message into a Notification Hub, and millions of push notifications can automatically be delivered to your users. Devices and apps can specify a number of per-user tags when registering with a Notification Hub. These tags do not need to be pre-provisioned or disposed, and they provide a very easy way to send filtered notifications to an unlimited number of users/devices with a single API call.
    - Extreme Scale. Notification Hubs enable you to reach millions of devices without having to re-architect or shard your application. The pub/sub routing mechanism allows you to broadcast notifications in a super-efficient way, which makes it incredibly easy to route and deliver notification messages to millions of users without building your own routing infrastructure.
    - Usable from any Backend App. Notification Hubs can be easily integrated into any back-end server app, whether it is a Mobile Service, a Web Site, a Cloud Service or an IaaS VM.

    It is easy to configure Notification Hubs to send push notifications to Android.
    Create a new Notification Hub within the Windows Azure Management Portal (New -> App Services -> Service Bus -> Notification Hub). Then register for Google Cloud Messaging using https://code.google.com/apis/console, obtain your API key, and paste that key on the Configure tab of your Notification Hub management page under the Google Cloud Messaging settings.

    Then just add code to the OnCreate method of your Android app's MainActivity class to register the device with Notification Hubs:

        gcm = GoogleCloudMessaging.getInstance(this);
        String connectionString = "<your listen access connection string>";
        hub = new NotificationHub("<your notification hub name>", connectionString, this);
        String regid = gcm.register(SENDER_ID);
        hub.register(regid, "myTag");

    Now you can broadcast notifications from your .NET backend (or Node, Java, or PHP) to any Windows Store, Android, or iOS device registered for the "myTag" tag via a single API call (you can literally broadcast messages to millions of registered clients with just one API call):

        var hubClient = NotificationHubClient.CreateClientFromConnectionString(
            "<your connection string with full access>",
            "<your notification hub name>");
        hubClient.SendGcmNativeNotification("{ 'data' : { 'msg' : 'Hello from Windows Azure!' } }", "myTag");

    Notification Hubs provide an extremely scalable, cross-platform push notification infrastructure that enables you to efficiently route push notification messages to millions of mobile users and devices. It will make your push notification logic significantly simpler and more scalable, and allow you to build even better apps with it. Learn more about Notification Hubs on MSDN.

    Summary

    The above features are now live and available to start using immediately (note: some of the services are still in preview). If you don't already have a Windows Azure account, you can sign up for a free trial and start using them today. Visit the Windows Azure Developer Center to learn more about how to build apps with it.

    Hope this helps,

    Scott

    P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


  • Selling Federal Enterprise Architecture (EA)

    - by TedMcLaughlan
    Selling Federal Enterprise Architecture: a taxonomy of subject areas from which to develop a prioritized marketing and communications plan to evangelize EA activities within and among US Federal Government organizations and constituents. Any and all feedback is appreciated, particularly in developing and extending this discussion as a tool for use – more information and details are also available.

    "Selling" the discipline of Enterprise Architecture (EA) in the Federal Government (particularly in non-DoD agencies) is difficult, notwithstanding the general availability and use of the Federal Enterprise Architecture Framework (FEAF) for some time now, and the relatively mature use of the reference models in the OMB Capital Planning and Investment Control (CPIC) cycles. EA in the Federal Government also tends to be a very esoteric and hard-to-decipher conversation – early apologies to those who agree to continue reading this somewhat lengthy article.

    Alignment to the FEAF and OMB compliance mandates is long underway across the Federal Departments and Agencies (and visible via tools like PortfolioStat and ITDashboard.gov), but there is still a gap between the top-down compliance directives and enablement programs, and the bottom-up awareness and effective use of EA for either IT investment management or actual mission effectiveness. "EA isn't getting deep enough penetration into programs, components, sub-agencies, etc.", verified a panelist at the most recent EA Government Conference in DC.

    Newer guidance from OMB may be especially difficult to handle where bottom-up input can't be accurately aligned, analyzed and reported via standardized EA discipline at the Agency level – for example in addressing the new (for FY13) Exhibit 53D "Agency IT Reductions and Reinvestments" and the information required for "Cloud Computing Alternatives Evaluation" (supporting the new Exhibit 53C, "Agency Cloud Computing Portfolio"). Therefore, EA must be "sold" directly to the communities that matter, from a coordinated, proactive messaging perspective that takes BOTH the Program-level value drivers AND the broader Agency mission and IT maturity context into consideration.

    Selling EA means persuading others to take additional time and possibly assign additional resources, for a mix of direct and indirect benefits – many of which aren't likely to be realized in the short term. This means there's probably little current, allocated budget to work with; ergo the challenge of trying to sell an "unfunded mandate". Also, the concept of "Enterprise" in large Departments like Homeland Security tends to cross all kinds of organizational boundaries – as Richard Spires recently commented: "...organizational boundaries still trump functional similarities. Most people understand what we're trying to do internally, and at a high level they get it. The problem, of course, is when you get down to them and their system and the fact that you're going to be touching them...there's always that fear factor."

    It is quite clear to the Federal IT Investment community that for EA to meet its objective, understandable, relevant value must be measured and reported using a repeatable method – as described by GAO's recent report "Enterprise Architecture Value Needs To Be Measured and Reported". What's not clear is the method or guidance to sell this value.
the "EAMMF", does not include words like "sell", "persuade", "market", etc., except in reference ("within Core Element 19: Organization business owner and CXO representatives are actively engaged in architecture development") to a brief section in the CIO Council's 2001 "Practical Guide to Federal Enterprise Architecture", entitled "3.3.1. Develop an EA Marketing Strategy and Communications Plan." Furthermore, Core Element 19 of the EAMMF is advised to be applied in "Stage 3: Developing Initial EA Versions". This kind of EA sales campaign truly should start much earlier in the maturity progress, i.e. in Stages 0 or 1. So, what are the understandable, relevant benefits (or value) to sell, that can find an agreeable, participatory audience, and can pave the way towards success of a longer-term, funded set of EA mechanisms that can be methodically measured and reported? Pragmatic benefits from a useful EA that can help overcome the fear of change? And how should they be sold? Following is a brief taxonomy (it's a taxonomy, to help organize SME support) of benefit-related subjects that might make the most sense, in creating the messages and organizing an initial "engagement plan" for evangelizing EA "from within". An EA "Sales Taxonomy" of sorts. We're not boiling the ocean here; the subjects that are included are ones that currently appear to be urgently relevant to the current Federal IT Investment landscape. Note that successful dialogue in these topics is directly usable as input or guidance for actually developing early-stage, "Fit-for-Purpose" (a DoDAF term) Enterprise Architecture artifacts, as prescribed by common methods found in most EA methodologies, including FEAF, TOGAF, DoDAF and our own Oracle Enterprise Architecture Framework (OEAF). The taxonomy below is organized by (1) Target Community, (2) Benefit or Value, and (3) EA Program Facet - as in: "Let's talk to (1: Community Member) about how and why (3: EA Facet) the EA program can help with (2: Benefit/Value)". Once the initial discussion targets and subjects are approved (that can be measured and reported), a "marketing and communications plan" can be created. A working example follows the Taxonomy. Enterprise Architecture Sales Taxonomy Draft, Summary Version 1. Community 1.1. Budgeted Programs or Portfolios Communities of Purpose (CoPR) 1.1.1. Program/System Owners (Senior Execs) Creating or Executing Acquisition Plans 1.1.2. Program/System Owners Facing Strategic Change 1.1.2.1. Mandated 1.1.2.2. Expected/Anticipated 1.1.3. Program Managers - Creating Employee Performance Plans 1.1.4. CO/COTRs – Creating Contractor Performance Plans, or evaluating Value Engineering Change Proposals (VECP) 1.2. Governance & Communications Communities of Practice (CoP) 1.2.1. Policy Owners 1.2.1.1. OCFO 1.2.1.1.1. Budget/Procurement Office 1.2.1.1.2. Strategic Planning 1.2.1.2. OCIO 1.2.1.2.1. IT Management 1.2.1.2.2. IT Operations 1.2.1.2.3. Information Assurance (Cyber Security) 1.2.1.2.4. IT Innovation 1.2.1.3. Information-Sharing/ Process Collaboration (i.e. policies and procedures regarding Partners, Agreements) 1.2.2. Governing IT Council/SME Peers (i.e. an "Architects Council") 1.2.2.1. Enterprise Architects (assumes others exist; also assumes EA participants aren't buried solely within the CIO shop) 1.2.2.2. Domain, Enclave, Segment Architects – i.e. the right affinity group for a "shared services" EA structure (per the EAMMF), which may be classified as Federated, Segmented, Service-Oriented, or Extended 1.2.2.3. 
             1.2.2.3. External Oversight/Constraints
                1.2.2.3.1. GAO/OIG & Legal
                1.2.2.3.2. Industry Standards
                1.2.2.3.3. Official public notification, response
          1.2.3. Mission Constituents – Participant & Analyst Community of Interest (CoI)
             1.2.3.1. Mission Operators/Users
             1.2.3.2. Public Constituents
             1.2.3.3. Industry Advisory Groups, Stakeholders
             1.2.3.4. Media

    2. Benefit/Value (Note: the actual benefits may not be discretely attributable to EA alone; EA is a very collaborative, cross-cutting discipline.)
       2.1. Program Costs – EA enables sound decisions regarding...
          2.1.1. Cost Avoidance – a TCO theme
          2.1.2. Sequencing – alignment of capability delivery
          2.1.3. Budget Instability – a Federal reality
       2.2. Investment Capital – EA illuminates new investment resources via...
          2.2.1. Value Engineering – contractor-driven cost savings on existing budgets, direct or collateral
          2.2.2. Reuse – reuse of investments between programs can result in savings, chargeback models; avoiding duplication
          2.2.3. License Refactoring – IT license & support models may not reflect actual or intended usage
       2.3. Contextual Knowledge – EA enables informed decisions by revealing...
          2.3.1. Common Operating Picture (COP) – i.e. cross-program impacts and synergy, relative to context
          2.3.2. Expertise & Skill – who truly should be involved in architectural decisions, both business and IT
          2.3.3. Influence – the impact of politics and relationships can be examined
          2.3.4. Disruptive Technologies – new technologies may reduce costs or mitigate risk in unanticipated ways
          2.3.5. What-If Scenarios – can become much more refined, current, verifiable; basis for Target Architectures
       2.4. Mission Performance – EA enables beneficial decision results regarding...
          2.4.1. IT Performance and Optimization – towards 100% effective, available resource utilization
          2.4.2. IT Stability – towards 100%, real-time uptime
          2.4.3. Agility – responding to rapid changes in mission
          2.4.4. Outcomes – measures of mission success, KPIs – vs. only "Outputs"
          2.4.5. Constraints – appropriate response to constraints
          2.4.6. Personnel Performance – better line-of-sight through performance plans to mission outcome
       2.5. Mission Risk Mitigation – EA mitigates decision risks in terms of...
          2.5.1. Compliance – all the right boxes are checked
          2.5.2. Dependencies – cross-agency, segment, government
          2.5.3. Transparency – risks, impact and resource utilization are illuminated quickly, comprehensively
          2.5.4. Threats and Vulnerabilities – current, realistic awareness and profiles
          2.5.5. Consequences – realization of risk can be mapped as a series of consequences, from earlier decisions or new decisions required for current issues
             2.5.5.1. Unanticipated – illuminating signals of future or non-symmetric risk; helping to "future-proof"
             2.5.5.2. Anticipated – discovering the level of impact that matters

    3. EA Program Facet (What parts of the EA can and should be communicated, using business or mission terms?)
       3.1. Architecture Models – the visual tools to be created and used
          3.1.1. Operating Architecture – the Business Operating Model/Architecture elements of the EA truly drive all other elements, plus expose communication channels
          3.1.2. Use Of – how can the EA models be used, and how are they populated, from a reasonable, pragmatic yet compliant perspective? What are the core/minimal models required? What's the relationship of these models with existing system models?
          3.1.3. Scope – what level of granularity within the models, and what level of abstraction across the models, is likely to be most effective and useful?
       3.2. Traceability – the maturity, status, completeness of the tools
          3.2.1. Status – what in fact is the degree of maturity across the integrated EA model and other relevant governance models, and who may already be benefiting from it?
          3.2.2. Visibility – how does the EA visibly and effectively prove IT investment performance goals are being reached, with positive mission outcome?
       3.3. Governance – what's the interaction, participation method; how are the tools used?
          3.3.1. Contributions – how is the EA program informed, how does it accept submissions, how does it collect data? Who are the experts?
          3.3.2. Review – how is the EA validated, against what criteria?

    Taxonomy Usage Example:

    1. To speak with:
       a. ...a particular set of System Owners Facing Strategic Change, via mandate (like the "Cloud First" mandate); about...
       b. ...how the EA program's visible and easily accessible Infrastructure Reference Model (i.e. "IRM" or "TRM"), if updated more completely with current system data, can...
       c. ...help shed light on ways to mitigate risks and avoid future costs associated with NOT leveraging potentially-available shared services across the enterprise...
    2. ...the following Marketing & Communications (Sales) Plan can be constructed:
       a. Create an easy-to-read "Consequence Model" that illustrates how adoption of a cloud capability (like elastic operational storage) can enable rapid and durable compliance with the mandate – using EA traceability. Traceability might be from the IRM to the ARM (that identifies reusable services invoking the elastic storage), and then to the PRM with performance measures (such as % utilization of purchased storage allocation) included in the OMB Exhibits; and
       b. Schedule a meeting with the Program Owners, timed during their Acquisition Strategy meetings in response to the mandate, to use the "Consequence Model" for advising them to organize a rapid and relevant RFI solicitation for this cloud capability (regarding alternatives for sourcing elastic operational storage); and
       c. Schedule a series of short "Discovery" meetings with the system architecture leads (as agreed by the Program Owners), to further populate/validate the "As-Is" models and frame the "To-Be" models (via scenarios), to better inform the RFI, obtain the best feedback from the vendor community, and provide potential value for and avoid impact to all other programs and systems.

    -- end example --

    Note that communications with the intended audience should take a page out of the standard "Search Engine Optimization" (SEO) playbook, using keywords and phrases relating to "value" and "outcome" vs. "compliance" and "output". Searches in email boxes and in internal and external search engines for phrases like "cost avoidance strategies", "mission performance metrics" and "innovation funding" should yield messages and content from the EA team.

    This targeted, informed, practical sales approach should result in additional buy-in and participation, additional EA information contribution and model validation, development of more SMEs, and quick "proof points" (with real-life testing) to bolster the case for EA. The proof point here is a successful, timely procurement that satisfies not only the external mandate and external oversight review, but also meets internal EA compliance/conformance goals and is therefore more transparently useful across the community.
In short, if sold effectively, the EA will perform and be recognized. EA won’t therefore be used only for compliance, but also (according to a validated, stated purpose) to directly influence decisions and outcomes. The opinions, views and analysis expressed in this document are those of the author and do not necessarily reflect the views of Oracle.


  • The name 'GridView1' does not exist in the current context

    - by sameer
    Hi all, I have two files named TimeSheet.aspx.cs and TimeSheet.aspx; the code for both files is given below for your reference. When I build the application I get the error "The name 'GridView1' does not exist in the current context", even though I have a control with the id GridView1 and I have added runat="server" as well. I'm not able to figure out what is causing this issue. Can anyone figure out what's happening here? Thanks & Regards,

    ======================================= TimeSheet.aspx.cs =======================================

        #region Using directives
        using System;
        using System.Data;
        using System.Configuration;
        using System.Web;
        using System.Web.Security;
        using System.Web.UI;
        using System.Web.UI.WebControls;
        using System.Web.UI.WebControls.WebParts;
        using System.Web.UI.HtmlControls;
        using TSMS.Web.UI;
        #endregion

        public partial class TimeSheets : Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                FormUtil.RedirectAfterUpdate(GridView1, "TimeSheets.aspx?page={0}");
                FormUtil.SetPageIndex(GridView1, "page");
                FormUtil.SetDefaultButton((Button)GridViewSearchPanel1.FindControl("cmdSearch"));
            }

            protected void GridView1_SelectedIndexChanged(object sender, EventArgs e)
            {
                string urlParams = string.Format("TimeSheetId={0}", GridView1.SelectedDataKey.Values[0]);
                Response.Redirect("TimeSheetsEdit.aspx?" + urlParams, true);
            }

            protected void GridView1_RowCommand(object sender, GridViewCommandEventArgs e)
            {
            }
        }

    ======================================= TimeSheet.aspx =======================================

        <%@ Page Language="C#" Theme="Default" MasterPageFile="~/MasterPages/admin.master" AutoEventWireup="true"
            CodeFile="TimeSheets.aspx.cs" Inherits="TimeSheets" Title="TimeSheets List" %>
        <asp:Content ID="Content2" ContentPlaceHolderID="ContentPlaceHolder2" Runat="Server">Time Sheets List</asp:Content>
        <asp:Content ID="Content1" ContentPlaceHolderID="ContentPlaceHolder1" Runat="Server">
            <data:GridViewSearchPanel ID="GridViewSearchPanel1" runat="server" GridViewControlID="GridView1" PersistenceMethod="Session" />
            <br />
            <data:EntityGridView ID="GridView1" runat="server" AutoGenerateColumns="False"
                OnSelectedIndexChanged="GridView1_SelectedIndexChanged" DataSourceID="TimeSheetsDataSource"
                DataKeyNames="TimeSheetId" AllowMultiColumnSorting="false" DefaultSortColumnName=""
                DefaultSortDirection="Ascending" ExcelExportFileName="Export_TimeSheets.xls"
                onrowcommand="GridView1_RowCommand">
                <Columns>
                    <asp:CommandField ShowSelectButton="True" ShowEditButton="True" />
                    <asp:BoundField DataField="TimeSheetId" HeaderText="Time Sheet Id" SortExpression="[TimeSheetID]" ReadOnly="True" />
                    <asp:BoundField DataField="TimeSheetTitle" HeaderText="Time Sheet Title" SortExpression="[TimeSheetTitle]" />
                    <asp:BoundField DataField="StartDate" DataFormatString="{0:d}" HtmlEncode="False" HeaderText="Start Date" SortExpression="[StartDate]" />
                    <asp:BoundField DataField="EndDate" DataFormatString="{0:d}" HtmlEncode="False" HeaderText="End Date" SortExpression="[EndDate]" />
                    <asp:BoundField DataField="DateOfCreation" DataFormatString="{0:d}" HtmlEncode="False" HeaderText="Date Of Creation" SortExpression="[DateOfCreation]" />
                    <data:BoundRadioButtonField DataField="Locked" HeaderText="Locked" SortExpression="[Locked]" />
                    <asp:BoundField DataField="ReviewedBy" HeaderText="Reviewed By" SortExpression="[ReviewedBy]" />
                    <data:HyperLinkField HeaderText="Employee Id" DataNavigateUrlFormatString="EmployeesEdit.aspx?EmployeeId={0}"
                        DataNavigateUrlFields="EmployeeId" DataContainer="EmployeeIdSource"
DataTextField="LastName" /> </Columns> <EmptyDataTemplate> <b>No TimeSheets Found!</b> </EmptyDataTemplate> </data:EntityGridView> <asp:GridView ID="GridView2" runat="server"> </asp:GridView> <br /> <asp:Button runat="server" ID="btnTimeSheets" OnClientClick="javascript:location.href='TimeSheetsEdit.aspx'; return false;" Text="Add New"></asp:Button> <data:TimeSheetsDataSource ID="TimeSheetsDataSource" runat="server" SelectMethod="GetPaged" EnablePaging="True" EnableSorting="True" EnableDeepLoad="True" > <DeepLoadProperties Method="IncludeChildren" Recursive="False"> <Types> <data:TimeSheetsProperty Name="Employees"/> <%--<data:TimeSheetsProperty Name="TimeSheetDetailsCollection" />--%> </Types> </DeepLoadProperties> <Parameters> <data:CustomParameter Name="WhereClause" Value="" ConvertEmptyStringToNull="false" /> <data:CustomParameter Name="OrderByClause" Value="" ConvertEmptyStringToNull="false" /> <asp:ControlParameter Name="PageIndex" ControlID="GridView1" PropertyName="PageIndex" Type="Int32" /> <asp:ControlParameter Name="PageSize" ControlID="GridView1" PropertyName="PageSize" Type="Int32" /> <data:CustomParameter Name="RecordCount" Value="0" Type="Int32" /> </Parameters> </data:TimeSheetsDataSource> </asp:Content>


  • Crash when trying to get NSManagedObject from NSFetchedResultsController after 25 objects?

    - by Jeremy
    Hey everyone, I'm relatively new to Core Data on iOS, but I think I've been getting better with it. I've been experiencing a bizarre crash, however, in one of my applications and have not been able to figure it out. I have approximately 40 objects in Core Data, presented in a UITableView. When tapping on a cell, a UIActionSheet appears, presenting the user with options related to the cell that was selected. So that I can reference the selected object, I declare an NSIndexPath in my header called "lastSelection" and do the following when the UIActionSheet is presented:

        // Each cell has a tag based on its row number (i.e. first row has tag 0)
        lastSelection = [NSIndexPath indexPathForRow:[sender tag] inSection:0];
        NSManagedObject *managedObject = [self.fetchedResultsController objectAtIndexPath:lastSelection];
        BOOL onDuty = [[managedObject valueForKey:@"onDuty"] boolValue];

        UIActionSheet *actionSheet = [[UIActionSheet alloc] initWithTitle:@"Status"
                                                                 delegate:self
                                                        cancelButtonTitle:nil
                                                   destructiveButtonTitle:nil
                                                        otherButtonTitles:nil];
        if (onDuty) {
            [actionSheet addButtonWithTitle:@"Off Duty"];
        } else {
            [actionSheet addButtonWithTitle:@"On Duty"];
        }
        actionSheet.actionSheetStyle = UIActionSheetStyleBlackOpaque;

        // Override the typical UIActionSheet behavior by presenting it overlapping the
        // sender's frame. This makes it more clear which cell is selected.
        CGRect senderFrame = [sender frame];
        CGPoint point = CGPointMake(senderFrame.origin.x + (senderFrame.size.width / 2),
                                    senderFrame.origin.y + (senderFrame.size.height / 2));
        CGRect popoverRect = CGRectMake(point.x, point.y, 1, 1);
        [actionSheet showFromRect:popoverRect inView:[sender superview] animated:NO];
        [actionSheet release];

    When the UIActionSheet is dismissed with a button, the following code is called:

        - (void)actionSheet:(UIActionSheet *)actionSheet willDismissWithButtonIndex:(NSInteger)buttonIndex {
            // Set status based on UIActionSheet button pressed
            if (buttonIndex == -1) {
                return;
            }
            NSManagedObject *managedObject = [self.fetchedResultsController objectAtIndexPath:lastSelection];
            if ([actionSheet.title isEqualToString:@"Status"]) {
                if ([[actionSheet buttonTitleAtIndex:buttonIndex] isEqualToString:@"On Duty"]) {
                    [managedObject setValue:[NSNumber numberWithBool:YES] forKey:@"onDuty"];
                    [managedObject setValue:@"onDuty" forKey:@"status"];
                } else {
                    [managedObject setValue:[NSNumber numberWithBool:NO] forKey:@"onDuty"];
                    [managedObject setValue:@"offDuty" forKey:@"status"];
                }
            }
            NSError *error;
            [self.managedObjectContext save:&error];
            [tableView reloadData];
        }

    This might not be the most efficient code (sorry, I'm new!), but it does work – that is, for the first 25 items in the list. Selecting the 26th item or beyond, the UIActionSheet will appear, but if it is dismissed with a button, I get a variety of errors, including any one of the following:

        [__NSCFArray section]: unrecognized selector sent to instance 0x4c6bf90
        Program received signal: "EXC_BAD_ACCESS"
        [_NSObjectID_48_0 section]: unrecognized selector sent to instance 0x4c54710
        [__NSArrayM section]: unrecognized selector sent to instance 0x4c619a0
        [NSComparisonPredicate section]: unrecognized selector sent to instance 0x6088790
        [NSKeyPathExpression section]: unrecognized selector sent to instance 0x4c18950

    If I comment out the line

        NSManagedObject *managedObject = [self.fetchedResultsController objectAtIndexPath:lastSelection];

    it doesn't crash anymore, so I believe it has something to do with that. Can anyone offer any insight?
    Please let me know if I need to include any other information. Thanks!

    EDIT: Interestingly, my fetchedResultsController code returns a different object every time. Is this expected, or could this be a cause of my issue? The code looks like this:

        - (NSFetchedResultsController *)fetchedResultsController {
            /* Set up the fetched results controller. */
            // Create the fetch request for the entity.
            NSFetchRequest *fetchRequest = [[NSFetchRequest alloc] init];

            // Edit the entity name as appropriate.
            NSEntityDescription *entity = [NSEntityDescription entityForName:@"Employee"
                                                      inManagedObjectContext:self.managedObjectContext];
            [fetchRequest setEntity:entity];

            // Set the batch size to a suitable number.
            [fetchRequest setFetchBatchSize:80];

            // Edit the sort key as appropriate.
            NSString *sortKey;
            BOOL ascending;
            if (sortControl.selectedSegmentIndex == 0) {
                sortKey = @"startTime";
                ascending = YES;
            } else if (sortControl.selectedSegmentIndex == 1) {
                sortKey = @"name";
                ascending = YES;
            } else {
                sortKey = @"onDuty";
                ascending = NO;
            }
            NSSortDescriptor *sortDescriptor = [[NSSortDescriptor alloc] initWithKey:sortKey ascending:ascending];
            NSArray *sortDescriptors = [[NSArray alloc] initWithObjects:sortDescriptor, nil];
            [fetchRequest setSortDescriptors:sortDescriptors];

            // Edit the section name key path and cache name if appropriate.
            NSFetchedResultsController *aFetchedResultsController =
                [[NSFetchedResultsController alloc] initWithFetchRequest:fetchRequest
                                                    managedObjectContext:self.managedObjectContext
                                                      sectionNameKeyPath:nil
                                                               cacheName:@"Root"];
            aFetchedResultsController.delegate = self;
            self.fetchedResultsController = aFetchedResultsController;

            [aFetchedResultsController release];
            [fetchRequest release];
            [sortDescriptor release];
            [sortDescriptors release];

            NSError *error = nil;
            if (![fetchedResultsController_ performFetch:&error]) {
                /* Replace this implementation with code to handle the error appropriately.
                   abort() causes the application to generate a crash log and terminate.
                   You should not use this function in a shipping application, although it may
                   be useful during development. If it is not possible to recover from the
                   error, display an alert panel that instructs the user to quit the
                   application by pressing the Home button. */
                //NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
                abort();
            }
            return fetchedResultsController_;
        }

    This happens when I set a breakpoint:

        (gdb) po [self fetchedResultsController]
        <NSFetchedResultsController: 0x61567c0>
        (gdb) po [self fetchedResultsController]
        <NSFetchedResultsController: 0x4c83630>


  • inserting null values in datetime column and integer column

    - by reggie
    I am using C# and developing a WinForms application. I have a Project class which holds the project attributes. The Project class constructor is called as follows:

        newProject = new Project(
            GCD_ID.IsNull() ? (int?)null : Convert.ToInt32(GCD_ID),
            txt_Proj_Desc.Text,
            txt_Prop_Name.Text,
            ST.ID.ToString().IsNull() ? null : ST.ID.ToString(),
            cmbCentre.Text,
            SEC.ID.ToString().IsNull() ? null : SEC.ID.ToString(),
            cmbZone.Text,
            FD.ID.ToString().IsNull() ? null : FD.ID.ToString(),
            DT.ID.ToString().IsNull() ? null : DT.ID.ToString(),
            OP.ID.ToString().IsNull() ? null : OP.ID.ToString(),
            T.ID.ToString().IsNull() ? null : T.ID.ToString(),
            CKV.ID.ToString().IsNull() ? null : CKV.ID.ToString(),
            STAT.ID.ToString().IsNull() ? null : STAT.ID.ToString(),
            MW.IsNull() ? (Double?)null : Convert.ToDouble(MW),
            txt_Subject.Text,
            Ip_Num.IsNull() ? (int?)null : Convert.ToInt32(Ip_Num),
            H1N_ID.IsNull() ? (int?)null : Convert.ToInt32(H1N_ID),
            NOMS_Slip_Num.IsNull() ? (int?)null : Convert.ToInt32(NOMS_Slip_Num),
            NMS_Updated.IsNull() ? (DateTime?)null : Convert.ToDateTime(NMS_Updated),
            Received_Date.IsNull() ? (DateTime?)null : Convert.ToDateTime(Received_Date),
            Actual_IS_Date.IsNull() ? (DateTime?)null : Convert.ToDateTime(Actual_IS_Date),
            Scheduled_IS_Date.IsNull() ? (DateTime?)null : Convert.ToDateTime(Scheduled_IS_Date),
            UpSt.ID.ToString().IsNull() ? null : UpSt.ID.ToString(),
            UpFd.ID.ToString().IsNull() ? null : UpFd.ID.ToString(),
            txtHVCircuit.Text,
            cmbbxSIA.Text);

    My problem is that I cannot insert values into the database when the DateTime variables and the integer variables are null. All this data is assigned to the variables from textboxes on the form. Below is the database function which takes in all the variables and inserts them into the database:

        public static void createNewProject(int? GCD_ID, string Project_Desc, string Proponent_Name, int Station_ID,
            string OpCentre, int Sector_ID, string PLZone, int Feeder, int DxTx_ID, int OpControl_ID, int Type_ID,
            int ConnKV_ID, int Status_ID, double? MW, string Subject, int? Ip_Num, int? H1N_ID, int? NOMS_Slip_Num,
            DateTime? NMS_Updated, DateTime? Received_Date, DateTime? Actual_IS_Date, DateTime? Scheduled_IS_Date,
            int UP_Station_ID, int UP_Feeder_ID, string @HV_Circuit, string SIA_Required)
        {
            SqlConnection conn = null;
            try
            {
                // Takes in all the employee details to be added to the database.
                conn = new SqlConnection(databaseConnectionString);
                conn.Open();
                SqlCommand cmd = new SqlCommand("createNewProject", conn);
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.Add(new SqlParameter("GCD_ID", GCD_ID));
                cmd.Parameters.Add(new SqlParameter("Project_Desc", MiscFunctions.Capitalize(Project_Desc)));
                cmd.Parameters.Add(new SqlParameter("Proponent_Name", MiscFunctions.Capitalize(Proponent_Name)));
                cmd.Parameters.Add(new SqlParameter("Station_ID", Station_ID));
                cmd.Parameters.Add(new SqlParameter("OpCentre", OpCentre));
                cmd.Parameters.Add(new SqlParameter("Sector_ID", Sector_ID));
                cmd.Parameters.Add(new SqlParameter("PLZone", PLZone));
                cmd.Parameters.Add(new SqlParameter("Feeder", Feeder));
                cmd.Parameters.Add(new SqlParameter("DxTx_ID", DxTx_ID));
                cmd.Parameters.Add(new SqlParameter("OpControl_ID", OpControl_ID));
                cmd.Parameters.Add(new SqlParameter("Type_ID", Type_ID));
                cmd.Parameters.Add(new SqlParameter("ConnKV_ID", ConnKV_ID));
                cmd.Parameters.Add(new SqlParameter("Status_ID", Status_ID));
                cmd.Parameters.Add(new SqlParameter("MW", MW));
                cmd.Parameters.Add(new SqlParameter("Subject", Subject));
                cmd.Parameters.Add(new SqlParameter("Ip_Num", Ip_Num));
                cmd.Parameters.Add(new SqlParameter("H1N_ID", H1N_ID));
                cmd.Parameters.Add(new SqlParameter("NOMS_Slip_Num", NOMS_Slip_Num));
                cmd.Parameters.Add(new SqlParameter("NMS_Updated", NMS_Updated));
                cmd.Parameters.Add(new SqlParameter("Received_Date", Received_Date));
                cmd.Parameters.Add(new SqlParameter("Actual_IS_Date", Actual_IS_Date));
                cmd.Parameters.Add(new SqlParameter("Scheduled_IS_Date", Scheduled_IS_Date));
                cmd.Parameters.Add(new SqlParameter("UP_Station_ID", UP_Station_ID));
                cmd.Parameters.Add(new SqlParameter("UP_Feeder_ID", UP_Feeder_ID));
                cmd.Parameters.Add(new SqlParameter("HV_Circuit", HV_Circuit));
                cmd.Parameters.Add(new SqlParameter("SIA_Required", SIA_Required));
                cmd.ExecuteNonQuery();
            }
            catch (Exception e) // returns if error incurred.
            {
                MessageBox.Show("Error occured in createNewProject" + Environment.NewLine + e.ToString());
            }
            finally
            {
                if (conn != null)
                {
                    conn.Close();
                }
            }
        }

    My question is: how do I insert null values into the database? Please help.
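    A note on the likely sticking point, for readers of this listing: ADO.NET does not send a parameter whose Value is a null CLR reference – a SQL NULL has to be passed explicitly as DBNull.Value. A minimal sketch of that mapping (the AddParam helper name is made up for illustration):

        // Minimal sketch: map a null CLR value (including an empty Nullable<T>,
        // which boxes to null) to DBNull.Value so a SQL NULL is actually sent.
        // "AddParam" is a hypothetical helper name.
        static void AddParam(SqlCommand cmd, string name, object value)
        {
            cmd.Parameters.Add(new SqlParameter(name, value ?? DBNull.Value));
        }

        // Example usage with the nullable arguments above:
        //   AddParam(cmd, "NMS_Updated", NMS_Updated);  // DateTime?
        //   AddParam(cmd, "Ip_Num", Ip_Num);            // int?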


  • jQuery toggle function not working as expected

    - by Bunny Rabbit
        <!DOCTYPE html>
        <html lang="en">
        <head>
            <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
            <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.3/jquery.min.js" type="text/javascript"></script>
            <script type="text/javascript">
                $(function(){
                    $('div#menu div')
                        .bind('mouseover', function(event){ popHelp(this); })
                        .bind('mouseout', function(event){ clearHelp(); })
                        .toggle(
                            function(event){ $('#menu div a').not(this).css('opacity', 1); $(event.target).css('opacity', 0.4); },
                            function(event){ $(event.target).css('opacity', 1); }
                        )
                    $('div.item').not('#mainPage')
                        .hide()
                    $('#customerManagement').click(function(event){
                        $('.item').hide('slow');
                        $('#customerManagementCon').toggle('slow');
                        return false;
                    })
                    $('#userManagement').click(function(){
                        $('.item').hide('slow');
                        $('#userManagementCon').toggle('slow');
                        return false;
                    })
                    $('#storageManagement').click(function(){
                        $('.item').hide('slow');
                        $('#storageManagementCon').toggle('slow');
                        return false;
                    })
                    $('#searchManagement').click(function(){
                        $('.item').hide('slow');
                        $('#searchManagementCon').toggle('slow');
                        return false;
                    })
                    $('#signOff').click(function(){
                        $('.item').hide('slow');
                        $('#signOffCon').toggle('slow');
                        return false;
                    })
                });

                function popHelp(ref){
                    var text;
                    if (ref.id == "customerManagement")
                        text = "click here for customer management";
                    else if (ref.id == "userManagement")
                        text = "click here for user management";
                    else if (ref.id == "storageManagement")
                        text = "click here for storage management";
                    else if (ref.id == "searchManagement")
                        text = "search management";
                    else if (ref.id == "signOff")
                        text = "click here to sign off";
                    $('#helpConsole').append('<div class="help">' + text + '<div>');
                }

                function clearHelp(){
                    $('#helpConsole > div').remove();
                }
            </script>
            <style type="text/css">
                #helpConsole {
                    background-color: Aqua;
                    margin-left: 500px;
                    width: 300px;
                    height: 100px;
                    outline-width: medium;
                }
            </style>
        </head>
        <body>
            <div id="menu">
                <table border="2">
                    <thead>
                        <tr>
                            <th colspan="5">Welcome $Employee Name</th>
                        </tr>
                    </thead>
                    <tbody>
                        <tr>
                            <td><div id="customerManagement" class="menuItem"><a>Customer Management</a></div></td>
                            <td><div id="userManagement" class="menuItem"><a>User Management</a></div></td>
                            <td><div id="storageManagement" class="menuItem"><a>Storage Management</a></div></td>
                            <td><div id="searchManagement" class="menuItem"><a>Search Management</a></div></td>
                            <td><div id="signOff" class="menuItem"><a>Sign Off</a></div></td>
                        </tr>
                    </tbody>
                </table>
            </div>
            <div id="helpConsole"><h3>help</h3></div>
            <div id="mainPage" class="item"><h1>Storage Service</h1></div>
            <div id="customerManagementCon" class="item"><h3>Customer Management</h3></div>
            <div id="userManagementCon" class="item"><h3>User Management</h3></div>
            <div id="storageManagementCon" class="item"><h3>Storage Management</h3></div>
            <div id="searchManagementCon" class="item"><h3>Search Mangement</h3></div>
            <div id="signOffCon" class="item"><h3>Sign Off</h3></div>
            <div id="menuItemCon" class="item">menuItem</div>
        </body>

    The toggle function in this code is not working as expected: it shows the #menu items when clicked, but it doesn't always hide them.


  • XSLT - group response as per month and year combination

    - by alisha
    I want to write XSLT to produce the response below for the input XML. Thanks in advance. I want to group the output so that each month/year combination is not repeated for each of the employee details.

    Input XML:

        <resultset>
            <row>
                <column>
                    <name>Month</name>
                    <value>2</value>
                </column>
                <column>
                    <name>Year</name>
                    <value>2010</value>
                </column>
                <column>
                    <name>EmpName</name>
                    <value>Anu</value>
                </column>
                <column>
                    <name>Age</name>
                    <value>24</value>
                </column>
            </row>
            <row>
                <column>
                    <name>Month</name>
                    <value>2</value>
                </column>
                <column>
                    <name>Year</name>
                    <value>2010</value>
                </column>
                <column>
                    <name>EmpName</name>
                    <value>Nancy</value>
                </column>
                <column>
                    <name>Age</name>
                    <value>26</value>
                </column>
            </row>
            <row>
                <column>
                    <name>Month</name>
                    <value>3</value>
                </column>
                <column>
                    <name>Year</name>
                    <value>2010</value>
                </column>
                <column>
                    <name>EmpName</name>
                    <value>Ned</value>
                </column>
                <column>
                    <name>Age</name>
                    <value>25</value>
                </column>
            </row>
        </resultset>

    Output expected:

        <Response>
            <PeriodInfo>
                <Month>2</Month>
                <Year>2010</Year>
                <EmployeeDetails>
                    <Name>Anu</Name>
                    <Age>24</Age>
                </EmployeeDetails>
                <EmployeeDetails>
                    <Name>Nancy</Name>
                    <Age>26</Age>
                </EmployeeDetails>
            </PeriodInfo>
            <PeriodInfo>
                <Month>3</Month>
                <Year>2010</Year>
                <EmployeeDetails>
                    <Name>Ned</Name>
                    <Age>25</Age>
                </EmployeeDetails>
            </PeriodInfo>
        </Response>


  • PHP table problem

    - by maltad
    Hello, I want to create a table that shows the values of a MySQL table. The problem is that when I open the page I only see the column names – I don't see any rows. I also want to make each row a hyperlink; how will I do that? Here is my code:

        <?php
        include_once 'rnheader.php';
        echo '</br>';

        echo '<a href = "rnservices.php"> Create Service</a> ';

        echo '<table>';
        echo '<tr>';
        echo '<th>Service ID</th>';
        echo '<th>Title</th>';
        echo '<th>Description</th>';
        echo '<th>Notes</th>';
        echo '<th>Submit By</th>';
        echo '<th>Assigned Employee</th>';
        echo '<th>Assigned Group</th>';
        echo '<th>Category</th>';
        echo '<th>Status</th>';
        echo '<th>Urgency</th>';
        echo '<th>Customer</th>';
        echo '<th>Day Created</th>';
        echo '</tr>';

        $query = ("SELECT ServiceID, Title, Description, Notes, SubmitBy, AssignedEmp, " .
                  "AssignedGroup, NameCategory, TipoStatus, TiposUrgencia, CustomerName, DayCreation FROM Service");
        $result = queryMysql($query);
        echo 'Number of Rows: ' . mysql_num_rows($result);

        while ($row = mysqli_fetch_assoc($result)) {
            echo '<tr>';
            echo '<td>' . $row['ServiceID'] . '</td>';
            echo '<td>' . $row['Title'] . '</td>';
            echo '<td>' . $row['Description'] . '</td>';
            echo '<td>' . $row['Notes'] . '</td>';
            echo '<td>' . $row['SubmitBy'] . '</td>';
            echo '<td>' . $row['AssignedEmp'] . '</td>';
            echo '<td>' . $row['AssignedGroup'] . '</td>';
            echo '<td>' . $row['NameCategory'] . '</td>';
            echo '<td>' . $row['TipoStatus'] . '</td>';
            echo '<td>' . $row['TiposUrgencia'] . '</td>';
            echo '<td>' . $row['CustomerName'] . '</td>';
            echo '<td>' . $row['DayCreation'] . '</td>';
            echo '</tr>';
        }
        mysqli_free_result($result);
        echo '</table>';
        ?>


  • Top things web developers should know about the Visual Studio 2013 release

    - by Jon Galloway
    Summary for lazy readers:

    - Visual Studio 2013 is now available for download on the Visual Studio site and on MSDN subscriber downloads
    - Visual Studio 2013 installs side by side with Visual Studio 2012 and supports round-tripping between Visual Studio versions, so you can try it out without committing to a switch
    - Visual Studio 2013 ships with the new version of ASP.NET, which includes ASP.NET MVC 5, ASP.NET Web API 2, Razor 3, Entity Framework 6 and SignalR 2.0
    - The new ASP.NET release focuses on One ASP.NET, so core features and web tools work the same across the platform (e.g. adding ASP.NET MVC controllers to a Web Forms application)
    - New core features include new templates based on Bootstrap, a new scaffolding system, and a new identity system
    - Visual Studio 2013 is an incredible editor for web files, including HTML, CSS, JavaScript, Markdown, LESS, CoffeeScript, Handlebars, Angular, Ember, Knockout, etc.

    Top links:

    - Visual Studio 2013 content on the ASP.NET site, in the standard new releases area: http://www.asp.net/vnext
    - ASP.NET and Web Tools for Visual Studio 2013 Release Notes
    - Short intro videos on the new Visual Studio web editor features from Scott Hanselman and Mads Kristensen
    - Announcing release of ASP.NET and Web Tools for Visual Studio 2013 post on the official .NET Web Development and Tools Blog
    - Scott Guthrie's post: Announcing the Release of Visual Studio 2013 and Great Improvements to ASP.NET and Entity Framework

    Okay, for those of you who are still with me, let's dig in a bit.

    Quick web dev notes on downloading and installing Visual Studio 2013

    I found Visual Studio 2013 to be a pretty fast install. According to Brian Harry's release post, installing over pre-release versions of Visual Studio is supported. I've installed the release version over pre-release versions, and it worked fine.

    If you're only going to be doing web development, you can speed up the install if you just select Web Developer Tools. Of course, as a good Microsoft employee, I'll mention that you might also want to install some of those other features, like the Store apps for Windows 8 and the Windows Phone 8.0 SDK, but they do download and install a lot of other stuff (e.g. the Windows Phone SDK sets up Hyper-V and downloads several GBs of VMs). So if you're planning just to do web development for now, you can pick just the Web Developer Tools and install the other stuff later. If you've got a fast internet connection, I recommend using the web installer instead of downloading the ISO. The ISO includes all the features, whereas the web installer just downloads what you're installing.

    Visual Studio 2013 development settings and color theme

    When you start up Visual Studio, it'll prompt you to pick some defaults. These are totally up to you – whatever suits your development style – and you can change them later. I recommend either the Web Development or Web Development (Code Only) settings. The only real difference is that Code Only hides the toolbars, and you can switch between them using Tools / Import and Export Settings / Reset.

    Usually I've gone with Web Development (Code Only) in the past because I just want to focus on the code, although the Standard toolbar does make it easier to switch default web browsers. More on that later.

    Color theme

    Sigh.
    Okay, everyone's got their favorite colors. I alternate between Light and Dark depending on my mood, and I personally like how the low contrast on the window chrome in those themes puts the emphasis on my code rather than the tabs and toolbars. I know some people got pretty worked up over that, though, and wanted the Blue theme back. I personally don't like it – it reminds me of ancient versions of Visual Studio that I don't want to think about anymore. So here's the thing: if you install Visual Studio Ultimate, it defaults to Blue. The other versions default to Light. If you use Blue, I won't criticize you – out loud, that is. You can change themes really easily – either Tools / Options / Environment / General, or the smart way: Ctrl+Q for Quick Launch, then type Theme and hit Enter.

    Signing in

    During the first run, you'll be prompted to sign in. You don't have to – you can click the "Not now, maybe later" link at the bottom of that dialog. I recommend signing in, though. It's not hooked in with licensing or tracking the kind of code you write to sell you components. It is doing good things, like syncing your Visual Studio settings between computers. More about that here. So, you don't have to, but I sure do.

    Overview of shiny new things in ASP.NET land

    There are a lot of good new things in ASP.NET. I'll list some of my favorites here, but you can read more on the ASP.NET site.

    One ASP.NET

    You've heard us talk about this for a while. The idea is that options are good, but choice can be a burden. When you start a new ASP.NET project, why should you have to make a tough decision – with long-term consequences – about how your application will work? If you want to use ASP.NET Web Forms, but have the option of adding in ASP.NET MVC later, why should that be hard? It's all ASP.NET, right? Ideally, you'd just decide that you want to use ASP.NET to build sites and services, and you could use the appropriate tools as you needed them.

    So, here it is. When you create a new ASP.NET application, you just create an ASP.NET application. Next, you can pick from some templates to get you started... but these are different. They're not "painful decision" templates, they're just some starting pieces. And, most importantly, you can mix and match. I can pick a "mostly" Web Forms template, but include MVC and Web API folders and core references.

    If you've tried to mix and match in the past, you're probably aware that it was possible, but not pleasant. ASP.NET MVC project files contained special project type GUIDs, so you'd only get controller scaffolding support in a Web Forms project if you manually edited the csproj file. Features in one stack didn't work in others. Project templates were painful choices. That's no longer the case. Hooray!

    I just did a demo in a presentation last week where I created a new Web Forms + MVC + Web API site, built a model, scaffolded MVC and Web API controllers with EF Code First, added data in the MVC view, viewed it in Web API, then added a GridView to the Web Forms Default.aspx page and bound it to the model. In about 5 minutes. Sure, it's a simple example, but it's great to be able to share code and features across the whole ASP.NET family.

    Authentication

    In the past, authentication was built into the templates. So, for instance, there was an ASP.NET MVC 4 Intranet Project template which created a new ASP.NET MVC 4 application that was preconfigured for Windows Authentication.
All of that authentication stuff was built into each template, so it varied between the stacks, and you couldn't reuse it. You didn't see a lot of changes to the authentication options, since they required big changes to a bunch of project templates. Now, the new project dialog includes a common authentication experience. When you hit the Change Authentication button, you get some common options that work the same way regardless of the template or reference settings you've made. These options work on all ASP.NET frameworks and all hosting environments (IIS, IIS Express, or OWIN for self-host). The default is Individual User Accounts: this is the standard "create a local account, using username / password or OAuth" thing; however, it's all built on the new Identity system. More on that in a second. The one setting that has some configuration to it is Organizational Accounts, which lets you configure authentication using Active Directory, Windows Azure Active Directory, or Office 365.

Identity

There's a new identity system. We've taken the best parts of the previous ASP.NET Membership and Simple Membership systems, rolled in a lot of feedback and made big enhancements to support important developer concerns like unit testing and extensibility. I've written long posts about ASP.NET identity, and I'll do it again. Soon. This is not that post. The short version is that I think we've finally got just the right Identity system. Some of my favorite features: There are simple, sensible defaults that work well - you can File / New / Run / Register / Login, and everything works. It supports standard username / password as well as external authentication (OAuth, etc.). It's easy to customize without having to re-implement an entire provider. It's built using pluggable pieces, rather than one large monolithic system. It's built using interfaces like IUser and IRole that allow for unit testing, dependency injection, etc. You can easily add user profile data (e.g. URL, twitter handle, birthday): you just add properties to your ApplicationUser model and they'll automatically be persisted (there's a quick sketch of this a little further down). Complete control over how the identity data is persisted. By default, everything works with Entity Framework Code First, but it's built to support changes from small (modify the schema) to big (use another ORM, store your data in a document database or in the cloud or in XML or in the EXIF data of your desktop background or whatever). It's configured via OWIN. More on OWIN and Katana later, but the fact that it's built using OWIN means it's portable. You can find out more in the Authentication and Identity section of the ASP.NET site (and lots more content will be going up there soon).

New Bootstrap based project templates

The new project templates are built using Bootstrap 3. Bootstrap (formerly Twitter Bootstrap) is a front-end framework that brings a lot of nice benefits: It's responsive, so your projects will automatically scale to device width using CSS media queries. For example, menus are full size on a desktop browser, but on narrower screens you automatically get a mobile-friendly menu. The built-in Bootstrap styles make your standard page elements (headers, footers, buttons, form inputs, tables etc.) look nice and modern. Bootstrap is themeable, so you can reskin your whole site by dropping in a new Bootstrap theme. Since Bootstrap is pretty popular across the web development community, this gives you a large and rapidly growing variety of templates (free and paid) to choose from.
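Back to Identity for a second, as promised: here's a minimal sketch of what adding profile data to the generated ApplicationUser model looks like. The specific properties here are made-up examples, not something the template includes:

using System;
using Microsoft.AspNet.Identity.EntityFramework;

// The template generates an ApplicationUser class deriving from IdentityUser.
// Properties you add here are picked up by Entity Framework Code First and
// persisted automatically when the database is created (or migrated).
public class ApplicationUser : IdentityUser
{
    // Hypothetical profile properties - add whatever your app needs.
    public string Url { get; set; }
    public string TwitterHandle { get; set; }
    public DateTime? Birthday { get; set; }
}

No provider configuration, no schema scripts - the model is the schema. Okay, back to Bootstrap.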
Bootstrap also includes a lot of very useful things: components (like progress bars and badges), useful glyphicons, and some jQuery plugins for tooltips, dropdowns, carousels, etc. Here's a look at how the responsive part works. When the page is full screen, the menu and header are optimized for a wide screen display. When I shrink the page down (this is all based on page width, not useragent sniffing) the menu turns into a nice mobile-friendly dropdown. For a quick example, I grabbed a new free theme off bootswatch.com. For simple themes, you just need to download the bootstrap.css file and replace the /content/bootstrap.css file in your project. Now when I refresh the page, I've got a new theme.

Scaffolding

The big change in scaffolding is that it's one system that works across ASP.NET. You can create a new Empty Web project or Web Forms project and you'll get the Scaffold context menus. For release, we've got MVC 5 and Web API 2 controllers. We had a preview of Web Forms scaffolding in the preview releases, but it wasn't fully baked for RTM. Look for it in a future update, expected pretty soon. This scaffolding system wasn't just changed to work across the ASP.NET frameworks, it's also built to enable future extensibility. That's not in this release, but should also hopefully be out soon.

Project Readme page

This is a small thing, but I really like it. When you create a new project, you get a Project_Readme.html page that's added to the root of your project and opens in the Visual Studio built-in browser. I love it. A long time ago, when you created a new project we just dumped it on you and left you scratching your head about what to do next. Not ideal. Then we started adding a bunch of Getting Started information to the new project templates. That told you what to do next, but you had to delete all of that stuff out of your website. It doesn't belong there. Not ideal. This is a simple HTML file that's not integrated into your project code at all. You can delete it if you want. But, it shows a lot of helpful links that are current for the project you just created. In the future, if we add new wacky project types, they can create readme docs with specific information on how to do appropriately wacky things. Side note: I really like that they used the internal browser in Visual Studio to show this content rather than popping open an HTML page in the default browser. I hate that. It's annoying. If you're doing that, I hope you'll stop. What if some unnamed person has 40 or 90 tabs saved in their browser session? When you pop open your "Thanks for installing my Visual Studio extension!" page, all eleventy billion tabs start up and I wish I'd never installed your thing. Be like these guys and pop Visual Studio specific HTML docs in the Visual Studio browser.

ASP.NET MVC 5

The biggest change with ASP.NET MVC 5 is that it's no longer a separate project type. It integrates well with the rest of ASP.NET. In addition to that and the other common features we've already looked at (Bootstrap templates, Identity, authentication), here's what's new for ASP.NET MVC.

Attribute routing

ASP.NET MVC now supports attribute routing, thanks to a contribution by Tim McCall, the author of http://attributerouting.net. With attribute routing you can specify your routes by annotating your actions and controllers. This supports some pretty complex, customized routing scenarios, and it allows you to keep your route information right with your controller actions if you'd like.
Here's a controller that includes an action whose method name is Hiding, but I've used attribute routing to configure it to /spaghetti/with-nesting/where-is-waldo:

public class SampleController : Controller
{
    [Route("spaghetti/with-nesting/where-is-waldo")]
    public string Hiding()
    {
        return "You found me!";
    }
}

I enable that in my RouteConfig.cs, and I can use it in conjunction with my other MVC routes like this:

public class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

        routes.MapMvcAttributeRoutes();

        routes.MapRoute(
            name: "Default",
            url: "{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
        );
    }
}

You can read more about Attribute Routing in ASP.NET MVC 5 here.

Filter enhancements

There are two new additions to filters: authentication filters and filter overrides. Authentication filters are a new kind of filter in ASP.NET MVC that run prior to authorization filters in the ASP.NET MVC pipeline and allow you to specify authentication logic per-action, per-controller, or globally for all controllers. Authentication filters process credentials in the request and provide a corresponding principal. Authentication filters can also add authentication challenges in response to unauthorized requests. Override filters let you change which filters apply to a given action method or controller. Override filters specify a set of filter types that should not be run for a given scope (action or controller). This allows you to configure filters that apply globally but then exclude certain global filters from applying to specific actions or controllers (there's a small sketch of this at the end of this section).

ASP.NET Web API 2

ASP.NET Web API 2 includes a lot of new features.

Attribute Routing

ASP.NET Web API supports the same attribute routing system that's in ASP.NET MVC 5. You can read more about the attribute routing features in Web API in this article.

OAuth 2.0

ASP.NET Web API picks up OAuth 2.0 support, using security middleware running on OWIN (discussed below). This is great for features like authenticated Single Page Applications.

OData Improvements

ASP.NET Web API now has full OData support. That required adding in some of the most powerful operators: $select, $expand, $batch and $value. You can read more about OData operator support in this article by Mike Wasson.

Lots more

There's a huge list of other features, including CORS (cross-origin resource sharing), IHttpActionResult, IHttpRequestContext, and more. I think the best overview is in the release notes.

OWIN and Katana

I've written about OWIN and Katana recently. I'm a big fan. OWIN is the Open Web Interface for .NET. It's a spec, like HTML or HTTP, so you can't install OWIN. The benefit of OWIN is that it's a community specification, so anyone who implements it can plug into the ASP.NET stack, either as middleware or as a host. Katana is the Microsoft implementation of OWIN. It leverages OWIN to wire up things like authentication, handlers, modules, IIS hosting, etc., so ASP.NET can host OWIN components and Katana components can run in someone else's OWIN implementation. Howard Dierking just wrote a cool article in MSDN magazine describing Katana in depth: Getting Started with the Katana Project. He had an interesting example showing an OWIN based pipeline which leveraged SignalR, ASP.NET Web API and NancyFx components in the same stack. If this kind of thing makes sense to you, that's great. If it doesn't, don't worry, but keep an eye on it.
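Since the filter overrides mentioned above are easier to see than to describe, here's a minimal sketch of the Web API 2 flavor. The controller and action are made up, and I'm assuming the [OverrideAuthorization] attribute from System.Web.Http here - check the release notes for the full set of override attributes:

using System.Web.Http;

// Assume [Authorize] is registered as a global filter in WebApiConfig,
// so every action normally requires an authenticated caller.
public class StatusController : ApiController
{
    // [OverrideAuthorization] excludes authorization filters defined at a
    // higher scope (global or controller), so this one action is reachable
    // without touching the global registration.
    [OverrideAuthorization]
    public string GetStatus()
    {
        return "All systems go";
    }
}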
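And to make the OWIN/Katana idea a bit more concrete, here's a minimal sketch of a Katana Startup class with a tiny piece of inline middleware. This assumes the Microsoft.Owin.Host.SystemWeb package from NuGet; the trace middleware itself is just an illustration, not something in the box:

using System.Diagnostics;
using Microsoft.Owin;
using Owin;

[assembly: OwinStartup(typeof(MyApp.Startup))]

namespace MyApp
{
    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            // Inline middleware: runs on every request, does its thing,
            // then passes control to the next component in the pipeline.
            app.Use(async (context, next) =>
            {
                Debug.WriteLine("Request: " + context.Request.Path);
                await next();
                Debug.WriteLine("Response: " + context.Response.StatusCode);
            });

            // Identity, cookie auth, SignalR, Web API etc. plug in here
            // the same way - they're all just OWIN middleware.
        }
    }
}

The interesting part is what's not there: nothing in that class cares whether it's hosted in IIS, IIS Express, or a console app.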
You're going to see some cool things happen as a result of ASP.NET becoming more and more pluggable.

Visual Studio Web Tools

Okay, this stuff's just crazy. Visual Studio has been adding some nice web dev features over the past few years, but they've really cranked it up for this release. Visual Studio is by far my favorite code editor for all web files: CSS, HTML, JavaScript, and lots of popular libraries. Stop thinking of Visual Studio as a big editor that you only use to write back-end code. Stop editing HTML and CSS in Notepad (or Sublime, Notepad++, etc.). Visual Studio starts up in under 2 seconds on a modern computer with an SSD. Misspelling HTML attributes or your CSS classes or jQuery or Angular syntax is stupid. It doesn't make you a better developer, it makes you a silly person who wastes time.

Browser Link

Browser Link is a real-time, two-way connection between Visual Studio and all connected browsers. It's only attached when you're running locally, in debug, but it applies to any and all connected browsers, including emulators. You may have seen demos that showed the browsers refreshing based on changes in the editor, and I'll agree that's pretty cool. But it's really just the start. It's a two-way connection, and it's built for extensibility. That means you can write extensions that push information from your running application (in IE, Chrome, a mobile emulator, etc.) back to Visual Studio. Mads and team have shown off some demonstrations where they enabled edit mode in the browser, which updated the source HTML back in Visual Studio. It's also possible to look at how the rendered HTML performs, check for compatibility issues, watch for unused CSS classes - the sky's the limit.

New HTML editor

The previous HTML editor had a lot of old code that didn't allow for improvements. The team rewrote the HTML editor to take advantage of the new(ish) extensibility features in Visual Studio, which then allowed them to add in all kinds of features - things like CSS class and ID IntelliSense (so you type class="" and get a list of classes and IDs for your project), smart indent based on how your document is formatted, JavaScript reference auto-sync, etc. Here's a 3 minute tour from Mads Kristensen.

Lots more Visual Studio web dev features

That's just a sampling - there's a ton of great features for JavaScript editing, CSS editing, publishing, and Page Inspector (which shows real-time rendering of your page inside Visual Studio). Here are some more short videos showing those features.

Lots, lots more

Okay, that's just a summary, and it's still quite a bit. Head on over to http://asp.net/vnext for more information, and download Visual Studio 2013 now to get started!

    Read the article

  • Quick guide to Oracle IRM 11g: Classification design

    - by Simon Thorpe
Quick guide to Oracle IRM 11g index

This is the final article in the quick guide to Oracle IRM. If you've followed everything prior you will now have a fully functional and tested Information Rights Management service. It doesn't matter if you've been following the 10g or 11g guide as this next article is common to both.

Contents
Why this is the most important part...
Understanding the classification and standard rights model
Identifying business use cases
Creating an effective IRM classification model
One single classification across the entire business
A context for each and every possible granular use case
What makes a good context?
Deciding on the use of roles in the context
Reviewing the features and security for context roles
Summary

Why this is the most important part...

Now the real work begins. Installing and getting an IRM system running is as simple as following instructions. However, to have an IRM technology easily protecting your most sensitive information without interfering with your users' existing daily workflows, and to be able to scale IRM across the entire business, requires thought into how confidential documents are created, used and distributed. This article is going to give you the information you need to ask the business the right questions so that you can deploy your IRM service successfully. The IRM team here at Oracle has over 10 years of experience in helping customers, and it is important you understand the following to be successful in securing access to your most confidential information. Whatever you are trying to secure, be it mergers and acquisitions information, engineering intellectual property, health care documentation or financial reports, and no matter what type of user is going to access the information, be they employees, contractors or customers, there are common goals you are always trying to achieve:

Secure the content at the earliest point possible, and do it automatically. Removing the dependency on the user to decide to secure the content reduces the risk of mistakes significantly and therefore results in a more secure deployment.
K.I.S.S. (Keep It Simple Stupid). Reduce complexity in the rights/classification model. Oracle IRM lets you make changes to access to documents even after they are secured, which allows you to start with a simple model and then introduce complexity once you've understood how the technology is going to be used in the business. After an initial learning period you can review your implementation and start to make informed decisions based on user feedback and administration experience.
Clearly communicate to the user, when appropriate, any changes to their existing work practice. You must make every effort to make the transition to sealed content as simple as possible. For external users you must help them understand why you are securing the documents and inform them of the value of the technology to both your business and them.

Before getting into the detail, I must pay homage to Martin White, Vice President of client services in SealedMedia, the company Oracle acquired and who created Oracle IRM. In the SealedMedia years Martin was involved with every single customer and was key to the design of certain aspects of the IRM technology, specifically the context model we will be discussing here. Listening carefully to customers and understanding the flexibility of the IRM technology, Martin taught me all the skills of helping customers build scalable, effective and simple to use IRM deployments.
No matter how well the engineering department designed the software, badly designed and poorly executed projects can result in solutions that are difficult to use and manage, and ultimately insecure. The advice and information that follows was born with Martin, and he's still delivering IRM consulting with customers and can be found at www.thinkers.co.uk. It is from Martin and others that Oracle not only has the most advanced, scalable and usable document security solution on the market, but Oracle and their partners have the most experience in delivering successful document security solutions.

Understanding the classification and standard rights model

The goal of any successful IRM deployment is to balance the increase in security the technology brings without over complicating the way people use secured content, and to avoid a significant increase in administration and maintenance. With Oracle it is possible to automate the protection of content, deploy the desktop software transparently and use authentication methods such that users can open newly secured content initially unaware the document is any different to an insecure one. That is, until of course they attempt to do something for which they don't have any rights, such as copy and paste to an insecure application or try and print. Central to achieving this objective is creating a classification model that is simple to understand and use but also provides the right level of complexity to meet the business needs.

In Oracle IRM the term used for each classification is a "context". A context defines the relationship between:
A group of related documents
The people that use the documents
The roles that these people perform
The rights that these people need to perform their role

The context is the key to the success of Oracle IRM. It provides the separation of the role and rights of a user from the content itself. Documents are sealed to contexts, but none of the rights, user or group information is stored within the content itself. Sealing only places information about the location of the IRM server that sealed it, the context applied to the document and a few other pieces of metadata that pertain only to the document. This important separation of rights from content means that millions of documents can be secured against a single classification, and a user needs only one right assigned to be able to access all documents. If you have followed all the previous articles in this guide, you will be ready to start defining contexts to which your sensitive information will be protected. But before you even start with IRM, you need to understand how your own business uses and creates sensitive documents and emails.

Identifying business use cases

Oracle is able to support multiple classification systems, but usually there is one single initial need for the technology which drives a deployment. This need might be to protect sensitive mergers and acquisitions information, engineering intellectual property, or financial documents. For this and every subsequent use case you must understand how users create and work with documents, to whom they are distributed and how the recipients should interact with them.
A successful IRM deployment should start with one well identified use case (we go through some examples towards the end of this article). Then, after letting this use case play out in the business, you learn how your users work with content, how well your communication to the business worked and if the classification system you deployed delivered the right balance. It is at this point you can start rolling the technology out further.

Creating an effective IRM classification model

Once you have selected the initial use case you will address with IRM, you need to design a classification model that defines the access to secured documents within the use case. In Oracle IRM there is an inbuilt classification system called the "context" model. In Oracle IRM 11g it is possible to extend the server to support any rights classification model, but the majority of users who are not using an application integration (such as Oracle IRM within Oracle Beehive) are likely to be starting out with the built in context model. Before looking at creating a classification system with IRM, it is worth reviewing some recognized standards and methods for creating and implementing security policy. A very useful set of documents are the ISO 17799 guidelines and the SANS security policy templates.

The first task is to create a context against which documents are to be secured. A context consists of a group of related documents (all top secret engineering research), a list of roles (contributors and readers) which define how users can access documents, and a list of users (research engineers) who have been given a role allowing them to interact with sealed content. Before even creating the first context it is wise to decide on a philosophy which will dictate the level of granularity. The question is, where do you start? At a department level? By project? By technology? First consider the two ends of the spectrum...

One single classification across the entire business

Imagine that instead of having separate contexts, one for engineering intellectual property, one for your financial data, one for human resources personally identifiable information, you create one context for all documents across the entire business. Whilst you may have immediate objections, there are some significant benefits in considering this:
Document security classification decisions are simple. You only have one context to choose from!
User provisioning is simple; just make sure everyone has a role in the only context in the business.
Administration is very low; if you assign rights to groups from the business user repository you probably never have to touch IRM administration again.

There are however some obvious downsides to this model:
All users have access to all IRM secured content. So potentially a sales person could access sensitive mergers and acquisition documents - if they can get their hands on a copy, that is.
You cannot delegate control of different documents to different parts of the business; this may not satisfy your regulatory requirements for the separation and delegation of duties.
Changing a user's role affects every single document ever secured.

Even though it is very unlikely a business would ever use one single context to secure all their sensitive information, thinking about this scenario raises one very important point. Just having one single context and securing all confidential documents to it, whilst incurring some of the problems detailed above, has one huge value.
Once secured, IRM protected content can ONLY be accessed by authorized users. Just think of all the sensitive documents in your business today, and imagine if you could ensure that only the people you trust could open them. Even if an employee lost a laptop or someone accidentally sent an email to the wrong recipient, only the right people could open that file.

A context for each and every possible granular use case

Now let's think about the total opposite of a single context design. What if you created a context for each and every single defined business need and created multiple contexts within this for each level of granularity? Let's take a use case where we need to protect engineering intellectual property. Imagine we have 6 different engineering groups, and in each we have a research department, a design department and manufacturing. The company information security policy defines 3 levels of information sensitivity... restricted, confidential and top secret. Then let's say that each group and department needs to define access to information from both internal and external users. Finally, add into the mix that they want to review the rights model for each context every financial quarter. This would result in a huge amount of contexts. For example, let's just look at the resulting contexts for one engineering group:

Q1FY2010 Restricted Internal - Engineering Group 1 - Research
Q1FY2010 Restricted Internal - Engineering Group 1 - Design
Q1FY2010 Restricted Internal - Engineering Group 1 - Manufacturing
Q1FY2010 Restricted External - Engineering Group 1 - Research
Q1FY2010 Restricted External - Engineering Group 1 - Design
Q1FY2010 Restricted External - Engineering Group 1 - Manufacturing
Q1FY2010 Confidential Internal - Engineering Group 1 - Research
Q1FY2010 Confidential Internal - Engineering Group 1 - Design
Q1FY2010 Confidential Internal - Engineering Group 1 - Manufacturing
Q1FY2010 Confidential External - Engineering Group 1 - Research
Q1FY2010 Confidential External - Engineering Group 1 - Design
Q1FY2010 Confidential External - Engineering Group 1 - Manufacturing
Q1FY2010 Top Secret Internal - Engineering Group 1 - Research
Q1FY2010 Top Secret Internal - Engineering Group 1 - Design
Q1FY2010 Top Secret Internal - Engineering Group 1 - Manufacturing
Q1FY2010 Top Secret External - Engineering Group 1 - Research
Q1FY2010 Top Secret External - Engineering Group 1 - Design
Q1FY2010 Top Secret External - Engineering Group 1 - Manufacturing

Now multiply the above by 6 engineering groups and you have 108 contexts, with another 108 to create and review every 3 months. After a year, a single engineering group alone has accumulated 72 contexts. What would be the advantages of such a complex classification model?
You can satisfy very granular rights requirements; for example, only an authorized engineering group 1 researcher can create a top secret report for access internally, and his role will be reviewed on a very frequent basis.
Your business may have very complex rights requirements and mapping this directly to IRM may be an obvious exercise.

The disadvantages of such a classification model are significant...
Huge administrative overhead. Someone in the business must manage, review and administrate each of these contexts. If the engineering group had a single administrator, they would have 72 classifications to preside over each year.
From an end user's perspective life will be very confusing. Imagine if a user has rights in just 6 of these contexts.
They may be able to print content from one but not another, or be able to edit content in 2 contexts but not the other 4. Such confusion at the end user level causes frustration and resistance to the use of the technology.
Increased synchronization complexity. Imagine a user who after 3 years in the company ends up with over 300 rights in many different contexts across the business. This would result in long synchronization times as the client software updates all your offline rights.
Hard to understand who can do what with what. Imagine being the VP of engineering and as part of an internal security audit you are asked the question, "What rights do researchers have to our top secret information?". In this complex model the answer is not simple; it would depend on many roles in many contexts.

Of course this example is extreme, but it highlights that trying to build many barriers in your business can result in a nightmare of administration and confusion amongst users. In the real world what we need is a balance of the two. We need to seek an optimum number of contexts. Too many contexts are unmanageable and too few contexts do not give fine enough granularity.

What makes a good context?

Good context design derives mainly from how well you understand your business requirements to secure access to confidential information. Some customers I have worked with can tell me exactly the documents they wish to secure and know exactly who should be opening them. However there are some customers who know only of the government regulation that requires them to control access to certain types of information; they don't actually know where the documents are, how they are created or understand exactly who should have access. Therefore you need to know how to ask the business the right questions that lead to information which helps you define a context. First ask these questions about a set of documents:

What is the topic?
Who are the legitimate contributors on this topic?
Who is the authorized readership?

If the answer to any one of these is significantly different, then it probably merits a separate context. Remember that sealed documents are inherently secure and as such they cannot leak to your competitors; therefore it is better sealed to a broad context than not sealed at all. Simplicity is key here. Always revert to the first extreme example of a single classification, then work towards essential complexity. If there is any doubt, always prefer fewer contexts. Remember, Oracle IRM allows you to change your mind later on. You can implement a design now and continue to change and refine as you learn how the technology is used. It is easy to go from a simple model to a more complex one; it is much harder to take a complex model that is already embedded in the work practice of users and try to simplify it. It is also wise to take a single use case and address this first with the business. Don't try and tackle many different problems from the outset. Do one, learn from the process, refine it and then take what you have learned into the next use case, refine and continue. Once you have a good grasp of the technology and understand how your business will use it, you can then start rolling out the technology wider across the business.

Deciding on the use of roles in the context

Once you have decided on that first initial use case and a context to create, let's look at the details you need to decide upon.
For each context, identify:

Administrative roles
Business owner: the person who makes decisions about who may or may not see content in this context. This is often the person who wanted to use IRM and drove the business purchase. They are usually the person with the most at risk when sensitive information is lost.
Point of contact: the person who will handle requests for access to content. Sometimes the same as the business owner, sometimes a trusted secretary or administrator.
Context administrator: the person who will enact the decisions of the business owner. Sometimes the point of contact, sometimes a trusted IT person.

Document related roles
Contributors: the people who create and edit documents in this context.
Reviewers: the people who are involved in reviewing documents but are not trusted to secure information to this classification. This role is not always necessary. (See the later discussion on Published-work and Work-in-Progress.)
Readers: the people who read documents from this context.

Some people may have several of the roles above, which is fine. What you are trying to do is understand and define how the business interacts with your sensitive information. These roles obviously map directly to roles available in Oracle IRM.

Reviewing the features and security for context roles

At this point we have decided on a classification of information, and understand what roles people in the business will play when administrating this classification and how they will interact with content. The final piece of the puzzle in getting the information for our first context is to look at the permissions people will have to sealed documents. First think about why you are protecting the documents in the first place. It is to prevent the loss or leaking of information to the wrong people, and to control the information, making sure that people only access the latest versions of documents. You are not using Oracle IRM to prevent authorized people from doing legitimate work. This is an important point; with IRM you can erect many barriers to prevent access to content, yet with too many restrictions authorized users will often find ways to circumvent the technology and end up distributing unprotected originals. Because IRM is a security technology, it is easy to get carried away restricting different groups. However I would highly recommend starting with a simple solution with few restrictions. Ensure that everyone who reasonably needs to read documents can do so from the outset. Remember that with Oracle IRM you can change rights to content whenever you wish and tighten security. Always return to the fact that the greatest value IRM brings is that ONLY authorized users can access secured content; remember that simple "one context for the entire business" model. At the start of the deployment you really need to aim for user acceptance, and therefore a simple model is more likely to succeed. As time passes and users understand how IRM works you can start to introduce more restrictions and complexity. Another key aspect to focus on is handling exceptions. If you decide on a context model where engineering can only access engineering information and sales can only access sales data, act quickly when a sales manager needs legitimate access to a set of engineering documents. Having a quick and effective process for permitting other people with legitimate needs to obtain appropriate access will be rewarded with acceptance from the user community.
These use cases can often be satisfied by integrating IRM with a good Identity & Access Management technology, which simplifies the process of assigning users the correct business roles.

The big print issue...

Printing is often an issue of contention: users love to print, but the business wants to ensure sensitive information remains in the controlled digital world. There are many cases of physical document loss causing a business pain, and it is often overlooked that IRM can help with this issue by limiting the ability to generate physical copies of digital content. However, it can be hard to maintain a balance between security and usability when it comes to printing. Consider the following points when deciding whether to give print rights:

Oracle IRM sealed documents can contain watermarks that expose information about the user, time and location of access and the classification of the document. This information would reside in the printed copy, making it easier to trace who printed it.
Printed documents are slower to distribute in comparison to their digital counterparts, so time sensitive information in printed format may present a lower risk.
Print activity is audited, therefore you can monitor and react to users abusing print rights.

Summary

In summary, it is important to think carefully about the way you create your context model. As you ask the business these questions you may get a variety of different requirements. There may be special projects that require a context just for sensitive information created during the lifetime of the project. There may be a department that requires all information in the group to be secured, and you might have a few senior executives who wish to use IRM to exchange a small number of highly sensitive documents with a very small number of people. Oracle IRM, with its very flexible context classification system, can support all of these use cases. The trick is introducing the complexity to deliver them at the right level. In another article I'm working on, I will go through some examples of how Oracle IRM might map to existing business use cases. But for now, this article covers all the important questions you need to get your IRM service deployed and successfully protecting your most sensitive information.

    Read the article

  • No bean named 'springSecurityFilterChain' is defined

    - by michaeljackson4ever
    When configs are loaded, I get the error SEVERE: Exception starting filter springSecurityFilterChain org.springframework.beans.factory.NoSuchBeanDefinitionException: No bean named 'springSecurityFilterChain' is defined My sec-config: <http use-expressions="true" access-denied-page="/error/casfailed.html" entry-point-ref="headerAuthenticationEntryPoint"> <intercept-url pattern="/" access="permitAll"/> <!-- <intercept-url pattern="/index.html" access="permitAll"/> --> <intercept-url pattern="/index.html" access="hasAnyRole('ROLE_USER')"/> <intercept-url pattern="/history.html" access="hasAnyRole('ROLE_USER')"/> <intercept-url pattern="/absence.html" access="hasAnyRole('ROLE_USER')"/> <intercept-url pattern="/search.html" access="hasAnyRole('ROLE_USER')"/> <intercept-url pattern="/employees.html" access="hasAnyRole('ROLE_USER')"/> <intercept-url pattern="/employee.html" access="hasAnyRole('ROLE_USER')"/> <intercept-url pattern="/contract.html" access="hasAnyRole('ROLE_USER')"/> <intercept-url pattern="/myforms.html" access="hasAnyRole('HLO','OPISK')"/> <intercept-url pattern="/vacationmsg.html" access="hasAnyRole('ROLE_USER')"/> <intercept-url pattern="/redirect.jsp" filters="none" /> <intercept-url pattern="/error/**" filters="none" /> <intercept-url pattern="/layout/**" filters="none" /> <intercept-url pattern="/js/**" filters="none" /> <intercept-url pattern="/**" access="isAuthenticated()" /> <!-- session-management invalid-session-url="/absence.html"/ --> <!-- logout logout-success-url="/logout.html"/ --> <custom-filter ref="ssoHeaderAuthenticationFilter" before="CAS_FILTER"/> <!-- CAS_FILTER ??? --> </http> <authentication-manager alias="authenticationManager"> <authentication-provider ref="doNothingAuthenticationProvider"/> </authentication-manager> <beans:bean id="doNothingAuthenticationProvider" class="com.nixu.security.sso.web.DoNothingAuthenticationProvider"/> <beans:bean id="ssoHeaderAuthenticationFilter" class="com.nixu.security.sso.web.HeaderAuthenticationFilter"> <beans:property name="groups"> <beans:map> <beans:entry key="cn=lake,ou=confluence,dc=utu,dc=fi" value="ROLE_ADMIN"/> </beans:map> </beans:property> </beans:bean> <beans:bean id="headerAuthenticationEntryPoint" class="com.nixu.security.sso.web.HeaderAuthenticationEntryPoint"/> And web.xml <context-param> <param-name>contextConfigLocation</param-name> <param-value> /WEB-INF/applicationContext.xml /WEB-INF/sec-config.xml /WEB-INF/idm-config.xml /WEB-INF/ldap-config.xml </param-value> </context-param> <display-name>KeyCard</display-name> <context-param> <param-name>webAppRootKey</param-name> <param-value>KeyCardAppRoot</param-value> </context-param> <context-param> <param-name>log4jConfigLocation</param-name> <param-value>/WEB-INF/log4j.properties</param-value> </context-param> <!-- Reads request input using UTF-8 encoding --> <filter> <filter-name>characterEncodingFilter</filter-name> <filter-class>org.springframework.web.filter.CharacterEncodingFilter</filter-class> <init-param> <param-name>encoding</param-name> <param-value>UTF-8</param-value> </init-param> <init-param> <param-name>forceEncoding</param-name> <param-value>true</param-value> </init-param> </filter> <filter> <filter-name>springSecurityFilterChain</filter-name> <filter-class>org.springframework.web.filter.DelegatingFilterProxy</filter-class> </filter> <filter-mapping> <filter-name>characterEncodingFilter</filter-name> <url-pattern>/*</url-pattern> </filter-mapping> <filter-mapping> <filter-name>springSecurityFilterChain</filter-name> 
<url-pattern>/*</url-pattern> </filter-mapping> <listener> <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class> </listener> <listener> <listener-class>org.springframework.web.util.Log4jConfigListener</listener-class> </listener> <listener> <!-- this is for session scoped objects --> <listener-class>org.springframework.web.context.request.RequestContextListener</listener-class> </listener> <listener> <listener-class>org.springframework.security.web.session.HttpSessionEventPublisher</listener-class> </listener> <!-- Handles all requests into the application --> <servlet> <servlet-name>KeyCard</servlet-name> <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class> <load-on-startup>1</load-on-startup> </servlet> <servlet> <servlet-name>tiles</servlet-name> <servlet-class>org.apache.tiles.web.startup.TilesServlet</servlet-class> <init-param> <param-name> org.apache.tiles.impl.BasicTilesContainer.DEFINITIONS_CONFIG </param-name> <param-value> /WEB-INF/tilesViewContext.xml </param-value> </init-param> <load-on-startup>2</load-on-startup> </servlet> <servlet-mapping> <servlet-name>KeyCard</servlet-name> <url-pattern>*.html</url-pattern> </servlet-mapping> <session-config> <session-timeout> 120 </session-timeout> </session-config> <welcome-file-list> <welcome-file>index.jsp</welcome-file> </welcome-file-list> <!-- error-page> <exception-type>java.lang.Exception</exception-type> <location>/WEB-INF/error/error.jsp</location> </error-page --> </web-app> What's wrong?

    Read the article

  • How to deploy the advanced search page using Module in SharePoint 2013

    - by ybbest
Today, I’d like to show you how to deploy your custom advanced search page using a module in Visual Studio 2012. Using a module is how SharePoint deploys all the publishing pages to the search centre. Browse to the template folder under the 15 hive of SharePoint 2013, then go to SearchCenterFiles under Features (as shown below). Then open Files.xml; it shows how SharePoint uses a module to deploy the advanced search page. You can download the solution here. Now I am going to show you how to deploy your custom advanced search page. The feature is located in C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\TEMPLATE\FEATURES\SearchCenterFiles. To deploy SharePoint advanced search pages, you need to do the following: 1. Create a SharePoint 2013 project and then create a module item. 2. Find how out-of-the-box SharePoint deploys the advanced search page in Files.xml, and copy and paste it into the elements.xml: <File Url="advanced.aspx" Type="GhostableInLibrary"> <Property Name="PublishingPageLayout" Value="~SiteCollection/_catalogs/masterpage/AdvancedSearchLayout.aspx, $Resources:Microsoft.Office.Server.Search,SearchCenterAdvancedSearchTitle;" /> <Property Name="Title" Value="$Resources:Microsoft.Office.Server.Search,Search_Advanced_Page_Title;" /> <Property Name="ContentType" Value="$Resources:Microsoft.Office.Server.Search,contenttype_welcomepage_name;" /> <AllUsersWebPart WebPartZoneID="MainZone" WebPartOrder="1"> <![CDATA[ <WebPart xmlns="http://schemas.microsoft.com/WebPart/v2"> <Assembly>Microsoft.Office.Server.Search, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c</Assembly> <TypeName>Microsoft.Office.Server.Search.WebControls.AdvancedSearchBox</TypeName> <Title>$Resources:Microsoft.Office.Server.Search,AdvancedSearch_Webpart_Title;</Title> <Description>$Resources:Microsoft.Office.Server.Search,AdvancedSearch_Webpart_Description;</Description> <FrameType>None</FrameType> <AllowMinimize>true</AllowMinimize> <AllowRemove>true</AllowRemove> <IsVisible>true</IsVisible> <SearchResultPageURL xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">results.aspx</SearchResultPageURL> <TextQuerySectionLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">$Resources:Microsoft.Office.Server.Search,AdvancedSearch_FindDocsWith_Title;</TextQuerySectionLabelText> <ShowAndQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowAndQueryTextBox> <ShowPhraseQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowPhraseQueryTextBox> <ShowOrQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowOrQueryTextBox> <ShowNotQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowNotQueryTextBox> <ScopeSectionLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">$Resources:Microsoft.Office.Server.Search,AdvancedSearch_NarrowSearch_Title;</ScopeSectionLabelText> <ShowLanguageOptions xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowLanguageOptions> <ShowResultTypePicker xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowResultTypePicker> <ShowPropertiesSection xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowPropertiesSection> <PropertiesSectionLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">$Resources:Microsoft.Office.Server.Search,AdvancedSearch_AddPropRestrictions_Title;</PropertiesSectionLabelText> </WebPart> ]]> </AllUsersWebPart> </File> 3. 
Customize your SharePoint advanced search page by modifying the Advanced Search Box web part. 4. Export the web part and copy the content of the web part file into the elements.xml in the module: <File Path="AdvancedSearchPage\advanced.aspx" Url="employeeAdvanced.aspx" Type="GhostableInLibrary"> <Property Name="PublishingPageLayout" Value="~SiteCollection/_catalogs/masterpage/AdvancedSearchLayout.aspx, $Resources:Microsoft.Office.Server.Search,SearchCenterAdvancedSearchTitle;" /> <Property Name="Title" Value="$Resources:Microsoft.Office.Server.Search,Search_Advanced_Page_Title;" /> <Property Name="ContentType" Value="$Resources:Microsoft.Office.Server.Search,contenttype_welcomepage_name;" /> <AllUsersWebPart WebPartZoneID="MainZone" WebPartOrder="1"> <![CDATA[ <WebPart xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/WebPart/v2"> <Title>Advanced Search Box</Title> <FrameType>None</FrameType> <Description>Displays parameterized search options based on properties and combinations of words.</Description> <IsIncluded>true</IsIncluded> <ZoneID>MainZone</ZoneID> <PartOrder>1</PartOrder> <FrameState>Normal</FrameState> <Height /> <Width /> <AllowRemove>true</AllowRemove> <AllowZoneChange>true</AllowZoneChange> <AllowMinimize>true</AllowMinimize> <AllowConnect>true</AllowConnect> <AllowEdit>true</AllowEdit> <AllowHide>true</AllowHide> <IsVisible>true</IsVisible> <DetailLink /> <HelpLink /> <HelpMode>Modeless</HelpMode> <Dir>Default</Dir> <PartImageSmall /> <MissingAssembly>Cannot import this Web Part.</MissingAssembly> <PartImageLarge /> <IsIncludedFilter /> <Assembly>Microsoft.Office.Server.Search, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c</Assembly> <TypeName>Microsoft.Office.Server.Search.WebControls.AdvancedSearchBox</TypeName> <SearchResultPageURL xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">results.aspx</SearchResultPageURL> <TextQuerySectionLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">Find documents that have...</TextQuerySectionLabelText> <ShowAndQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowAndQueryTextBox> <AndQueryTextBoxLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <ShowPhraseQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowPhraseQueryTextBox> <PhraseQueryTextBoxLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <ShowOrQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowOrQueryTextBox> <OrQueryTextBoxLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <ShowNotQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowNotQueryTextBox> <NotQueryTextBoxLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <ScopeSectionLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">Narrow the search...</ScopeSectionLabelText> <ShowScopes xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">false</ShowScopes> <ScopeLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <DisplayGroup xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">Advanced Search</DisplayGroup> <ShowLanguageOptions xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">false</ShowLanguageOptions> <LanguagesLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <ShowResultTypePicker xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowResultTypePicker> 
<ResultTypeLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <ShowPropertiesSection xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowPropertiesSection> <PropertiesSectionLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">Add property restrictions...</PropertiesSectionLabelText> <Properties xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">&lt;root xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"&gt;  &lt;LangDefs&gt;    &lt;LangDef DisplayName="Arabic" LangID="ar"/&gt;    &lt;LangDef DisplayName="Bengali" LangID="bn"/&gt;    &lt;LangDef DisplayName="Bulgarian" LangID="bg"/&gt;    &lt;LangDef DisplayName="Catalan" LangID="ca"/&gt;    &lt;LangDef DisplayName="Simplified Chinese" LangID="zh-cn"/&gt;    &lt;LangDef DisplayName="Traditional Chinese" LangID="zh-tw"/&gt;    &lt;LangDef DisplayName="Croatian" LangID="hr"/&gt;    &lt;LangDef DisplayName="Czech" LangID="cs"/&gt;    &lt;LangDef DisplayName="Danish" LangID="da"/&gt;    &lt;LangDef DisplayName="Dutch" LangID="nl"/&gt;    &lt;LangDef DisplayName="English" LangID="en"/&gt;    &lt;LangDef DisplayName="Finnish" LangID="fi"/&gt;    &lt;LangDef DisplayName="French" LangID="fr"/&gt;    &lt;LangDef DisplayName="German" LangID="de"/&gt;    &lt;LangDef DisplayName="Greek" LangID="el"/&gt;    &lt;LangDef DisplayName="Gujarati" LangID="gu"/&gt;    &lt;LangDef DisplayName="Hebrew" LangID="he"/&gt;    &lt;LangDef DisplayName="Hindi" LangID="hi"/&gt;    &lt;LangDef DisplayName="Hungarian" LangID="hu"/&gt;    &lt;LangDef DisplayName="Icelandic" LangID="is"/&gt;    &lt;LangDef DisplayName="Indonesian" LangID="id"/&gt;    &lt;LangDef DisplayName="Italian" LangID="it"/&gt;    &lt;LangDef DisplayName="Japanese" LangID="ja"/&gt;    &lt;LangDef DisplayName="Kannada" LangID="kn"/&gt;    &lt;LangDef DisplayName="Korean" LangID="ko"/&gt;    &lt;LangDef DisplayName="Latvian" LangID="lv"/&gt;    &lt;LangDef DisplayName="Lithuanian" LangID="lt"/&gt;    &lt;LangDef DisplayName="Malay" LangID="ms"/&gt;    &lt;LangDef DisplayName="Malayalam" LangID="ml"/&gt;    &lt;LangDef DisplayName="Marathi" LangID="mr"/&gt;    &lt;LangDef DisplayName="Norwegian" LangID="no"/&gt;    &lt;LangDef DisplayName="Polish" LangID="pl"/&gt;    &lt;LangDef DisplayName="Portuguese" LangID="pt"/&gt;    &lt;LangDef DisplayName="Punjabi" LangID="pa"/&gt;    &lt;LangDef DisplayName="Romanian" LangID="ro"/&gt;    &lt;LangDef DisplayName="Russian" LangID="ru"/&gt;    &lt;LangDef DisplayName="Slovak" LangID="sk"/&gt;    &lt;LangDef DisplayName="Slovenian" LangID="sl"/&gt;    &lt;LangDef DisplayName="Spanish" LangID="es"/&gt;    &lt;LangDef DisplayName="Swedish" LangID="sv"/&gt;    &lt;LangDef DisplayName="Tamil" LangID="ta"/&gt;    &lt;LangDef DisplayName="Telugu" LangID="te"/&gt;    &lt;LangDef DisplayName="Thai" LangID="th"/&gt;    &lt;LangDef DisplayName="Turkish" LangID="tr"/&gt;    &lt;LangDef DisplayName="Ukrainian" LangID="uk"/&gt;    &lt;LangDef DisplayName="Urdu" LangID="ur"/&gt;    &lt;LangDef DisplayName="Vietnamese" LangID="vi"/&gt;  &lt;/LangDefs&gt;  &lt;Languages&gt;    &lt;Language LangRef="en"/&gt;    &lt;Language LangRef="fr"/&gt;    &lt;Language LangRef="de"/&gt;    &lt;Language LangRef="ja"/&gt;    &lt;Language LangRef="zh-cn"/&gt;    &lt;Language LangRef="es"/&gt;    &lt;Language LangRef="zh-tw"/&gt;  &lt;/Languages&gt;  &lt;PropertyDefs&gt;    &lt;PropertyDef Name="Path" DataType="url" DisplayName="URL"/&gt;    &lt;PropertyDef Name="Size" DataType="integer" DisplayName="Size (bytes)"/&gt;    &lt;PropertyDef Name="Write" 
DataType="datetime" DisplayName="Last Modified Date"/&gt;    &lt;PropertyDef Name="FileName" DataType="text" DisplayName="Name"/&gt;    &lt;PropertyDef Name="Description" DataType="text" DisplayName="Description"/&gt;    &lt;PropertyDef Name="Title" DataType="text" DisplayName="Title"/&gt;    &lt;PropertyDef Name="Author" DataType="text" DisplayName="Author"/&gt;    &lt;PropertyDef Name="DocSubject" DataType="text" DisplayName="Subject"/&gt;    &lt;PropertyDef Name="DocKeywords" DataType="text" DisplayName="Keywords"/&gt;    &lt;PropertyDef Name="DocComments" DataType="text" DisplayName="Comments"/&gt;    &lt;PropertyDef Name="CreatedBy" DataType="text" DisplayName="Created By"/&gt;    &lt;PropertyDef Name="ModifiedBy" DataType="text" DisplayName="Last Modified By"/&gt;    &lt;PropertyDef Name="EmployeeNumber" DataType="text" DisplayName="EmployeeNumber"/&gt;    &lt;PropertyDef Name="EmployeeId" DataType="text" DisplayName="EmployeeId"/&gt;    &lt;PropertyDef Name="EmployeeFirstName" DataType="text" DisplayName="EmployeeFirstName"/&gt;    &lt;PropertyDef Name="EmployeeLastName" DataType="text" DisplayName="EmployeeLastName"/&gt;  &lt;/PropertyDefs&gt;  &lt;ResultTypes&gt;    &lt;ResultType DisplayName="Employee Document" Name="default"&gt;      &lt;KeywordQuery/&gt;      &lt;PropertyRef Name="EmployeeNumber" /&gt;      &lt;PropertyRef Name="EmployeeId" /&gt;      &lt;PropertyRef Name="EmployeeFirstName" /&gt;      &lt;PropertyRef Name="EmployeeLastName" /&gt;    &lt;/ResultType&gt;    &lt;ResultType DisplayName="All Results"&gt;      &lt;KeywordQuery/&gt;      &lt;PropertyRef Name="Author" /&gt;      &lt;PropertyRef Name="Description" /&gt;      &lt;PropertyRef Name="FileName" /&gt;      &lt;PropertyRef Name="Size" /&gt;      &lt;PropertyRef Name="Path" /&gt;      &lt;PropertyRef Name="Write" /&gt;      &lt;PropertyRef Name="CreatedBy" /&gt;      &lt;PropertyRef Name="ModifiedBy" /&gt;    &lt;/ResultType&gt;    &lt;ResultType DisplayName="Documents" Name="documents"&gt;      &lt;KeywordQuery&gt;IsDocument="True"&lt;/KeywordQuery&gt;      &lt;PropertyRef Name="Author" /&gt;      &lt;PropertyRef Name="DocComments"/&gt;      &lt;PropertyRef Name="Description" /&gt;      &lt;PropertyRef Name="DocKeywords"/&gt;      &lt;PropertyRef Name="FileName" /&gt;      &lt;PropertyRef Name="Size" /&gt;      &lt;PropertyRef Name="DocSubject"/&gt;      &lt;PropertyRef Name="Path" /&gt;      &lt;PropertyRef Name="Write" /&gt;      &lt;PropertyRef Name="CreatedBy" /&gt;      &lt;PropertyRef Name="ModifiedBy" /&gt;      &lt;PropertyRef Name="Title"/&gt;    &lt;/ResultType&gt;    &lt;ResultType DisplayName="Word Documents" Name="worddocuments"&gt;      &lt;KeywordQuery&gt;FileExtension="doc" OR FileExtension="docx" OR FileExtension="dot" OR FileExtension="docm" OR FileExtension="odt"&lt;/KeywordQuery&gt;      &lt;PropertyRef Name="Author" /&gt;      &lt;PropertyRef Name="DocComments"/&gt;      &lt;PropertyRef Name="Description" /&gt;      &lt;PropertyRef Name="DocKeywords"/&gt;      &lt;PropertyRef Name="FileName" /&gt;      &lt;PropertyRef Name="Size" /&gt;      &lt;PropertyRef Name="DocSubject"/&gt;      &lt;PropertyRef Name="Path" /&gt;      &lt;PropertyRef Name="Write" /&gt;      &lt;PropertyRef Name="CreatedBy" /&gt;      &lt;PropertyRef Name="ModifiedBy" /&gt;      &lt;PropertyRef Name="Title"/&gt;    &lt;/ResultType&gt;    &lt;ResultType DisplayName="Excel Documents" Name="exceldocuments"&gt;      &lt;KeywordQuery&gt;FileExtension="xls" OR FileExtension="xlsx" OR FileExtension="xlsm" OR 
FileExtension="xlsb" OR FileExtension="ods"&lt;/KeywordQuery&gt;      &lt;PropertyRef Name="Author" /&gt;      &lt;PropertyRef Name="DocComments"/&gt;      &lt;PropertyRef Name="Description" /&gt;      &lt;PropertyRef Name="DocKeywords"/&gt;      &lt;PropertyRef Name="FileName" /&gt;      &lt;PropertyRef Name="Size" /&gt;      &lt;PropertyRef Name="DocSubject"/&gt;      &lt;PropertyRef Name="Path" /&gt;      &lt;PropertyRef Name="Write" /&gt;      &lt;PropertyRef Name="CreatedBy" /&gt;      &lt;PropertyRef Name="ModifiedBy" /&gt;      &lt;PropertyRef Name="Title"/&gt;    &lt;/ResultType&gt;    &lt;ResultType DisplayName="PowerPoint Presentations" Name="presentations"&gt;      &lt;KeywordQuery&gt;FileExtension="ppt" OR FileExtension="pptx" OR FileExtension="pptm" OR FileExtension="odp"&lt;/KeywordQuery&gt;      &lt;PropertyRef Name="Author" /&gt;      &lt;PropertyRef Name="DocComments"/&gt;      &lt;PropertyRef Name="Description" /&gt;      &lt;PropertyRef Name="DocKeywords"/&gt;      &lt;PropertyRef Name="FileName" /&gt;      &lt;PropertyRef Name="Size" /&gt;      &lt;PropertyRef Name="DocSubject"/&gt;      &lt;PropertyRef Name="Path" /&gt;      &lt;PropertyRef Name="Write" /&gt;      &lt;PropertyRef Name="CreatedBy" /&gt;      &lt;PropertyRef Name="ModifiedBy" /&gt;      &lt;PropertyRef Name="Title"/&gt;    &lt;/ResultType&gt;  &lt;/ResultTypes&gt;&lt;/root&gt;</Properties> </WebPart> ]]> </AllUsersWebPart> </File> 5.Deploy your custom solution and you will have a custom advanced search page.

    Read the article

  • How to deploy the advanced search page using Module in SharePoint 2013

    - by ybbest
    Today, I’d like to show you how to deploy your custom advanced search page using a module in Visual Studio 2012. A module is how SharePoint deploys all of the publishing pages to the search centre. Browse to the TEMPLATE folder under the 15 hive of SharePoint 2013, then go to SearchCenterFiles under FEATURES (as shown below) and open Files.xml; it shows how SharePoint uses a module to deploy the advanced search page. You can download the solution here. The feature is located in C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\TEMPLATE\FEATURES\SearchCenterFiles. To deploy your custom advanced search page, you need to do the following: 1. Create a SharePoint 2013 project and then create a module item. 2. Find how the out-of-the-box SharePoint deploys the advanced search page in Files.xml, and copy and paste the <File> element into the Elements.xml of your module (a sketch of the overall module structure follows these steps): <File Url="advanced.aspx" Type="GhostableInLibrary"> <Property Name="PublishingPageLayout" Value="~SiteCollection/_catalogs/masterpage/AdvancedSearchLayout.aspx, $Resources:Microsoft.Office.Server.Search,SearchCenterAdvancedSearchTitle;" /> <Property Name="Title" Value="$Resources:Microsoft.Office.Server.Search,Search_Advanced_Page_Title;" /> <Property Name="ContentType" Value="$Resources:Microsoft.Office.Server.Search,contenttype_welcomepage_name;" /> <AllUsersWebPart WebPartZoneID="MainZone" WebPartOrder="1"> <![CDATA[ <WebPart xmlns="http://schemas.microsoft.com/WebPart/v2"> <Assembly>Microsoft.Office.Server.Search, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c</Assembly> <TypeName>Microsoft.Office.Server.Search.WebControls.AdvancedSearchBox</TypeName> <Title>$Resources:Microsoft.Office.Server.Search,AdvancedSearch_Webpart_Title;</Title> <Description>$Resources:Microsoft.Office.Server.Search,AdvancedSearch_Webpart_Description;</Description> <FrameType>None</FrameType> <AllowMinimize>true</AllowMinimize> <AllowRemove>true</AllowRemove> <IsVisible>true</IsVisible> <SearchResultPageURL xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">results.aspx</SearchResultPageURL> <TextQuerySectionLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">$Resources:Microsoft.Office.Server.Search,AdvancedSearch_FindDocsWith_Title;</TextQuerySectionLabelText> <ShowAndQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowAndQueryTextBox> <ShowPhraseQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowPhraseQueryTextBox> <ShowOrQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowOrQueryTextBox> <ShowNotQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowNotQueryTextBox> <ScopeSectionLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">$Resources:Microsoft.Office.Server.Search,AdvancedSearch_NarrowSearch_Title;</ScopeSectionLabelText> <ShowLanguageOptions xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowLanguageOptions> <ShowResultTypePicker xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowResultTypePicker> <ShowPropertiesSection xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowPropertiesSection> <PropertiesSectionLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">$Resources:Microsoft.Office.Server.Search,AdvancedSearch_AddPropRestrictions_Title;</PropertiesSectionLabelText> </WebPart> ]]> </AllUsersWebPart> </File>
3. Customize your advanced search page in the browser by modifying the Advanced Search Box web part. 4. Export the web part and copy the contents of the exported web part file into the Elements.xml in the module: <File Path="AdvancedSearchPage\advanced.aspx" Url="employeeAdvanced.aspx" Type="GhostableInLibrary"> <Property Name="PublishingPageLayout" Value="~SiteCollection/_catalogs/masterpage/AdvancedSearchLayout.aspx, $Resources:Microsoft.Office.Server.Search,SearchCenterAdvancedSearchTitle;" /> <Property Name="Title" Value="$Resources:Microsoft.Office.Server.Search,Search_Advanced_Page_Title;" /> <Property Name="ContentType" Value="$Resources:Microsoft.Office.Server.Search,contenttype_welcomepage_name;" /> <AllUsersWebPart WebPartZoneID="MainZone" WebPartOrder="1"> <![CDATA[ <WebPart xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/WebPart/v2"> <Title>Advanced Search Box</Title> <FrameType>None</FrameType> <Description>Displays parameterized search options based on properties and combinations of words.</Description> <IsIncluded>true</IsIncluded> <ZoneID>MainZone</ZoneID> <PartOrder>1</PartOrder> <FrameState>Normal</FrameState> <Height /> <Width /> <AllowRemove>true</AllowRemove> <AllowZoneChange>true</AllowZoneChange> <AllowMinimize>true</AllowMinimize> <AllowConnect>true</AllowConnect> <AllowEdit>true</AllowEdit> <AllowHide>true</AllowHide> <IsVisible>true</IsVisible> <DetailLink /> <HelpLink /> <HelpMode>Modeless</HelpMode> <Dir>Default</Dir> <PartImageSmall /> <MissingAssembly>Cannot import this Web Part.</MissingAssembly> <PartImageLarge /> <IsIncludedFilter /> <Assembly>Microsoft.Office.Server.Search, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c</Assembly> <TypeName>Microsoft.Office.Server.Search.WebControls.AdvancedSearchBox</TypeName> <SearchResultPageURL xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">results.aspx</SearchResultPageURL> <TextQuerySectionLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">Find documents that have...</TextQuerySectionLabelText> <ShowAndQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowAndQueryTextBox> <AndQueryTextBoxLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <ShowPhraseQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowPhraseQueryTextBox> <PhraseQueryTextBoxLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <ShowOrQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowOrQueryTextBox> <OrQueryTextBoxLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <ShowNotQueryTextBox xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowNotQueryTextBox> <NotQueryTextBoxLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <ScopeSectionLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">Narrow the search...</ScopeSectionLabelText> <ShowScopes xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">false</ShowScopes> <ScopeLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <DisplayGroup xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">Advanced Search</DisplayGroup> <ShowLanguageOptions xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">false</ShowLanguageOptions> <LanguagesLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <ShowResultTypePicker xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowResultTypePicker> 
<ResultTypeLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox" /> <ShowPropertiesSection xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">true</ShowPropertiesSection> <PropertiesSectionLabelText xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">Add property restrictions...</PropertiesSectionLabelText> <Properties xmlns="urn:schemas-microsoft-com:AdvancedSearchBox">&lt;root xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"&gt;  &lt;LangDefs&gt;    &lt;LangDef DisplayName="Arabic" LangID="ar"/&gt;    &lt;LangDef DisplayName="Bengali" LangID="bn"/&gt;    &lt;LangDef DisplayName="Bulgarian" LangID="bg"/&gt;    &lt;LangDef DisplayName="Catalan" LangID="ca"/&gt;    &lt;LangDef DisplayName="Simplified Chinese" LangID="zh-cn"/&gt;    &lt;LangDef DisplayName="Traditional Chinese" LangID="zh-tw"/&gt;    &lt;LangDef DisplayName="Croatian" LangID="hr"/&gt;    &lt;LangDef DisplayName="Czech" LangID="cs"/&gt;    &lt;LangDef DisplayName="Danish" LangID="da"/&gt;    &lt;LangDef DisplayName="Dutch" LangID="nl"/&gt;    &lt;LangDef DisplayName="English" LangID="en"/&gt;    &lt;LangDef DisplayName="Finnish" LangID="fi"/&gt;    &lt;LangDef DisplayName="French" LangID="fr"/&gt;    &lt;LangDef DisplayName="German" LangID="de"/&gt;    &lt;LangDef DisplayName="Greek" LangID="el"/&gt;    &lt;LangDef DisplayName="Gujarati" LangID="gu"/&gt;    &lt;LangDef DisplayName="Hebrew" LangID="he"/&gt;    &lt;LangDef DisplayName="Hindi" LangID="hi"/&gt;    &lt;LangDef DisplayName="Hungarian" LangID="hu"/&gt;    &lt;LangDef DisplayName="Icelandic" LangID="is"/&gt;    &lt;LangDef DisplayName="Indonesian" LangID="id"/&gt;    &lt;LangDef DisplayName="Italian" LangID="it"/&gt;    &lt;LangDef DisplayName="Japanese" LangID="ja"/&gt;    &lt;LangDef DisplayName="Kannada" LangID="kn"/&gt;    &lt;LangDef DisplayName="Korean" LangID="ko"/&gt;    &lt;LangDef DisplayName="Latvian" LangID="lv"/&gt;    &lt;LangDef DisplayName="Lithuanian" LangID="lt"/&gt;    &lt;LangDef DisplayName="Malay" LangID="ms"/&gt;    &lt;LangDef DisplayName="Malayalam" LangID="ml"/&gt;    &lt;LangDef DisplayName="Marathi" LangID="mr"/&gt;    &lt;LangDef DisplayName="Norwegian" LangID="no"/&gt;    &lt;LangDef DisplayName="Polish" LangID="pl"/&gt;    &lt;LangDef DisplayName="Portuguese" LangID="pt"/&gt;    &lt;LangDef DisplayName="Punjabi" LangID="pa"/&gt;    &lt;LangDef DisplayName="Romanian" LangID="ro"/&gt;    &lt;LangDef DisplayName="Russian" LangID="ru"/&gt;    &lt;LangDef DisplayName="Slovak" LangID="sk"/&gt;    &lt;LangDef DisplayName="Slovenian" LangID="sl"/&gt;    &lt;LangDef DisplayName="Spanish" LangID="es"/&gt;    &lt;LangDef DisplayName="Swedish" LangID="sv"/&gt;    &lt;LangDef DisplayName="Tamil" LangID="ta"/&gt;    &lt;LangDef DisplayName="Telugu" LangID="te"/&gt;    &lt;LangDef DisplayName="Thai" LangID="th"/&gt;    &lt;LangDef DisplayName="Turkish" LangID="tr"/&gt;    &lt;LangDef DisplayName="Ukrainian" LangID="uk"/&gt;    &lt;LangDef DisplayName="Urdu" LangID="ur"/&gt;    &lt;LangDef DisplayName="Vietnamese" LangID="vi"/&gt;  &lt;/LangDefs&gt;  &lt;Languages&gt;    &lt;Language LangRef="en"/&gt;    &lt;Language LangRef="fr"/&gt;    &lt;Language LangRef="de"/&gt;    &lt;Language LangRef="ja"/&gt;    &lt;Language LangRef="zh-cn"/&gt;    &lt;Language LangRef="es"/&gt;    &lt;Language LangRef="zh-tw"/&gt;  &lt;/Languages&gt;  &lt;PropertyDefs&gt;    &lt;PropertyDef Name="Path" DataType="url" DisplayName="URL"/&gt;    &lt;PropertyDef Name="Size" DataType="integer" DisplayName="Size (bytes)"/&gt;    &lt;PropertyDef Name="Write" 
DataType="datetime" DisplayName="Last Modified Date"/&gt;    &lt;PropertyDef Name="FileName" DataType="text" DisplayName="Name"/&gt;    &lt;PropertyDef Name="Description" DataType="text" DisplayName="Description"/&gt;    &lt;PropertyDef Name="Title" DataType="text" DisplayName="Title"/&gt;    &lt;PropertyDef Name="Author" DataType="text" DisplayName="Author"/&gt;    &lt;PropertyDef Name="DocSubject" DataType="text" DisplayName="Subject"/&gt;    &lt;PropertyDef Name="DocKeywords" DataType="text" DisplayName="Keywords"/&gt;    &lt;PropertyDef Name="DocComments" DataType="text" DisplayName="Comments"/&gt;    &lt;PropertyDef Name="CreatedBy" DataType="text" DisplayName="Created By"/&gt;    &lt;PropertyDef Name="ModifiedBy" DataType="text" DisplayName="Last Modified By"/&gt;    &lt;PropertyDef Name="EmployeeNumber" DataType="text" DisplayName="EmployeeNumber"/&gt;    &lt;PropertyDef Name="EmployeeId" DataType="text" DisplayName="EmployeeId"/&gt;    &lt;PropertyDef Name="EmployeeFirstName" DataType="text" DisplayName="EmployeeFirstName"/&gt;    &lt;PropertyDef Name="EmployeeLastName" DataType="text" DisplayName="EmployeeLastName"/&gt;  &lt;/PropertyDefs&gt;  &lt;ResultTypes&gt;    &lt;ResultType DisplayName="Employee Document" Name="default"&gt;      &lt;KeywordQuery/&gt;      &lt;PropertyRef Name="EmployeeNumber" /&gt;      &lt;PropertyRef Name="EmployeeId" /&gt;      &lt;PropertyRef Name="EmployeeFirstName" /&gt;      &lt;PropertyRef Name="EmployeeLastName" /&gt;    &lt;/ResultType&gt;    &lt;ResultType DisplayName="All Results"&gt;      &lt;KeywordQuery/&gt;      &lt;PropertyRef Name="Author" /&gt;      &lt;PropertyRef Name="Description" /&gt;      &lt;PropertyRef Name="FileName" /&gt;      &lt;PropertyRef Name="Size" /&gt;      &lt;PropertyRef Name="Path" /&gt;      &lt;PropertyRef Name="Write" /&gt;      &lt;PropertyRef Name="CreatedBy" /&gt;      &lt;PropertyRef Name="ModifiedBy" /&gt;    &lt;/ResultType&gt;    &lt;ResultType DisplayName="Documents" Name="documents"&gt;      &lt;KeywordQuery&gt;IsDocument="True"&lt;/KeywordQuery&gt;      &lt;PropertyRef Name="Author" /&gt;      &lt;PropertyRef Name="DocComments"/&gt;      &lt;PropertyRef Name="Description" /&gt;      &lt;PropertyRef Name="DocKeywords"/&gt;      &lt;PropertyRef Name="FileName" /&gt;      &lt;PropertyRef Name="Size" /&gt;      &lt;PropertyRef Name="DocSubject"/&gt;      &lt;PropertyRef Name="Path" /&gt;      &lt;PropertyRef Name="Write" /&gt;      &lt;PropertyRef Name="CreatedBy" /&gt;      &lt;PropertyRef Name="ModifiedBy" /&gt;      &lt;PropertyRef Name="Title"/&gt;    &lt;/ResultType&gt;    &lt;ResultType DisplayName="Word Documents" Name="worddocuments"&gt;      &lt;KeywordQuery&gt;FileExtension="doc" OR FileExtension="docx" OR FileExtension="dot" OR FileExtension="docm" OR FileExtension="odt"&lt;/KeywordQuery&gt;      &lt;PropertyRef Name="Author" /&gt;      &lt;PropertyRef Name="DocComments"/&gt;      &lt;PropertyRef Name="Description" /&gt;      &lt;PropertyRef Name="DocKeywords"/&gt;      &lt;PropertyRef Name="FileName" /&gt;      &lt;PropertyRef Name="Size" /&gt;      &lt;PropertyRef Name="DocSubject"/&gt;      &lt;PropertyRef Name="Path" /&gt;      &lt;PropertyRef Name="Write" /&gt;      &lt;PropertyRef Name="CreatedBy" /&gt;      &lt;PropertyRef Name="ModifiedBy" /&gt;      &lt;PropertyRef Name="Title"/&gt;    &lt;/ResultType&gt;    &lt;ResultType DisplayName="Excel Documents" Name="exceldocuments"&gt;      &lt;KeywordQuery&gt;FileExtension="xls" OR FileExtension="xlsx" OR FileExtension="xlsm" OR 
FileExtension="xlsb" OR FileExtension="ods"&lt;/KeywordQuery&gt;      &lt;PropertyRef Name="Author" /&gt;      &lt;PropertyRef Name="DocComments"/&gt;      &lt;PropertyRef Name="Description" /&gt;      &lt;PropertyRef Name="DocKeywords"/&gt;      &lt;PropertyRef Name="FileName" /&gt;      &lt;PropertyRef Name="Size" /&gt;      &lt;PropertyRef Name="DocSubject"/&gt;      &lt;PropertyRef Name="Path" /&gt;      &lt;PropertyRef Name="Write" /&gt;      &lt;PropertyRef Name="CreatedBy" /&gt;      &lt;PropertyRef Name="ModifiedBy" /&gt;      &lt;PropertyRef Name="Title"/&gt;    &lt;/ResultType&gt;    &lt;ResultType DisplayName="PowerPoint Presentations" Name="presentations"&gt;      &lt;KeywordQuery&gt;FileExtension="ppt" OR FileExtension="pptx" OR FileExtension="pptm" OR FileExtension="odp"&lt;/KeywordQuery&gt;      &lt;PropertyRef Name="Author" /&gt;      &lt;PropertyRef Name="DocComments"/&gt;      &lt;PropertyRef Name="Description" /&gt;      &lt;PropertyRef Name="DocKeywords"/&gt;      &lt;PropertyRef Name="FileName" /&gt;      &lt;PropertyRef Name="Size" /&gt;      &lt;PropertyRef Name="DocSubject"/&gt;      &lt;PropertyRef Name="Path" /&gt;      &lt;PropertyRef Name="Write" /&gt;      &lt;PropertyRef Name="CreatedBy" /&gt;      &lt;PropertyRef Name="ModifiedBy" /&gt;      &lt;PropertyRef Name="Title"/&gt;    &lt;/ResultType&gt;  &lt;/ResultTypes&gt;&lt;/root&gt;</Properties> </WebPart> ]]> </AllUsersWebPart> </File> 5.Deploy your custom solution and you will have a custom advanced search page.

    Read the article

  • JavaScript not working in IE but works in Firefox and Chrome

    - by user1290528
    So I have the following PHP page with a JavaScript function that gets the total for each item based on its quantity, then puts that total into a text box for the item. In IE the text boxes are being filled with $NaN, while in Firefox and Chrome the text boxes are filled with the correct values. Any help would be greatly appreciated. <?php echo $_SESSION['SESS_MEMBER_ID']; require_once('auth.php'); ?> <!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"> <html> <head> <meta content="text/html; charset=ISO-8859-1" http-equiv="Content-Type"> <title>Breakfast Menu</title> <link href="loginmodule.css" rel="stylesheet" type="text/css"> <script type='text/javascript'> var totalarray=new Array(); var totalarray2= new Array(); var runningtotal = 0; var runningtotal2 = 0; var discount = .2; var discounttotal = 0; var discount1 = 0; runningtotal = runningtotal * 1; runningtotal2 = runningtotal2 * 1; function displayResult(price,init) { var newstring = "quantity"+init; var totstring = "total"+init; var quantity = document.getElementById(newstring).value; var quantity = parseFloat(quantity); var test = price * quantity; var test = test.toFixed(2); document.getElementById(newstring).value = quantity; document.getElementById(totstring).value = "$" + test; totalarray[init] = test; getTotal(); } function getTotal(){ runningtotal = 0; var i=0; for (i=0;i<totalarray.length;i++){ totalarray[i] = totalarray[i] *1; runningtotal = runningtotal + totalarray[i]; discounttotal = totalarray[i] * discount; discounttotal = totalarray[i] - discounttotal; /* The next line is where IE shows its first error */ document.getElementById('totalcost').value="$" + runningtotal.toFixed(2); } var orderpart1 = document.getElementById('totalcost').value; var orderpart1 = orderpart1.substr(1); var orderpart1 = orderpart1 * 1; var orderpart2 = document.getElementById('totalcost2').value; var orderpart2 = orderpart2.substr(1); var orderpart2 = orderpart2 * 1; var ordertot = orderpart1 + orderpart2; document.getElementById('ordertotal').value ="$"+ ordertot.toFixed(2) } function displayResult2(price2,init2) { var newstring2 = "quantity2"+init2; var totstring2 = "total2"+init2; var quantity2 = document.getElementById(newstring2).value; var quantity2 = parseFloat(quantity2); var test2 = price2 * quantity2; var test2 = test2.toFixed(2); document.getElementById(newstring2).value = quantity2; document.getElementById(totstring2).value = "$" + test2; totalarray2[init2] = test2; getTotal2(); } function getTotal2(){ runningtotal2 = 0; var i=0; for (i=0;i<totalarray2.length;i++){ totalarray2[i] = totalarray2[i] *1; runningtotal2 = runningtotal2+ totalarray2[i]; /* The next line is where IE shows its second error */ document.getElementById('totalcost2').value="$" + runningtotal2.toFixed(2); } var orderpart1 = document.getElementById('totalcost').value; var orderpart1 = orderpart1.substr(1); var orderpart1 = orderpart1 * 1; var orderpart2 = document.getElementById('totalcost2').value; var orderpart2 = orderpart2.substr(1); var orderpart2 = orderpart2 * 1; var ordertot = orderpart1 + orderpart2; document.getElementById('ordertotal').value ="$"+ ordertot.toFixed(2); } </script> </head> <body> <?php include("newnew.php"); ?> <td style="vertical-align: top; width: 80%; height:80%;"><br> <div style="text-align: center;"> <form action="testplaceorder.php" method="post" onSubmit="return confirm('Are you sure?');"> <h4>Employee Breakfast Order Form</h4> <h1 align="left">Breakfest Foods</h1> <table border='0' cellpadding='0' cellspacing='0'> <tr> <td> 
<table width="100%" border="1"> <tr> <th>Item&nbsp&nbsp&nbsp&nbsp&nbsp</th> <th>Price&nbsp&nbsp&nbsp&nbsp&nbsp </th> <th>Quantity&nbsp&nbsp&nbsp&nbsp&nbsp</th> <th>Total&nbsp&nbsp&nbsp&nbsp&nbsp</th> </tr> <?php mysql_connect("localhost", "seniorproject", "farmingdale123") or die(mysql_error()); mysql_select_db("fsenior") or die(mysql_error()); $result = mysql_query("SELECT name, price,foodid FROM Food where foodtype='br'") or die(mysql_error()); $init = 0; while(list($name, $price, $brId) = mysql_fetch_row($result)) { echo "<tr> <td>$name</td> <td>\$$price</td> <td><select name='quantity$init' id='quantity$init' onchange='displayResult($price,$init)'><option>0</option><option>1</option><option>2</option><option>3</option><option>4</option><option>5</option><option>6</option><option>7</option><option>8</option><option>9</option></td> <td><input name='total$init' type='text' id='total$init' readonly='readonly' value='\$0.00'></td> </tr>" ; echo "<script type='text/javascript'>displayResult($price,$init);</script>"; $foodname = "'SESS_FOODNAME_" . $init . "'"; $foodid = "'SESS_FOODID_" . $init."'"; $_SESSION[$foodname] = $name; $_SESSION[$foodid] = $brId; $init = $init+1; } $_SESSION['SESS_INIT'] = $init; ?> <tr> <td></td> <td></td> <td>Total Cost</td> <td><input name='totalcost' type='text' id='totalcost' readonly='readonly' value='$0.00'></td> </tr> <tr><td></td><td></td><td>Discount</td><td><input name='discountvalue1' id ='discountvalue1' type='text' readonly='readonly' value='20%'></td> </tr> <tr><td></td><td></td><td>Total After Discount</td><td><input name='discounttotal1' id ='discounttotal1' type='text' readonly='readonly' value='$0.00'></td></tr> </table> <tr> <td><br></td> </tr> </table> <h1 align="left">Breakfest Drinks</h1> <table border='0' cellpadding='0' cellspacing='0'> <tr> <td> <table width="100%" border="1"> <tr> <th>Item&nbsp&nbsp&nbsp&nbsp&nbsp</th> <th>Price&nbsp&nbsp&nbsp&nbsp&nbsp </th> <th>Quantity&nbsp&nbsp&nbsp&nbsp&nbsp</th> <th>Total&nbsp&nbsp&nbsp&nbsp&nbsp</th> </tr> <?php mysql_connect("localhost", "****", "***") or die(mysql_error()); mysql_select_db("fsenior") or die(mysql_error()); $result2 = mysql_query("SELECT drinkname, price,drinkid FROM Drinks where drinktype='br'") or die(mysql_error()); $init2 = 0; while(list($name2, $price2, $brId2) = mysql_fetch_row($result2)) { echo "<tr> <td>$name2</td> <td>\$$price2</td> <td><select name='quantity2$init2' id='quantity2$init2' onchange='displayResult2($price2,$init2)'><option>0</option><option>1</option><option>2</option><option>3</option><option>4</option><option>5</option><option>6</option><option>7</option><option>8</option><option>9</option></td> <td><input name='total2$init2' type='text' id='total2$init2' readonly='readonly' value='\$0.00'></td> </tr>" ; echo "<script type='text/javascript'>displayResult2($price2,$init2);</script>"; $drinkname = "'SESS_DRINKNAME_" . $init2 . "'"; $drinkid = "'SESS_DRINKID_" . 
$init2."'"; $_SESSION[$drinkname] = $name2; $_SESSION[$drinkid] = $brId2; $init2 = $init2+1; } $_SESSION['SESS_INIT2'] = $init2; ?> <tr> <td></td> <td></td> <td>Total Cost</td> <td><input name='totalcost2' type='text' id='totalcost2' readonly='readonly' value='$0.00'></td> </tr> </table> <tr> <td><br></td> </tr> </table> <table border="2"> <tr><td>Total Order Cost:</td><td> <?php echo "<input name='ordertotal' type='text' id='ordertotal' readonly='readonly' value='\$0.00'></td></table>"; ?> <p align="left"><input type='submit' name='submit' value='Submit'/></p> </form> </div></td> </tr> </tbody> </table></td> </tr> </tbody> </table> </body> </html>

    Read the article

  • How to improve this piece of code

    - by user303518
    Can anyone help me with this? It may be very frustrating for you all, but I want you to take a moment to look at the code below and tell me all the things that are wrong with it. You can copy it into Visual Studio to analyze it better; you don't have to make this code compile. My task is to produce the following: the most obvious mistakes with this code; all the things that are wrong or bad practices in the code below; and a modified/optimized version of this code that reduces the number of lines (keep in mind, the code DOES NOT need to compile). You should NEVER try to implement something like the code below: public List<ValidationErrorDto> ProcessEQuote(string eQuoteXml, long programUniversalID) { // Get Program Info. DataTable programs = GetAllPrograms(); DataRow[] programRows = programs.Select(string.Format("ProgramUniversalID = {0}", programUniversalID)); long programID = (long)programRows[0]["ProgramID"]; string programName = (string)programRows[0]["Description"]; // Get Config file values. string fromEmail = ConfigurationManager.AppSettings["eQuotesFromEmail"]; string technicalSupportPhone = ConfigurationManager.AppSettings["TechnicalSupportPhone"]; string fromEmailDisplayName = string.IsNullOrEmpty(ConfigurationManager.AppSettings["eQuotesFromDisplayName"]) ? null : string.Format(ConfigurationManager.AppSettings["eQuotesFromDisplayName"], programName); string itEmail = !string.IsNullOrEmpty(ConfigurationManager.AppSettings["ITEmail"]) ? ConfigurationManager.AppSettings["ITEmail"] : string.Empty; string itName = !string.IsNullOrEmpty(ConfigurationManager.AppSettings["ITName"]) ? ConfigurationManager.AppSettings["ITName"] : "IT"; try { List<ValidationErrorDto> allValidationErrors = new List<ValidationErrorDto>(); List<ValidationErrorDto> validationErrors = new List<ValidationErrorDto>(); if (validationErrors.Count == 0) { validationErrors.AddRange(ValidateEQuoteXmlAgainstSchema(eQuoteXml)); if (validationErrors.Count == 0) { XmlDocument eQuoteXmlDoc = new XmlDocument(); eQuoteXmlDoc.LoadXml(eQuoteXml); XmlElement rootElement = eQuoteXmlDoc.DocumentElement; XmlNodeList quotesList = rootElement.SelectNodes("Quote"); foreach (XmlNode node in quotesList) { // Each node should be a quote node but to be safe, check if (node.Name == "Quote") { string groupName = node.SelectSingleNode("Group/GroupName").InnerText; string groupCity = node.SelectSingleNode("Group/GroupCity").InnerText; string groupPostalCode = node.SelectSingleNode("Group/GroupZipCode").InnerText; string groupSicCode = node.SelectSingleNode("Group/GroupSIC").InnerText; string generalAgencyLicenseNumber = node.SelectSingleNode("Group/GALicenseNbr").InnerText; string brokerName = node.SelectSingleNode("Group/BrokerName").InnerText; string deliverToEmailAddress = node.SelectSingleNode("Group/ReturnEmailAddress").InnerText; string brokerEmail = node.SelectSingleNode("Group/BrokerEmail").InnerText; string groupEligibleEmployeeCountString = node.SelectSingleNode("Group/GroupNbrEmployees").InnerText; string quoteEffectiveDateString = node.SelectSingleNode("Group/QuoteEffectiveDate").InnerText; string salesRepName = node.SelectSingleNode("Group/SalesRepName").InnerText; string salesRepPhone = node.SelectSingleNode("Group/SalesRepPhone").InnerText; string salesRepEmail = node.SelectSingleNode("Group/SalesRepEmail").InnerText; string brokerLicenseNumber = node.SelectSingleNode("Group/BrokerLicenseNbr").InnerText; DateTime? 
quoteEffectiveDate = null; int eligibleEmployeeCount = int.Parse(groupEligibleEmployeeCountString); DateTime quoteEffectiveDateOut; if (!DateTime.TryParse(quoteEffectiveDateString, out quoteEffectiveDateOut)) validationErrors.Add(ValidationHelper.CreateValidationError((long)QuoteField.EffectiveDate, "Quote Effective Date", ValidationErrorDto.ValueOutOfRange, false, ValidationHelper.CreateValidationContext(Entity.QuoteDetail, "Quote"))); else quoteEffectiveDate = quoteEffectiveDateOut; Dictionary<string, string> replacementCodeValues = new Dictionary<string, string>(); if (string.IsNullOrEmpty(Resources.ParameterMessageKeys.ResourceManager.GetString("GroupName"))) throw new InvalidOperationException("GroupName key is not configured"); replacementCodeValues.Add(Resources.ParameterMessageKeys.GroupName, groupName); replacementCodeValues.Add(Resources.ParameterMessageKeys.ProgramName, programName); replacementCodeValues.Add(Resources.ParameterMessageKeys.SalesRepName, salesRepName); replacementCodeValues.Add(Resources.ParameterMessageKeys.SalesRepPhone, salesRepPhone); replacementCodeValues.Add(Resources.ParameterMessageKeys.SalesRepEmail, salesRepEmail); replacementCodeValues.Add(Resources.ParameterMessageKeys.TechnicalSupportPhone, technicalSupportPhone); replacementCodeValues.Add(Resources.ParameterMessageKeys.EligibleEmployeCount, eligibleEmployeeCount.ToString()); // Retrieve the CityID and StateID long? cityID = null; long? stateID = null; DataSet citiesAndStates = Addresses.GetCitiesAndStatesFromPostalCode(groupPostalCode); DataTable cities = citiesAndStates.Tables[0]; DataTable states = citiesAndStates.Tables[1]; DataRow[] cityRows = cities.Select(string.Format("Name = '{0}'", groupCity)); if (cityRows.Length > 0) { cityID = (long)cityRows[0]["CityID"]; DataRow[] stateRows = states.Select(string.Format("CityID = {0}", cityID)); if (stateRows.Length > 0) stateID = (long)stateRows[0]["StateID"]; } // If the StateID does not exist, then we cannot get the GeneralAgency, so set a validation error and do not contine. // Else, Continue and look for an General Agency. If a GA was found, look for or create a Broker. Then look for or create a Prospect Group // Then using the info, create a quote. if (!stateID.HasValue) validationErrors.Add(ValidationHelper.CreateValidationError((long)ProspectGroupField.State, "Prospect Group State", ValidationErrorDto.RequiredFieldMissing, false, ValidationHelper.CreateValidationContext(Entity.ProspectGroup, "Prospect Group"))); bool brokerValidationError = false; SalesRepDto salesRep = GetSalesRepByEmail(salesRepEmail, ref validationErrors); if (salesRep == null) { string exceptionMessage = "Sales Rep Not found in Opportunity System. Please make sure Sales Rep is present in the system by adding the sales rep in AWP SR Add Screen." + Environment.NewLine; exceptionMessage = exceptionMessage + " Sales Rep Name: " + salesRepName + Environment.NewLine; exceptionMessage = exceptionMessage + " Sales Rep Email: " + salesRepEmail + Environment.NewLine; exceptionMessage = exceptionMessage + " Module : E-Quote Service" + Environment.NewLine; throw new Exception(exceptionMessage); } if (validationErrors.Count == 0) { // Note that StateID and EffectiveDate should be valid at this point. If it weren't there would be validation errors. // Find General Agency long? generalAgencyID; validationErrors.AddRange(GetEQuoteGeneralAgency(generalAgencyLicenseNumber, stateID.Value, out generalAgencyID)); // If GA was found, check for Broker. 
if (validationErrors.Count == 0 && generalAgencyID.HasValue) { Dictionary<string, string> brokerNameParts = ContactHelper.GetNamePartsFromFullName(brokerName); long? brokerID; validationErrors.AddRange(CreateEQuoteBroker(brokerNameParts["FirstName"], brokerNameParts["LastName"], brokerEmail, brokerLicenseNumber, stateID.Value, generalAgencyID.Value, salesRep, programID, out brokerID)); // If Broker was found but had validation errors if (validationErrors.Count > 0) { DeliverEmailMessage(programID, quoteEffectiveDate.Value, fromEmail, fromEmailDisplayName, itEmail, DocumentType.EQuoteBrokerValidationFailureMessageEmail, replacementCodeValues); brokerValidationError = true; } // If Broker was found, check for Prospect Group if (validationErrors.Count == 0 && brokerID.HasValue) { long? prospectGroupID; validationErrors.AddRange(CreateEQuoteProspectGroup(groupName, cityID, stateID, groupPostalCode, groupSicCode, brokerID.Value, out prospectGroupID)); if (validationErrors.Count == 0 && prospectGroupID.HasValue) { if (validationErrors.Count == 0) { long? quoteID; validationErrors.AddRange(CreateEQuote(programID, prospectGroupID.Value, generalAgencyID.Value, quoteEffectiveDate.Value, eligibleEmployeeCount, deliverToEmailAddress, node.SelectNodes("Employees/Employee"), deliverToEmailAddress, out quoteID)); if (validationErrors.Count == 0 && quoteID.HasValue) { QuoteFromServiceDto quoteFromService = GetQuoteByQuoteID(quoteID.Value); // Generate Pre-Message replacementCodeValues.Add(Resources.ParameterMessageKeys.QuoteNumber, string.Format("{0}.{1}", quoteFromService.QuoteNumber, quoteFromService.QuoteVersion)); replacementCodeValues.Add(Resources.ParameterMessageKeys.Name, brokerName); replacementCodeValues.Add(Resources.ParameterMessageKeys.LicenseNumbers, brokerLicenseNumber); DeliverEmailMessage(programID, quoteEffectiveDate.Value, fromEmail, fromEmailDisplayName, deliverToEmailAddress, DocumentType.EQuotePreMessageEmail, replacementCodeValues); bool quoteGenerated = false; try { quoteGenerated = GenerateAndDeliverInitialQuote(quoteID.Value); } catch (Exception exception) { TraceLogger.LogException(exception, LoggingCategory); if (replacementCodeValues.ContainsKey(Resources.ParameterMessageKeys.Name)) replacementCodeValues[Resources.ParameterMessageKeys.Name] = itName; else replacementCodeValues.Add(Resources.ParameterMessageKeys.Name, itName); if (replacementCodeValues.ContainsKey(Resources.ParameterMessageKeys.Errors)) replacementCodeValues[Resources.ParameterMessageKeys.Errors] = string.Format("Errors:\r\n:{0}", exception); else replacementCodeValues.Add(Resources.ParameterMessageKeys.Errors, string.Format("Errors:\r\n:{0}", exception)); DeliverEmailMessage(programID, quoteEffectiveDate.Value, fromEmail, fromEmailDisplayName, itEmail, DocumentType.EQuoteSystemFailureMessageEmail, replacementCodeValues); } if (!quoteGenerated) { // Generate System Failure Message if (replacementCodeValues.ContainsKey(Resources.ParameterMessageKeys.Name)) replacementCodeValues[Resources.ParameterMessageKeys.Name] = brokerName; else replacementCodeValues.Add(Resources.ParameterMessageKeys.Name, brokerName); if (replacementCodeValues.ContainsKey(Resources.ParameterMessageKeys.Errors)) replacementCodeValues[Resources.ParameterMessageKeys.Errors] = string.Empty; else replacementCodeValues.Add(Resources.ParameterMessageKeys.Errors, string.Empty); DeliverEmailMessage(programID, quoteEffectiveDate.Value, fromEmail, fromEmailDisplayName, itEmail, DocumentType.EQuoteSystemFailureMessageEmail, replacementCodeValues); } } 
} } } } } //if (validationErrors.Count > 0) // Per spec, if Broker Validation returned an error we already sent an email, don't send another generic one if (validationErrors.Count > 0 && (!brokerValidationError)) { StringBuilder errorString = new StringBuilder(); foreach (ValidationErrorDto validationError in validationErrors) errorString = errorString.AppendLine(string.Format(" - {0}", ValidationHelper.GetValidationErrorReason(validationError, true))); replacementCodeValues.Add(Resources.ParameterMessageKeys.Errors, errorString.ToString()); if (replacementCodeValues.ContainsKey(Resources.ParameterMessageKeys.Name)) replacementCodeValues[Resources.ParameterMessageKeys.Name] = brokerName; else replacementCodeValues.Add(Resources.ParameterMessageKeys.Name, brokerName); // HACK: If there is no effective date, then use Today's date. Do we care about the effecitve dat on validation message? if (quoteEffectiveDate.HasValue) DeliverEmailMessage(programID, quoteEffectiveDate.Value, fromEmail, fromEmailDisplayName, itEmail, DocumentType.EQuoteValidationFailureMessageEmail, replacementCodeValues); else DeliverEmailMessage(programID, DateTime.Now, fromEmail, fromEmailDisplayName, itEmail, DocumentType.EQuoteValidationFailureMessageEmail, replacementCodeValues); } allValidationErrors.AddRange(validationErrors); validationErrors.Clear(); } } } else { // Use todays date as the effective date. Dictionary<string, string> replacementCodeValues = new Dictionary<string, string>(); StringBuilder errorString = new StringBuilder(); foreach (ValidationErrorDto validationError in validationErrors) errorString = errorString.AppendLine(string.Format(" - {0}", ValidationHelper.GetValidationErrorReason(validationError, true))); replacementCodeValues.Add(Resources.ParameterMessageKeys.Errors, string.Format("The following validation errors occurred: \r\n{0}", errorString)); replacementCodeValues.Add(Resources.ParameterMessageKeys.ProgramName, programName); replacementCodeValues.Add(Resources.ParameterMessageKeys.GroupName, "Group"); replacementCodeValues.Add(Resources.ParameterMessageKeys.Name, itName); DeliverEmailMessage(programID, DateTime.Now, fromEmail, null, itEmail, DocumentType.EQuoteSystemFailureMessageEmail, replacementCodeValues); allValidationErrors.AddRange(validationErrors); validationErrors.Clear(); } } return allValidationErrors; } catch (Exception exception) { TraceLogger.LogException(exception, LoggingCategory); // Use todays date as the effective date. Dictionary<string, string> replacementCodeValues = new Dictionary<string, string>(); replacementCodeValues.Add(Resources.ParameterMessageKeys.ProgramName, programName); replacementCodeValues.Add(Resources.ParameterMessageKeys.GroupName, "Group"); replacementCodeValues.Add(Resources.ParameterMessageKeys.Name, itName); replacementCodeValues.Add(Resources.ParameterMessageKeys.Errors, string.Format("Errors:\r\n:{0}", exception)); DeliverEmailMessage(programID, DateTime.Now, fromEmail, null, itEmail, DocumentType.EQuoteSystemFailureMessageEmail, replacementCodeValues); throw new FaultException(exception.ToString()); } }
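Not a full rewrite, but a sketch of two of the highest-value improvements; the QuoteGroupInfo type and readText helper below are illustrative names, not part of the original code. First, every occurrence of the ContainsKey/Add dance collapses to a single indexer assignment, since the Dictionary<TKey, TValue> indexer adds or overwrites in one call. Second, pulling the per-quote SelectSingleNode reads into a small parsing helper shrinks the main method to orchestration and makes the int.Parse crash path explicit:

    using System;
    using System.Collections.Generic;
    using System.Xml;

    // Hypothetical DTO gathering the Group/* reads scattered through the loop.
    class QuoteGroupInfo
    {
        public string GroupName, GroupCity, GroupPostalCode, BrokerName, BrokerEmail;
        public int EligibleEmployeeCount;
        public DateTime? QuoteEffectiveDate;

        public static QuoteGroupInfo Parse(XmlNode quoteNode)
        {
            Func<string, string> readText = xpath => quoteNode.SelectSingleNode(xpath).InnerText;
            var info = new QuoteGroupInfo
            {
                GroupName = readText("Group/GroupName"),
                GroupCity = readText("Group/GroupCity"),
                GroupPostalCode = readText("Group/GroupZipCode"),
                BrokerName = readText("Group/BrokerName"),
                BrokerEmail = readText("Group/BrokerEmail")
            };
            int employees; // int.Parse in the original throws on bad input; TryParse does not
            if (int.TryParse(readText("Group/GroupNbrEmployees"), out employees))
                info.EligibleEmployeeCount = employees;
            DateTime effective;
            if (DateTime.TryParse(readText("Group/QuoteEffectiveDate"), out effective))
                info.QuoteEffectiveDate = effective;
            return info;
        }
    }

    class Example
    {
        static void Main()
        {
            var replacementCodeValues = new Dictionary<string, string>();
            // The repeated if (ContainsKey) { ... } else { Add(...) } pattern is just:
            replacementCodeValues["Name"] = "IT"; // indexer adds or overwrites
        }
    }

Beyond that, the obvious structural fixes are guard clauses with early returns in place of the eight-level nesting, catching specific exception types instead of Exception, and not sending the raw exception text back through FaultException.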

    Read the article

  • custom collection in property grid

    - by guyl
    Hi guys, I'm using this article as a reference for using a custom collection in a PropertyGrid: LINK. When I open the CollectionEditor, remove all the items, and then press OK, I get a null exception. How can I solve that? I am using: public T this[int index] { get { if (List.Count == 0) { return default(T); } else { return (T)this.List[index]; } } } as the getter for an item; of course, if I have no objects, how can I reset the whole collection? This is the whole code: /// <summary> /// A generic folder settings collection to use in a property grid. /// </summary> /// <typeparam name="T">can be import or export folder settings.</typeparam> [Serializable] [TypeConverter(typeof(FolderSettingsCollectionConverter)), Editor(typeof(FolderSettingsCollectionEditor), typeof(UITypeEditor))] public class FolderSettingsCollection_New<T> : CollectionBase, ICustomTypeDescriptor { private bool m_bRestrictNumberOfItems; private int m_bNumberOfItems; private Dictionary<string, int> m_UID2Idx = new Dictionary<string, int>(); private T[] arrTmp; /// <summary> /// C'tor, can determine the number of objects to hold. /// </summary> /// <param name="bRestrictNumberOfItems">restrict the number of folders to hold.</param> /// <param name="iNumberOfItems">The number of folders to hold.</param> public FolderSettingsCollection_New(bool bRestrictNumberOfItems = false , int iNumberOfItems = 1) { m_bRestrictNumberOfItems = bRestrictNumberOfItems; m_bNumberOfItems = iNumberOfItems; } /// <summary> /// Add folder to collection. /// </summary> /// <param name="t">Folder to add.</param> public void Add(T t) { if (m_bRestrictNumberOfItems) { if (this.List.Count >= m_bNumberOfItems) { return; } } int index = this.List.Add(t); if (t is WriteDataFolderSettings || t is ReadDataFolderSettings) { FolderSettingsBase tmp = t as FolderSettingsBase; m_UID2Idx.Add(tmp.UID, index); } } /// <summary> /// Remove folder from collection. /// </summary> /// <param name="t">Folder to remove.</param> public void Remove(T t) { this.List.Remove(t); if (t is WriteDataFolderSettings || t is ReadDataFolderSettings) { FolderSettingsBase tmp = t as FolderSettingsBase; m_UID2Idx.Remove(tmp.UID); } } /// <summary> /// Gets or sets a folder. /// </summary> /// <param name="index">The index of the folder in the collection.</param> /// <returns>A folder object.</returns> public T this[int index] { get { //if (List.Count == 0) //{ // return default(T); //} //else //{ return (T)this.List[index]; //} } } /// <summary> /// Gets or sets a folder. 
/// </summary> /// <param name="sUID">The UID of the folder.</param> /// <returns>A folder object.</returns> public T this[string sUID] { get { if (this.Count == 0 || !m_UID2Idx.ContainsKey(sUID)) { return default(T); } else { return (T)this.List[m_UID2Idx[sUID]]; } } } /// <summary> /// /// </summary> /// <param name="sUID"></param> /// <returns></returns> public bool ContainsItemByUID(string sUID) { return m_UID2Idx.ContainsKey(sUID); } /// <summary> /// /// </summary> /// <returns></returns> public String GetClassName() { return TypeDescriptor.GetClassName(this, true); } /// <summary> /// /// </summary> /// <returns></returns> public AttributeCollection GetAttributes() { return TypeDescriptor.GetAttributes(this, true); } /// <summary> /// /// </summary> /// <returns></returns> public String GetComponentName() { return TypeDescriptor.GetComponentName(this, true); } /// <summary> /// /// </summary> /// <returns></returns> public TypeConverter GetConverter() { return TypeDescriptor.GetConverter(this, true); } /// <summary> /// /// </summary> /// <returns></returns> public EventDescriptor GetDefaultEvent() { return TypeDescriptor.GetDefaultEvent(this, true); } /// <summary> /// /// </summary> /// <returns></returns> public PropertyDescriptor GetDefaultProperty() { return TypeDescriptor.GetDefaultProperty(this, true); } /// <summary> /// /// </summary> /// <param name="editorBaseType"></param> /// <returns></returns> public object GetEditor(Type editorBaseType) { return TypeDescriptor.GetEditor(this, editorBaseType, true); } /// <summary> /// /// </summary> /// <param name="attributes"></param> /// <returns></returns> public EventDescriptorCollection GetEvents(Attribute[] attributes) { return TypeDescriptor.GetEvents(this, attributes, true); } /// <summary> /// /// </summary> /// <returns></returns> public EventDescriptorCollection GetEvents() { return TypeDescriptor.GetEvents(this, true); } /// <summary> /// /// </summary> /// <param name="pd"></param> /// <returns></returns> public object GetPropertyOwner(PropertyDescriptor pd) { return this; } /// <summary> /// /// </summary> /// <param name="attributes"></param> /// <returns></returns> public PropertyDescriptorCollection GetProperties(Attribute[] attributes) { return GetProperties(); } /// <summary> /// Called to get the properties of this type. /// </summary> /// <returns></returns> public PropertyDescriptorCollection GetProperties() { // Create a collection object to hold property descriptors PropertyDescriptorCollection pds = new PropertyDescriptorCollection(null); // Iterate the list of employees for (int i = 0; i < this.List.Count; i++) { // Create a property descriptor for the employee item and add to the property descriptor collection CollectionPropertyDescriptor_New<T> pd = new CollectionPropertyDescriptor_New<T>(this, i); pds.Add(pd); } // return the property descriptor collection return pds; } public T[] ToArray() { if (arrTmp == null) { arrTmp = new T[List.Count]; for (int i = 0; i < List.Count; i++) { arrTmp[i] = (T)List[i]; } } return arrTmp; } } /// <summary> /// Enable to display data about a collection in a property grid. 
/// </summary> /// <typeparam name="T">Folder object.</typeparam> public class CollectionPropertyDescriptor_New<T> : PropertyDescriptor { private FolderSettingsCollection_New<T> collection = null; private int index = -1; /// <summary> /// /// </summary> /// <param name="coll"></param> /// <param name="idx"></param> public CollectionPropertyDescriptor_New(FolderSettingsCollection_New<T> coll, int idx) : base("#" + idx.ToString(), null) { this.collection = coll; this.index = idx; } /// <summary> /// /// </summary> public override AttributeCollection Attributes { get { return new AttributeCollection(null); } } /// <summary> /// /// </summary> /// <param name="component"></param> /// <returns></returns> public override bool CanResetValue(object component) { return true; } /// <summary> /// /// </summary> public override Type ComponentType { get { return this.collection.GetType(); } } /// <summary> /// /// </summary> public override string DisplayName { get { if (this.collection[index] != null) { return this.collection[index].ToString(); } else { return null; } } } public override string Description { get { return ""; } } /// <summary> /// /// </summary> /// <param name="component"></param> /// <returns></returns> public override object GetValue(object component) { if (this.collection[index] != null) { return this.collection[index]; } else { return null; } } /// <summary> /// /// </summary> public override bool IsReadOnly { get { return false; } } public override string Name { get { return "#" + index.ToString(); } } /// <summary> /// /// </summary> public override Type PropertyType { get { return this.collection[index].GetType(); } } public override void ResetValue(object component) { } /// <summary> /// /// </summary> /// <param name="component"></param> /// <returns></returns> public override bool ShouldSerializeValue(object component) { return true; } /// <summary> /// /// </summary> /// <param name="component"></param> /// <param name="value"></param> public override void SetValue(object component, object value) { // this.collection[index] = value; } }
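One way to stop the null exception when the editor empties the collection, offered as a sketch based only on the code shown: bounds-check the indexer instead of special-casing an empty list, and make PropertyType (which currently calls GetType() on a possibly missing item) fall back to typeof(T):

    // In FolderSettingsCollection_New<T>: return default(T) for any
    // out-of-range index, not just when the list is empty.
    public T this[int index]
    {
        get
        {
            if (index < 0 || index >= List.Count)
                return default(T);
            return (T)this.List[index];
        }
    }

    // In CollectionPropertyDescriptor_New<T>: the original PropertyType getter
    // dereferences this.collection[index] unconditionally, which throws once
    // all items have been removed. Fall back to typeof(T) instead.
    public override Type PropertyType
    {
        get
        {
            object item = this.collection[index];
            return item != null ? item.GetType() : typeof(T);
        }
    }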

    Read the article

  • PHP issue json_decode echo array

    - by dfdf
    <?php $string = file_get_contents("http://www.reddit.com/r/news.json"); $array = json_decode($string, true); echo $array['data'][0]['children'][0]['data'][0]['title'][0]; ?> I've got a problem- the code doesn't echo anything. I'm kinda new to json_decode, so any help is appreciated :-) Edit: As a response to a comment, here's what print_r results to: Array ( [kind] => Listing [data] => Array ( [modhash] => [children] => Array ( [0] => Array ( [kind] => t3 [data] => Array ( [domain] => syracuse.com [banned_by] => [media_embed] => Array ( ) [subreddit] => news [selftext_html] => [selftext] => [likes] => [secure_media] => [link_flair_text] => [id] => 1qdtqr [secure_media_embed] => Array ( ) [clicked] => [stickied] => [author] => cadencehz [media] => [score] => 1552 [approved_by] => [over_18] => [hidden] => [thumbnail] => [subreddit_id] => t5_2qh3l [edited] => [link_flair_css_class] => [author_flair_css_class] => [downs] => 978 [saved] => [is_self] => [permalink] => /r/news/comments/1qdtqr/thousands_defend_grocery_store_employee_with/ [name] => t3_1qdtqr [created] => 1384215998 [url] => http://www.syracuse.com/news/index.ssf/2013/11/thousands_come_to_defense_of_clay_wegmans_employee_with_aspergers_syndrome_after.html#incart_m-rpt-2 [author_flair_text] => [title] => Thousands defend grocery store employee with Asperger's syndrome after customer yells at him for being too slow [created_utc] => 1384187198 [ups] => 2530 [num_comments] => 401 [visited] => [num_reports] => [distinguished] => ) ) [1] => Array ( [kind] => t3 [data] => Array ( [domain] => bostonherald.com [banned_by] => [media_embed] => Array ( ) [subreddit] => news [selftext_html] => [selftext] => [likes] => [secure_media] => [link_flair_text] => [id] => 1qddl6 [secure_media_embed] => Array ( ) [clicked] => [stickied] => [author] => boxofrain [media] => [score] => 2086 [approved_by] => [over_18] => [hidden] => [thumbnail] => [subreddit_id] => t5_2qh3l [edited] => [link_flair_css_class] => [author_flair_css_class] => [downs] => 2556 [saved] => [is_self] => [permalink] => /r/news/comments/1qddl6/motorcycle_stolen_in_1961_found_is_recovered_and/ [name] => t3_1qddl6 [created] => 1384199801 [url] => http://bostonherald.com/news_opinion/offbeat_news/2013/11/man_glad_stolen_motorcycle_found_after_46_years [author_flair_text] => [title] => Motorcycle stolen in 1961 found is recovered and will be returned to it 73 year old owner. [created_utc] => 1384171001 [ups] => 4642 [num_comments] => 141 [visited] => [num_reports] => [distinguished] => ) ) [2] => Array ( [kind] => t3 [data] => Array ( [domain] => hosted.ap.org [banned_by] => [media_embed] => Array ( ) [subreddit] => news [selftext_html] => [selftext] => [likes] => [secure_media] => [link_flair_text] => [id] => 1qe6gp [secure_media_embed] => Array ( ) [clicked] => [stickied] => [author] => donkey-kick [media] => [score] => 415 [approved_by] => [over_18] => [hidden] => [thumbnail] => [subreddit_id] => t5_2qh3l [edited] => [link_flair_css_class] => [author_flair_css_class] => [downs] => 334 [saved] => [is_self] => [permalink] => /r/news/comments/1qe6gp/atheist_mega_churches_take_hold_in_the_us_and/ [name] => t3_1qe6gp [created] => 1384224670 [url] => http://hosted.ap.org/dynamic/stories/U/US_ATHEIST_MEGACHURCH?SITE=AP&amp;SECTION=HOME&amp;TEMPLATE=DEFAULT&amp;CTIME=2013-11-10-17-03-15 [author_flair_text] => [title] => Atheist Mega Churches take hold in the US and around the world. 
[created_utc] => 1384195870 [ups] => 749 [num_comments] => 368 [visited] => [num_reports] => [distinguished] => ) ) [3] => Array ( [kind] => t3 [data] => Array ( [domain] => abcnews.go.com [banned_by] => [media_embed] => Array ( ) [subreddit] => news [selftext_html] => [selftext] => [likes] => [secure_media] => [link_flair_text] => [id] => 1qdie4 [secure_media_embed] => Array ( ) [clicked] => [stickied] => [author] => Hoosier_made [media] => [score] => 984 [approved_by] => [over_18] => [hidden] => [thumbnail] => [subreddit_id] => t5_2qh3l [edited] => [link_flair_css_class] => [author_flair_css_class] => [downs] => 400 [saved] => [is_self] => [permalink] => /r/news/comments/1qdie4/abc_news_amy_robach_has_mammogram_live_on_gma/ [name] => t3_1qdie4 [created] => 1384206209 [url] => http://abcnews.go.com/blogs/health/2013/11/11/abc-news-amy-robach-reveals-breast-cancer-diagnosis/ [author_flair_text] => [title] => ABC News’ Amy Robach has mammogram live on GMA. Results Reveal Breast Cancer Diagnosis. [created_utc] => 1384177409 [ups] => 1384 [num_comments] => 130 [visited] => [num_reports] => [distinguished] => ) ) // ECT... I CUT HERE BECAUSE IT WENT ON FOR A WHILE [after] => t3_1qdinr [before] => ) ) Here's the url to the exact output, I had to trim the code to fit the character limit so maybe there's something I messed up ! :P Okay: Link goes here...
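Judging from the print_r dump above, the stray [0] indexes are the problem: data is an associative key and title is a plain string, so only children needs a numeric index. A minimal sketch of the corrected access:

    <?php
    // Path per the dump: $array['data']['children'][N]['data']['title']
    $string = file_get_contents("http://www.reddit.com/r/news.json");
    $array  = json_decode($string, true);
    echo $array['data']['children'][0]['data']['title'];
    ?>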

    Read the article

  • Is there a Telecommunications Reference Architecture?

    - by raul.goycoolea
    Abstract

Reference architecture provides needed architectural information that can be provided in advance to an enterprise to enable consistent architectural best practices. Enterprise Reference Architecture helps business owners to actualize their strategies, vision, objectives, and principles. It evaluates the IT systems based on Reference Architecture goals, principles, and standards, and it helps to reduce IT costs by increasing functionality, availability, scalability, etc. Telecom Reference Architecture provides customers with the flexibility to view bundled service bills online with the provision of multiple services. It provides real-time, flexible billing and charging systems to handle complex promotions, discounts, and settlements with multiple parties. This paper attempts to describe the Reference Architecture for Telecom Enterprises. It lays the foundation for a Telecom Reference Architecture by articulating the requirements, drivers, and pitfalls for telecom service providers. It describes a generic reference architecture for telecom enterprises and moves on to explain how to achieve an Enterprise Reference Architecture by using SOA.

Introduction

A Reference Architecture provides a methodology, set of practices, template, and standards based on a set of successful solutions implemented earlier. These solutions have been generalized and structured for the depiction of both a logical and a physical architecture, based on the harvesting of a set of patterns that describe observations in a number of successful implementations. It helps as a reference for the various architectures that an enterprise can implement to solve various problems. It can be used as the starting point or the point of comparison for the various departments/business entities of a company, or for the various companies of an enterprise. It provides multiple views for multiple stakeholders.

Major artifacts of the Enterprise Reference Architecture are methodologies, standards, metadata, documents, design patterns, etc.

Purpose of Reference Architecture

In most cases, architects spend a lot of time researching, investigating, defining, and re-arguing architectural decisions. It is like reinventing the wheel, as their peers in other organizations or even the same organization have already spent a lot of time and effort defining their own architectural practices. This prevents an organization from learning from its own experiences and applying that knowledge for increased effectiveness.
Reference architecture provides missing architectural information that can be provided in advance to project team members to enable consistent architectural best practices.

Enterprise Reference Architecture helps an enterprise to achieve the following at the abstract level:

· Reference architecture is more of a communication channel to an enterprise
· Helps the business owners to actualize their strategies, vision, objectives, and principles
· Evaluates the IT systems based on Reference Architecture principles
· Reduces IT spending through increasing functionality, availability, scalability, etc.
· A real-time integration model helps to reduce the latency of the data updates
· Is used to define a single source of information
· Provides a clear view on how to manage information and security
· Defines the policy around the data ownership, product boundaries, etc.
· Helps with cost optimization across project and solution portfolios by eliminating unused or duplicate investments and assets
· Has a shorter implementation time and cost

Once the reference architecture is in place, the set of architectural principles, standards, reference models, and best practices ensure that the aligned investments have the greatest possible likelihood of success in both the near term and the long term (TCO).

Common pitfalls for Telecom Service Providers

Telecom Reference Architecture serves as the first step towards maturity for a telecom service provider. During the course of our assignments/experiences with telecom players, we have come across the following observations; some of these indicate a lack of maturity of the telecom service provider:

· In markets that are growing and not so mature, it has been observed that telcos have a significant amount of in-house or home-grown applications. In some of these markets, the growth has been so rapid that IT has been unable to cope with business demands. Telcos have shown a tendency to come up with workarounds in their IT applications so as to meet business needs.
· Even for core functions like provisioning or mediation, some telcos have tried to manage with home-grown applications.
· Most of the applications do not have the required scalability or maintainability to sustain growth in volumes or functionality.
· Applications face interoperability issues with other applications in the operator's landscape. Integrating a new application or network element requires considerable effort on the part of the other applications.
· Application boundaries are not clear, and functionality that is not in the initial scope of that application gets pushed onto it. This results in the development of multiple, small applications without proper boundaries.
· Usage of legacy OSS/BSS systems, and poor integration across multiple COTS products and internal systems. Most of the integrations are developed on an ad-hoc, point-to-point basis.
· Redundancy of the business functions in different applications
· Fragmented data across the different applications and no integrated view of the strategic data
· A lot of performance issues due to the usage of complex integration across OSS and BSS systems

However, this is where the maturity of the telecom industry as a whole can be of help. The collaborative efforts of telcos to overcome some of these problems have resulted in bodies like the TM Forum.
They have come up with frameworks for business processes, data, applications, and technology for telecom service providers. These could be a good starting point for telcos to clean up their enterprise landscape.

Industry Trends in Telecom Reference Architecture

Telecom reference architectures are evolving rapidly because telcos are facing business and IT challenges.

“The reality is that there probably is no killer application, no silver bullet that the telcos can latch onto to carry them into a 21st Century.... Instead, there are probably hundreds – perhaps thousands – of niche applications.... And the only way to find which of these works for you is to try out lots of them, ramp up the ones that work, and discontinue the ones that fail.” – Martin Creaner, President & CTO, TM Forum.

The following trends have been observed in telecom reference architecture:

· Transformation of business structures to align with customer requirements
· Adoption of more Internet-like technical architectures; the Web 2.0 concept is increasingly being used
· Virtualization of the traditional operations support system (OSS)
· Adoption of SOA to support development of IP-based services
· Adoption of frameworks like Service Delivery Platforms (SDPs) and IP Multimedia Subsystem (IMS) to enable seamless deployment of various services over fixed and mobile networks
· Replacement of in-house, customized, and stove-piped OSS/BSS with standards-based COTS products
· Compliance with industry standards and frameworks like eTOM, SID, and TAM to enable seamless integration with other standards-based products

Drivers of Reference Architecture

The drivers of the Reference Architecture are Reference Architecture Goals, Principles, and Enterprise Vision and Telecom Transformation. The details are depicted in the diagram below.

Figure 1. Drivers for Reference Architecture

Today’s telecom reference architectures should seamlessly integrate traditional legacy-based applications and transition to next-generation network technologies (e.g., IP multimedia subsystems).
This has resulted in new requirements for flexible, real-time billing and OSS/BSS systems, with implications for the service provider’s organizational requirements and structure.   Telecom reference architectures are today expected to:   ·       Integrate voice, messaging, email, and other value-added services (VAS) over fixed and mobile networks and back-end systems ·       Be able to provision multiple services and service bundles ·       Deliver converged voice, video, and data services ·       Leverage the existing network infrastructure ·       Provide real-time, flexible billing and charging systems to handle complex promotions, discounts, and settlements with multiple parties ·       Support charging of advanced data services such as VoIP, on-demand services (e.g., video), IMS/SIP services, mobile money, content services, and IPTV ·       Help in faster deployment of new services ·       Serve as an effective platform for collaboration between network, IT, and business organizations ·       Harness the potential of converging technology, networks, devices, and content to develop multimedia services and solutions of ever-increasing sophistication over a single Internet Protocol (IP) network ·       Ensure better service delivery and zero revenue leakage through real-time balance and credit management ·       Lower operating costs to drive profitability   Enterprise Reference Architecture   The Enterprise Reference Architecture (RA) fills the gap between the concepts and vocabulary defined by the reference model and the implementation. Reference architecture provides detailed architectural information in a common format such that solutions can be repeatedly designed and deployed in a consistent, high-quality, supportable fashion. This paper attempts to describe the Reference Architecture for Telecom Application Usage and how to achieve an enterprise-level Reference Architecture using SOA:   ·       Telecom Reference Architecture ·       Enterprise SOA-based Reference Architecture   Telecom Reference Architecture   The TeleManagement Forum’s New Generation Operations Systems and Software (NGOSS) is an architectural framework for organizing, integrating, and implementing telecom systems. NGOSS is a component-based framework consisting of the following elements:   ·       The enhanced Telecom Operations Map (eTOM) is a business process framework. ·       The Shared Information/Data (SID) model provides a comprehensive information framework that may be specialized for the needs of a particular organization. ·       The Telecom Application Map (TAM) is an application framework that depicts the functional footprint of applications relative to the horizontal processes within eTOM. ·       The Technology Neutral Architecture (TNA) is an integrated framework; it is an architecture that remains sustainable through technology changes.   The NGOSS architecture standards are:   ·       Centralized data ·       Loosely coupled distributed systems ·       Application component re-use ·       A technology-neutral system framework with technology-specific implementations ·       Interoperability with service provider data/processes ·       Greater re-use of business components across multiple business scenarios ·       Workflow automation   The traditional operator systems architecture consists of four layers:   ·       Business Support System (BSS) layer, focused on customers and business partners. Manages order, subscriber, pricing, rating, and billing information.
·       Operations Support System (OSS) layer, built around product, service, and resource inventories. ·       Networks layer – consists of network elements and 3rd-party systems. ·       Integration layer – to maximize application communication and overall solution flexibility.   The reference architecture for telecom enterprises is depicted below. Figure 2. Telecom Reference Architecture   The major building blocks of any Telecom Service Provider architecture are as follows:   1. Customer Relationship Management   CRM encompasses the end-to-end lifecycle of the customer: customer initiation/acquisition, sales, ordering, and service activation, customer care and support, proactive campaigns, cross-sell/up-sell, and retention/loyalty.   CRM also includes the collection of customer information and its application to personalize, customize, and integrate the delivery of service to a customer, as well as to identify opportunities for increasing the value of the customer to the enterprise.   The key functionalities related to Customer Relationship Management are:   ·       Manage the end-to-end lifecycle of a customer request for products. ·       Create and manage customer profiles. ·       Manage all interactions with customers – inquiries, requests, and responses. ·       Provide updates to Billing and other southbound systems on customer/account-related changes – such as customer/account creation, deletion, or modification, bill requests, final bills, duplicate bills, and credit limits – through Middleware. ·       Work with the Order Management System and the Product and Service Management components within CRM. ·       Manage customer preferences – involve all the touch points and channels to the customer, including contact center, retail stores, dealers, self-service, and field service, as well as via any media (phone, face to face, web, mobile device, chat, email, SMS, mail, the customer's bill, etc.). ·       Support a single interface for customer contact details, preferences, account details, offers, customer premise equipment, bill details, bill cycle details, and customer interactions.   CRM applications interact with customers through customer touch points like portals, point-of-sale terminals, interactive voice response systems, etc. Customer requests are sent via fulfillment/provisioning to the billing system for order processing. A simplified sketch of this pattern follows.
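As a rough illustration of the CRM responsibilities listed above – not taken from any specific product, and with all class, field, and event names invented for the example – the following minimal Python sketch shows a CRM component managing customer profiles and publishing account events to southbound systems such as Billing through a middleware hook.

# A minimal, illustrative sketch of a CRM component (hypothetical names).
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class CustomerProfile:
    customer_id: str
    name: str
    preferences: dict = field(default_factory=dict)
    credit_limit: float = 0.0

class CrmService:
    """Manages profiles and notifies southbound systems of account changes."""
    def __init__(self, publish: Callable[[dict], None]):
        self._profiles: dict[str, CustomerProfile] = {}
        self._publish = publish  # middleware hook (e.g., a queue or ESB send)

    def create_profile(self, profile: CustomerProfile) -> None:
        self._profiles[profile.customer_id] = profile
        self._publish({"event": "ACCOUNT_CREATED",
                       "customer_id": profile.customer_id})

    def update_credit_limit(self, customer_id: str, limit: float) -> None:
        self._profiles[customer_id].credit_limit = limit
        self._publish({"event": "CREDIT_LIMIT_CHANGED",
                       "customer_id": customer_id, "credit_limit": limit})

# Usage: Billing (or any southbound system) consumes the published events.
events = []
crm = CrmService(publish=events.append)
crm.create_profile(CustomerProfile("C-1001", "Alice"))
crm.update_credit_limit("C-1001", 500.0)
print(events)

In a real deployment the publish hook would be an ESB or message-queue endpoint exposed by the Middleware component, rather than an in-memory list.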
2. Billing and Revenue Management   Billing and Revenue Management handles the collection of appropriate usage records and the production of timely and accurate bills – for providing pre-bill usage information and billing to customers, for processing their payments, and for performing payment collections. In addition, it handles customer inquiries about bills, provides billing inquiry status, and is responsible for resolving billing problems to the customer's satisfaction in a timely manner. This process grouping also supports prepayment for services.   The key functionalities provided by these applications are:   ·       Ensure that enterprise revenue is billed and invoices are delivered appropriately to customers. ·       Manage customers’ billing accounts, process their payments, perform payment collections, and monitor the status of the account balance. ·       Ensure the timely and effective fulfillment of all customer bill inquiries and complaints. ·       Collect usage records from mediation and ensure appropriate rating and discounting of all usage and pricing. ·       Support revenue sharing and split charging, where usage is billed to an account different from that of the service consumer. ·       Support prepaid and postpaid rating. ·       Send notifications on approaching or exceeding the usage thresholds enforced by the subscribed offer and/or set up by the customer. ·       Support prepaid, postpaid, and hybrid (where some services are prepaid and the rest are postpaid) customers, as well as conversion from postpaid to prepaid and vice versa. ·       Support different billing function requirements like charge prorating, promotions, discounts, adjustments, waivers, write-offs, accounts receivable, GL interface, late payment fees, credit control, dunning, account or service suspension, re-activation, expiry, termination, contract violation penalties, etc. ·       Initiate direct debit to collect payment against an outstanding invoice. ·       Send notifications to Middleware on different events; for example, payment receipt, pre-suspension, threshold breach, etc.   Billing systems typically get usage data from mediation systems for rating and billing. They get provisioning requests from order management systems and inquiries from CRM systems. Convergent and real-time billing systems can get usage details directly from network elements.   3. Mediation   Mediation systems transform/translate raw or native Usage Data Records into a general format that is acceptable to billing for rating purposes.   The following lists the high-level roles and responsibilities executed by the Mediation system in the end-to-end solution:   ·       Collect Usage Data Records from different data sources – like network elements, routers, and servers – via different protocols and interfaces. ·       Process Usage Data Records as per the source format. ·       Validate Usage Data Records from each source. ·       Segregate Usage Data Records from each source into multiple streams, based on the segregation requirements of the end applications. ·       Aggregate Usage Data Records from different sources based on aggregation rules, if any. ·       Consolidate multiple Usage Data Records from each source. ·       Deliver formatted Usage Data Records to different end applications like Billing, Interconnect, Fraud Management, etc. ·       Generate an audit trail for incoming Usage Data Records and keep track of all Usage Data Records at various stages of the mediation process. ·       Check for duplicate Usage Data Records across files within a given time window.   A sketch of the normalization and duplicate-check steps follows.
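The following Python sketch illustrates two of the mediation responsibilities above: translating raw records from different source formats into a common layout, and discarding duplicates within a time window. It is illustrative only; the source formats, field names, and deduplication rule are all hypothetical.

# Illustrative mediation sketch: normalize raw usage records and drop
# duplicates within a time window (hypothetical formats and fields).
from datetime import datetime, timedelta

def normalize(raw: dict, source: str) -> dict:
    """Translate a source-specific record into the common billing format."""
    if source == "msc":           # hypothetical switch record format
        return {"subscriber": raw["a_number"], "seconds": int(raw["dur"]),
                "start": datetime.fromisoformat(raw["ts"])}
    if source == "sgsn":          # hypothetical data-session record format
        return {"subscriber": raw["imsi"], "seconds": int(raw["session_s"]),
                "start": datetime.fromisoformat(raw["start_time"])}
    raise ValueError(f"unknown source: {source}")

def dedupe(records: list[dict], window: timedelta) -> list[dict]:
    """Drop records identical in subscriber/duration inside the window."""
    seen, out = {}, []
    for r in sorted(records, key=lambda r: r["start"]):
        key = (r["subscriber"], r["seconds"])
        last = seen.get(key)
        if last is None or r["start"] - last > window:
            out.append(r)
        seen[key] = r["start"]
    return out

raw = [{"a_number": "919800000001", "dur": "60", "ts": "2013-06-01T10:00:00"},
       {"a_number": "919800000001", "dur": "60", "ts": "2013-06-01T10:00:30"}]
records = [normalize(r, "msc") for r in raw]
print(dedupe(records, window=timedelta(minutes=5)))  # duplicate is dropped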
4. Fulfillment   This area is responsible for providing customers with their requested products in a timely and correct manner. It translates the customer's business or personal need into a solution that can be delivered using the specific products in the enterprise's portfolio. This process informs the customers of the status of their purchase order and ensures completion on time, as well as a delighted customer. These processes are responsible for accepting and issuing orders. They deal with pre-order feasibility determination, credit authorization, order issuance, order status and tracking, customer updates on order activities, and customer notification on order completion. Order management and provisioning applications fall into this category.   The key functionalities provided by these applications are:   ·       Issuing new customer orders, modifying open customer orders, or canceling open customer orders; ·       Verifying whether specific non-standard offerings sought by customers are feasible and supportable; ·       Checking the creditworthiness of customers as part of the customer order process; ·       Testing the completed offering to ensure it is working correctly; ·       Updating the Customer Inventory Database to reflect that the specific product offering has been allocated, modified, or cancelled; ·       Assigning and tracking customer provisioning activities; ·       Managing customer provisioning jeopardy conditions; and ·       Reporting progress on customer orders and other processes to the customer.   These applications typically get orders from CRM systems. They interact with network elements and billing systems for the fulfillment of orders. The order lifecycle they implement can be pictured as a simple state machine, sketched below.
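The sketch below captures the order lifecycle just described – feasibility, credit check, provisioning, and testing, with failures routed to jeopardy management. The stages, states, and check functions are simplified assumptions, not an actual order management product's model.

# A simplified, hypothetical order state machine for fulfillment.
from enum import Enum, auto

class OrderState(Enum):
    RECEIVED = auto()
    FEASIBLE = auto()
    CREDIT_APPROVED = auto()
    PROVISIONED = auto()
    COMPLETED = auto()
    JEOPARDY = auto()

def process_order(order: dict, checks: dict) -> OrderState:
    """Run an order through feasibility, credit, provisioning, and test."""
    state = OrderState.RECEIVED
    stages = [(OrderState.FEASIBLE, checks["feasibility"]),
              (OrderState.CREDIT_APPROVED, checks["credit"]),
              (OrderState.PROVISIONED, checks["provision"]),
              (OrderState.COMPLETED, checks["test"])]
    for stage, check in stages:
        if not check(order):
            return OrderState.JEOPARDY  # flag for jeopardy management
        state = stage
    return state

checks = {"feasibility": lambda o: True,
          "credit": lambda o: o["credit_score"] >= 600,
          "provision": lambda o: True,
          "test": lambda o: True}
print(process_order({"id": "ORD-1", "credit_score": 700}, checks))  # COMPLETED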
5. Enterprise Management   This process area includes those processes that manage enterprise-wide activities and needs, or have application within the enterprise as a whole. They encompass all business management processes that:   ·       Are necessary to support the whole of the enterprise, including processes for financial management, legal management, regulatory management, process, cost, and quality management, etc.;   ·       Are responsible for setting corporate policies, strategies, and directions, and for providing guidelines and targets for the whole of the business, including strategy development and planning for areas, such as Enterprise Architecture, that are integral to the direction and development of the business;   ·       Occur throughout the enterprise, including processes for project management, performance assessments, cost assessments, etc.     (i) Enterprise Risk Management   Enterprise Risk Management focuses on assuring that risks and threats to the enterprise's value and/or reputation are identified, and that appropriate controls are in place to minimize or eliminate the identified risks. The identified risks may be physical or logical/virtual. Successful risk management ensures that the enterprise can support its mission-critical operations, processes, applications, and communications in the face of serious incidents such as security threats/violations and fraud attempts. Two key areas covered in Risk Management by telecom operators are:   ·       Revenue Assurance: The revenue assurance system is responsible for identifying revenue loss scenarios across components/systems and helps in rectifying the problems. The following lists the high-level roles and responsibilities executed by the Revenue Assurance system in the end-to-end solution. o   Identify all usage information dropped when networks are being upgraded. o   Verify interconnect bills. o   Identify where services are routinely provisioned but never billed. o   Identify poor sales policies that are intensifying collections problems. o   Find leakage where usage is sent to an error bucket and never billed. o   Find leakage where field service, CRM, and network build-out are not optimized.   ·       Fraud Management: Involves collecting data from different systems to identify abnormalities in traffic patterns, usage patterns, and subscription patterns, and to report suspicious activity that might suggest fraudulent usage of resources, resulting in revenue losses to the operator.   The key roles and responsibilities of this system component are as follows:   o   The fraud management system will capture and monitor high usage (over a certain threshold) in terms of duration, value, and number of calls for each subscriber. The threshold for each subscriber is decided by the system and fixed automatically. o   Fraud management will be able to detect unauthorized access to services for certain subscribers. These subscribers may have been provided unauthorized services by employees. The component will alert the operator the very first time such illegal or unbilled calls occur. o   The solution will include an alarm management system that delivers alarms to the operator/provider whenever it detects fraud, thus minimizing fraud by catching it the first time it occurs. o   The fraud management system will be capable of interfacing with switches, mediation systems, and billing systems.   A toy sketch of this threshold-based monitoring follows.
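To make the threshold-based monitoring concrete, here is a toy Python sketch: per-subscriber thresholds are fixed automatically from recent usage history (the three-times-average rule here is an invented stand-in), and an alarm is delivered the first time a subscriber's cumulative usage crosses the threshold.

# Toy threshold-based fraud monitoring (hypothetical rule and data).
from collections import defaultdict
from statistics import mean

def fix_thresholds(history: dict[str, list[float]],
                   factor: float = 3.0) -> dict[str, float]:
    """Set each subscriber's threshold automatically from past usage."""
    return {sub: factor * mean(minutes) for sub, minutes in history.items()}

def monitor(usage_events, thresholds, alarm):
    """Accumulate usage per subscriber; alarm once when a threshold is crossed."""
    totals = defaultdict(float)
    alarmed = set()
    for sub, minutes in usage_events:
        totals[sub] += minutes
        if sub not in alarmed and totals[sub] > thresholds.get(sub, float("inf")):
            alarmed.add(sub)
            alarm(sub, totals[sub])  # deliver to the operator's alarm system

history = {"S-1": [30.0, 40.0, 35.0]}   # past daily minutes per subscriber
thresholds = fix_thresholds(history)    # S-1 -> 105.0
monitor([("S-1", 60.0), ("S-1", 70.0)], thresholds,
        alarm=lambda s, t: print(f"ALARM: {s} used {t} minutes"))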
(ii) Knowledge Management   This process focuses on knowledge management, technology research within the enterprise, and the evaluation of potential technology acquisitions.   Key responsibilities of knowledge base management are to:   ·       Maintain the knowledge base – creation and updating of the knowledge base on an ongoing basis. ·       Search the knowledge base – search by keyword or by category browsing. ·       Maintain metadata – management of metadata on the knowledge base to ensure effective management and search. ·       Run the report generator. ·       Provide content – add content to the knowledge base, e.g., user guides, operational manuals, etc.   (iii) Document Management   This focuses on maintaining a system-based repository of all electronic documents, or images of paper documents, relevant to the enterprise.   (iv) Data Management   This manages data as a valuable resource for any enterprise. For telecom enterprises, the typical areas covered are Master Data Management, Data Warehousing, and Business Intelligence. It is also responsible for data governance, security, quality, and database management.   Key responsibilities of Data Management are:   ·       Using ETL, extract data from CRM, Billing, web content, ERP, campaign management, financial, and network operations systems – asset management information, customer contact data, customer measures, benchmarks, and process data (e.g., process inputs, outputs, and measures) – into the Enterprise Data Warehouse. ·       Manage data traceability to source, data-related business rules/decisions, data quality, data cleansing, data reconciliation, and competitor data – storage for all enterprise data (customer profiles, products, offers, revenues, etc.). ·       Get online updates through nighttime replication or a physical backup process at a regular frequency. ·       Provide data access to business intelligence and other systems for analysis, report generation, and use.   (v) Business Intelligence   Business Intelligence uses the enterprise data to provide various analyses and reports containing prospects and analytics for customer retention, acquisition of new customers due to offers, and SLAs. It will generate the right, optimized plans – bolt-ons – for customers.   The following lists the high-level roles and responsibilities executed by the Business Intelligence system at the enterprise level (a small aggregation sketch appears at the end of this Enterprise Management section):   ·       It will perform pattern analysis and report problems. ·       It will perform data analysis – statistical analysis, data profiling, affinity analysis, customer-segment-wise usage patterns on offers, products, and services, and revenue generation against services and customer segments. ·       It will perform performance analysis (business, system, and forecast), churn propensity, response time, and SLA analysis. ·       It will support online and offline analysis, with report drill-down capability. ·       It will collect, store, and report various SLA data. ·       It will provide the necessary intelligence for marketing and for working on campaigns, etc., with cost-benefit analysis and predictions. ·       It will advise on customer promotions with additional services, based on the loyalty and credit history of the customer. ·       It will interface with the Enterprise Data Management system for the data needed to run reports and analysis tasks, and with campaign schedules, based on historical success evidence.   (vi) Stakeholder and External Relations Management   This manages the enterprise's relationships with stakeholders and outside entities. Stakeholders include shareholders, employee organizations, etc. Outside entities include regulators, the local community, and unions. Some of the processes within this grouping are Shareholder Relations, External Affairs, Labor Relations, and Public Relations.   (vii) Enterprise Resource Planning   ERP is used to manage internal and external resources, including tangible assets, financial resources, materials, and human resources. Its purpose is to facilitate the flow of information between all business functions inside the boundaries of the enterprise and to manage the connections to outside stakeholders. ERP systems consolidate all business operations into a uniform, enterprise-wide system environment.   The key roles and responsibilities of the ERP system are given below:   ·       It will handle responsibilities such as core accounting, financial, and management reporting. ·       It will interface with CRM to capture customer account details. ·       It will interface with billing to capture billing revenue and other financial data. ·       It will be responsible for executing the dunning process; billing will send the required feed to ERP for the execution of dunning. ·       It will interface with CRM and Billing through batch interfaces.   Enterprise management systems are like horizontals in the enterprise and typically interact with all major telecom systems; for example, an ERP system interacts with CRM, Fulfillment, and Billing systems for different kinds of data exchange.
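As a small illustration of the segment-wise analysis the Business Intelligence system performs, the following sketch aggregates usage and revenue per customer segment and flags churn-prone subscribers with a deliberately naive low-usage rule. The data, segments, and churn rule are all invented for the example.

# Illustrative BI aggregation: per-segment usage/revenue and a naive churn flag.
from collections import defaultdict

usage = [  # (customer, segment, minutes, revenue) - sample rows
    ("C-1", "youth", 300, 15.0),
    ("C-2", "youth", 20, 2.0),
    ("C-3", "corporate", 900, 120.0),
]

def segment_report(rows):
    """Aggregate usage and revenue per customer segment."""
    agg = defaultdict(lambda: {"minutes": 0, "revenue": 0.0, "customers": 0})
    for _, seg, minutes, rev in rows:
        agg[seg]["minutes"] += minutes
        agg[seg]["revenue"] += rev
        agg[seg]["customers"] += 1
    return dict(agg)

def churn_candidates(rows, min_minutes=50):
    """Naive churn propensity: flag customers with very low recent usage."""
    return [cust for cust, _, minutes, _ in rows if minutes < min_minutes]

print(segment_report(usage))
print(churn_candidates(usage))  # ['C-2']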
6. External Interfaces/Touch Points   The typical external parties are customers, suppliers/partners, employees, shareholders, and other stakeholders. External interactions between a Service Provider and other parties can be achieved by a variety of mechanisms, including:   ·       Exchange of emails or faxes ·       Call centers ·       Web portals ·       Business-to-Business (B2B) automated transactions   These applications provide an Internet-technology-driven interface for external parties to undertake a variety of business functions directly for themselves. They can provide fully or partially automated service to external parties through various touch points.   Typical characteristics of these touch points are:   ·       A pre-integrated self-service system, including a stand-alone web framework or an integration front end with a portal engine ·       A self-service layer exposing atomic web services/APIs for reuse by multiple systems across the architectural environment ·       Portlet-driven connectivity exposing data and service interoperability through a portal engine or web application   These touch points mostly interact with the CRM systems for requests, inquiries, and responses.   7. Middleware   This component is primarily responsible for integrating the different system components under a common platform. It should provide a standards-based platform for building Service Oriented Architecture and composite applications. The following lists the high-level roles and responsibilities executed by the Middleware component in the end-to-end solution:   ·       Act as an integration framework, covering interfaces in both directions. ·       Provide a web service framework with a service registry. ·       Support an SOA framework with an SOA service registry. ·       Handle data transformation, translation, and mapping of data points on each of the interfaces from/to Middleware and other components. ·       Receive data from the caller/activator and/or forward the data to the recipient system in XML format. ·       Use standard XML for data exchange. ·       Provide the response back to the service/call initiator. ·       Provide tracking until the response is complete. ·       Keep a store of transitional data for each call/transaction. ·       Provide, through Middleware, any information that is available and permitted from existing systems to enterprise systems; e.g., customer profile, customer history, etc. ·       Provide data in a common, unified format to SOA calls across systems, following the Enterprise Architecture directive. ·       Provide an audit trail for all transactions handled by the component.   A minimal sketch of these transformation and audit responsibilities follows.
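The sketch below shows, in miniature, the middleware responsibilities just listed: accept a caller's payload, map it into the recipient's XML format, keep transitional data per transaction, and record an audit trail. The message structure, element names, and in-memory stores are assumptions for illustration, not a real ESB API.

# Minimal middleware sketch: XML mapping plus transitional data and audit trail.
import uuid
import xml.etree.ElementTree as ET

AUDIT = []          # audit trail of every transaction handled
TRANSITIONAL = {}   # transitional data stored per call/transaction

def route(payload: dict, recipient: str) -> str:
    """Transform a caller's payload into XML and 'forward' it to the recipient."""
    txn_id = str(uuid.uuid4())
    TRANSITIONAL[txn_id] = payload                    # keep until completion

    root = ET.Element("message", {"txn": txn_id, "to": recipient})
    for field, value in payload.items():              # data mapping step
        ET.SubElement(root, field).text = str(value)
    xml_out = ET.tostring(root, encoding="unicode")

    AUDIT.append({"txn": txn_id, "recipient": recipient, "status": "SENT"})
    return xml_out

print(route({"customer_id": "C-1001", "action": "getProfile"}, recipient="CRM"))
print(AUDIT)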
8. Network Elements   The term Network Element means a facility or equipment used in the provision of a telecommunications service. The term also includes the features, functions, and capabilities that are provided by means of such facility or equipment, including subscriber numbers, databases, signaling systems, and information sufficient for billing and collection, or used in the transmission, routing, or other provision of a telecommunications service.   Typical network elements in a GSM network are the Home Location Register (HLR), Intelligent Network (IN), Mobile Switching Center (MSC), SMS Center (SMSC), and network elements for other value-added services like Push-to-Talk (PTT), Ring Back Tone (RBT), etc.   Network elements are invoked when subscribers use their telecom devices for any kind of usage. These elements generate usage data and pass it on to downstream systems like mediation and billing for rating and billing. They also integrate with provisioning systems for order/service fulfillment.   9. 3rd Party Applications   3rd-party systems are applications like content providers, payment gateways, point-of-sale terminals, and databases/applications maintained by the government.   Depending on applicability and the type of functionality provided by 3rd-party applications, integration with different telecom systems like CRM, provisioning, and billing will be done.   10. Service Delivery Platform   A service delivery platform (SDP) provides the architecture for the rapid deployment, provisioning, execution, management, and billing of value-added telecom services. SDPs are based on the concept of SOA and layered architecture. They support the delivery of voice, data services, and content in a network- and device-independent fashion. They allow application developers to aggregate network capabilities, services, and sources of content. SDPs typically contain layers for web service exposure, service application development, and network abstraction.   SOA Reference Architecture   The SOA concept is based on the principle of developing reusable business services and building applications by composing those services, instead of building monolithic applications in silos. It is about bridging the gap between business and IT through a set of business-aligned IT services, using a set of design principles, patterns, and techniques.   In an SOA, resources are made available to participants in a value net, enterprise, or line of business (typically spanning multiple applications within an enterprise or across multiple enterprises). It consists of a set of business-aligned IT services that collectively fulfill an organization’s business processes and goals. We can choreograph these services into composite applications and invoke them through standard protocols. Apart from agility and reusability, SOA enables:   ·       The business to specify processes as orchestrations of reusable services ·       Technology-agnostic business design, with technology hidden behind the service interface ·       A contract-like interaction between business and IT, based on service SLAs ·       Accountability and governance, better aligned to business services ·       Untangling of application interconnections by allowing access only through service interfaces, reducing the daunting side effects of change ·       Reduced pressure to replace legacy applications, and extended lifetimes for them, through encapsulation in services ·       A cloud computing paradigm, using web services technologies, that makes service outsourcing possible on an on-demand, utility-like, pay-per-usage basis   The following section presents the logical view of the Reference Architecture for the Telecom Solution. New custom-built applications need to align with this logical architecture in the long run to achieve EA benefits.   Packaged applications, such as ERP and billing applications, need to expose their functions as service providers (which other applications consume) and to interact with other applications as service consumers.   COTS applications need to expose services through wrappers such as adapters, to utilize existing resources while achieving Enterprise Architecture goals and objectives. The adapter idea is sketched below.
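The following schematic sketch illustrates the adapter/wrapper idea: a legacy COTS function with its own proprietary calling convention is exposed behind a stable, technology-neutral service contract that SOA consumers depend on. The legacy API shown is invented purely for illustration.

# Adapter sketch: wrapping a legacy COTS call behind a service contract.
from typing import Protocol

class BalanceService(Protocol):
    """The service contract consumers depend on, not the legacy API."""
    def get_balance(self, customer_id: str) -> float: ...

class LegacyBillingSystem:
    """Stand-in for a COTS product with an awkward, proprietary interface."""
    def QRY_ACCT_BAL(self, acct_ref: bytes) -> int:   # returns balance in cents
        return 12345

class LegacyBillingAdapter:
    """Wraps the COTS call so it satisfies the BalanceService contract."""
    def __init__(self, legacy: LegacyBillingSystem):
        self._legacy = legacy

    def get_balance(self, customer_id: str) -> float:
        # translate identifiers and units between the two worlds
        cents = self._legacy.QRY_ACCT_BAL(customer_id.encode("ascii"))
        return cents / 100.0

def print_statement(svc: BalanceService, customer_id: str) -> None:
    print(f"{customer_id}: balance = {svc.get_balance(customer_id):.2f}")

print_statement(LegacyBillingAdapter(LegacyBillingSystem()), "C-1001")

The design point is that consumers are written against BalanceService; the COTS product can later be replaced without changing them, which is exactly the encapsulation benefit listed above.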
The following are the various layers for an enterprise-level deployment of SOA. This diagram captures the abstract view of the Enterprise SOA layers and the important components of each layer. A layered architecture means a decomposition of services such that most interactions occur between adjacent layers; however, there is no strict rule that top layers should not directly communicate with bottom layers.   The diagram below represents the important logical pieces that would result from an overall SOA transformation. Figure 3. Enterprise SOA Reference Architecture 1. Operational System Layer: This layer consists of all packaged applications like CRM and ERP, custom-built applications, COTS-based applications like Billing, Revenue Management, and Fulfillment, and the enterprise databases that are essential and contribute directly or indirectly to the Enterprise OSS/BSS Transformation.   ERP holds the data for Asset Lifecycle Management, Supply Chain and Advanced Procurement, Human Capital Management, etc.   CRM holds the data related to Order, Sales and Marketing, Customer Care, Partner Relationship Management, Loyalty, etc.   Content Management handles enterprise search and query. The Billing application consists of the following components:   ·       Collections Management, Customer Billing Management, Invoicing, Real-Time Rating, Discounting, and Applying of Charges   Enterprise databases hold both application and service data, whether structured or unstructured.   MDM – master data mainly consists of Customer, Order, Product, and Service data.     2. Enterprise Component Layer: This layer consists of the Application Services and Common Services that are responsible for realizing the functionality and maintaining the QoS of the exposed services. This layer uses container-based technologies, such as application servers, to implement the components, workload management, high availability, and load balancing.   Application Services: This service layer enables application, technology, and database abstraction so that the complex access logic is hidden from the other service layers. This is a basic service layer, which exposes application functionalities and data as reusable services.
The three types of Application Access Services are:   ·       Application Access Service: Exposes application-level functionalities as reusable services for BSS-to-BSS and BSS-to-OSS integration. This layer is enabled using disparate technologies such as web services, integration servers, adapters, etc.   ·       Data Access Service: Exposes application data as reusable reference data services. This is done via direct interaction with application data, and it provides federated queries.   ·       Network Access Service: Exposes the provisioning layer as a reusable service for OSS-to-OSS integration. This integration service emphasizes the need for high performance, stateless process flows, and distributed design.   Common Services encompass the management of structured, semi-structured, and unstructured data, and include information services, portal services, interaction services, infrastructure services, security services, etc.   3. Integration Layer: This consists of service infrastructure components like the service bus, a service gateway for partner integration, the service registry, the service repository, and the BPEL processor. The service bus carries the service invocation payloads/messages between consumers and providers. The other important functions expected from it are itinerary-based routing, distributed caching of routing information, transformations, and all messaging qualities of service, like reliability, scalability, availability, etc. The service registry holds all service contracts (WSDL) and helps developers locate or discover services at design time or runtime.   The BPEL processor is used to orchestrate services to compose a complex business scenario or process. Workflow and business rules management are also required, to support the manual triggering of certain activities within a business process, based on the rules set up and on state machine information. The application, data, and service mediation layers typically form the overall composite application development framework, or SOA framework.   4. Business Process Layer: This is typically an intermediate services layer that represents shared business process services. At the enterprise level, these services come from the Customer Management, Order Management, Billing, Finance, and Asset Management application domains. A toy composition example follows at the end of this list.   5. Access Layer: This layer consists of enterprise portals and provides a single view of enterprise information management and dashboard services.   6. Channel Layer: This consists of various devices, applications that form part of the extended enterprise, and browsers through which users access the applications.   7. Client Layer: This designates the different types of users accessing the enterprise applications. The type of user is typically an important factor in determining the level of access to applications.   8. Vertical pieces like management, monitoring, security, and development cut across all horizontal layers. Management and monitoring involve all aspects of SOA – services, SLAs, and other QoS lifecycle processes – for both applications and services, surrounding SOA governance.
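As a toy illustration of the Business Process Layer composing the application services beneath it – in the spirit of a BPEL orchestration, though written here as plain Python with invented service names – the following composite "customer 360" service orchestrates hypothetical CRM, Billing, and Fulfillment application services into one business-level operation.

# Toy service composition: a business service orchestrating atomic services.
# The three functions stand in for application-access services that would, in
# practice, be remote web-service calls exposed by the Enterprise Component Layer.
def crm_get_customer(customer_id: str) -> dict:
    return {"customer_id": customer_id, "name": "Alice", "status": "ACTIVE"}

def billing_get_balance(customer_id: str) -> float:
    return 42.50

def fulfillment_get_open_orders(customer_id: str) -> list[str]:
    return ["ORD-7"]

def customer_360(customer_id: str) -> dict:
    """Composite business service: orchestrates the atomic services above."""
    profile = crm_get_customer(customer_id)
    return {**profile,
            "balance": billing_get_balance(customer_id),
            "open_orders": fulfillment_get_open_orders(customer_id)}

print(customer_360("C-1001"))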
9. EA Governance, Reference Architecture, Roadmap, Principles, and Best Practices: EA governance is important in terms of providing the overall direction to the SOA implementation within the enterprise. This involves board-level involvement, in addition to business and IT executives. At a high level, it involves managing the implementation of SOA projects, managing the SOA infrastructure, and controlling the entire effort through fine-tuned IT processes in accordance with COBIT (Control Objectives for Information and Related Technology).   Devising tools and techniques to promote a reuse culture and the SOA way of doing things requires competency centers to be established, in addition to training the workforce to take up the new roles suited to the SOA journey.   Conclusions   Reference Architectures can serve as the basis for disparate architecture efforts throughout the organization, even if those efforts use different tools and technologies. Reference architectures provide best practices and approaches in a way that is independent of how any particular vendor deals with technology and standards. Reference Architectures model the abstract architectural elements for an enterprise independently of the technologies, protocols, and products that are used to implement an SOA. Telecom enterprises today are facing significant business and technology challenges due to growing competition, a multitude of services, and convergence. Adopting architectural best practices can go a long way towards meeting these challenges. The use of an SOA-based architecture for communication with each of the external systems, like Billing and CRM, makes the OSS/BSS architecture very loosely coupled, with greater flexibility; any change in the external systems is absorbed at the Integration Layer without affecting the rest of the ecosystem. The use of a Business Process Management (BPM) tool makes the management and maintenance of business processes easy, with better performance in terms of lead time, quality, and cost. And since the architecture is based on standards, it lowers the cost of deploying and managing OSS/BSS applications over their lifecycles.
