Search Results

Search found 13950 results on 558 pages for 'durable services'.


  • Expanding on requestaudit - Tracing who is doing what...and for how long

    - by Kyle Hatlestad
    One of the most helpful tracing sections in WebCenter Content (and one that is on by default) is the requestaudit tracing. This tracing section summarizes the top service requests happening in the server along with how they are performing. By default, it has two different rotations. One happens every 2 minutes (listing up to 5 services) and another happens every 60 minutes (listing up to 20 services). These traces provide the total time for all of the requests against each service, along with the number of requests and the average request time. This information is a good starting point for troubleshooting performance issues or tracking down a particular problem.
    >requestaudit/6 12.10 16:48:00.493 Audit Request Monitor !csMonitorTotalRequests,47,1,0.39009329676628113,0.21034042537212372,1
    >requestaudit/6 12.10 16:48:00.509 Audit Request Monitor Request Audit Report over the last 120 Seconds for server wcc-base_4444****
    requestaudit/6 12.10 16:48:00.509 Audit Request Monitor -Num Requests 47 Errors 1 Reqs/sec. 0.39009329676628113 Avg. Latency (secs) 0.21034042537212372 Max Thread Count 1
    requestaudit/6 12.10 16:48:00.509 Audit Request Monitor 1 Service FLD_BROWSE Total Elapsed Time (secs) 3.5320000648498535 Num requests 10 Num errors 0 Avg. Latency (secs) 0.3531999886035919
    requestaudit/6 12.10 16:48:00.509 Audit Request Monitor 2 Service GET_SEARCH_RESULTS Total Elapsed Time (secs) 2.694999933242798 Num requests 6 Num errors 0 Avg. Latency (secs) 0.4491666555404663
    requestaudit/6 12.10 16:48:00.509 Audit Request Monitor 3 Service GET_DOC_PAGE Total Elapsed Time (secs) 1.8839999437332153 Num requests 5 Num errors 1 Avg. Latency (secs) 0.376800000667572
    requestaudit/6 12.10 16:48:00.509 Audit Request Monitor 4 Service DOC_INFO Total Elapsed Time (secs) 0.4620000123977661 Num requests 3 Num errors 0 Avg. Latency (secs) 0.15399999916553497
    requestaudit/6 12.10 16:48:00.509 Audit Request Monitor 5 Service GET_PERSONALIZED_JAVASCRIPT Total Elapsed Time (secs) 0.4099999964237213 Num requests 8 Num errors 0 Avg. Latency (secs) 0.051249999552965164
    requestaudit/6 12.10 16:48:00.509 Audit Request Monitor ****End Audit Report*****
    To change the default rotation or size of the output, set these configuration variables for the server:
    RequestAuditIntervalSeconds1 – used for the shorter of the two summary intervals (default is 120 seconds)
    RequestAuditIntervalSeconds2 – used for the longer of the two summary intervals (default is 3600 seconds)
    RequestAuditListDepth1 – number of services listed for the first request audit summary interval (default is 5)
    RequestAuditListDepth2 – number of services listed for the second request audit summary interval (default is 20)
    If you want to get more granular, you can enable 'Full Verbose Tracing' from the System Audit Information page, and you will then get an audit entry for each and every service request.
    >requestaudit/6 12.10 16:58:35.431 IdcServer-68 GET_USER_INFO [dUser=bob][StatusMessage=You are logged in as 'bob'.] 0.08765099942684174(secs)
    What's nice is that it reports who executed the service and how long that particular request took. In some cases, depending on the service, additional information relevant to that service will be added to the tracing.
    >requestaudit/6 12.10 17:00:44.727 IdcServer-81 GET_SEARCH_RESULTS [dUser=bob][QueryText=%28+dDocType+%3cmatches%3e+%60Document%60+%29][StatusCode=0][StatusMessage=Success] 0.4696030020713806(secs)
    You can go into even more detail and insert additional data into the tracing. You simply need to add a configuration variable with a comma-separated list of variables from the local data to insert:
    RequestAuditAdditionalVerboseFieldsList=TotalRows,path
    In this case, for any search results, the number of items the user found is traced:
    >requestaudit/6 12.10 17:15:28.665 IdcServer-36 GET_SEARCH_RESULTS [TotalRows=224][dUser=bob][QueryText=%28+dDocType+%3cmatches%3e+%60Application%60+%29][Sta...
    I also recently ran into a case where services were being called from a client through RIDC. All of the services were being executed as the same user, but they wanted to correlate the requests coming from the client with the ones being executed on the server. So what we did was add a new field to the request audit list:
    RequestAuditAdditionalVerboseFieldsList=ClientToken
    Then, in the RIDC client, ClientToken was added to the binder along with a unique value that could be traced for that request. Now they had a way of tracing on both ends and identifying exactly which client request resulted in which request on the server.
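    For reference, a hypothetical config.cfg fragment combining the settings discussed above might look like the following. The values are illustrative examples only, not recommendations, and configuration changes typically require a restart of the Content Server to take effect:

        RequestAuditIntervalSeconds1=60
        RequestAuditIntervalSeconds2=1800
        RequestAuditListDepth1=10
        RequestAuditListDepth2=30
        RequestAuditAdditionalVerboseFieldsList=TotalRows,path,ClientToken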

    Read the article

  • Using WKA in Large Coherence Clusters (Disabling Multicast)

    - by jpurdy
    Disabling hardware multicast (by configuring well-known addresses, aka WKA) will place significant stress on the network. For messages that must be sent to multiple servers, rather than having a server send a single packet to the switch and having the switch broadcast that packet to the rest of the cluster, the server must send a packet to each of the other servers. While hardware varies significantly, consider that a server with a single gigabit connection can send at most ~70,000 packets per second. To continue with some concrete numbers, in a cluster with 500 members, that means that each server can send at most 140 cluster-wide messages per second. And if there are 10 cluster members on each physical machine, that number shrinks to 14 cluster-wide messages per second (or with only mild hyperbole, roughly zero). It is also important to keep in mind that network I/O is not only expensive in terms of the network itself, but also in the CPU required to send (or receive) a message (due to things like copying the packet bytes, processing an interrupt, etc). Fortunately, Coherence is designed to rely primarily on point-to-point messages, but there are some features that are inherently one-to-many:
    - Announcing the arrival or departure of a member
    - Updating partition assignment maps across the cluster
    - Creating or destroying a NamedCache
    - Invalidating a cache entry from a large number of client-side near caches
    - Distributing a filter-based request across the full set of cache servers (e.g. queries, aggregators and entry processors)
    - Invoking clear() on a NamedCache
    The first few of these are operations that are primarily routed through a single senior member, and they also occur infrequently, so they usually are not a primary consideration. There are cases, however, where the load from introducing new members can be substantial (to the point of destabilizing the cluster). Consider the case where the cluster from the first paragraph grows from 500 members to 1000 members (holding the number of physical machines constant). During this period, there will be 500 new member introductions, each of which may consist of several cluster-wide operations (for the cluster membership itself as well as the partitioned cache services, replicated cache services, invocation services, management services, etc). Note that all of these introductions will route through that one senior member, which is sharing its network bandwidth with several other members (which will be communicating to a lesser degree with other members throughout this process). While each service may have a distinct senior member, there's a good chance during initial startup that a single member will be the senior for all services (if those services start on the senior before the second member joins the cluster). It's obvious that this could cause CPU and/or network starvation. In the current release of Coherence (3.7.1.3 as of this writing), the pure unicast code path also has less sophisticated flow control for cluster-wide messages (compared to the multicast-enabled code path), which may also result in significant heap consumption on the senior member's JVM (from the message backlog). This is almost never a problem in practice, but with sufficient CPU or network starvation, it could become critical. For the non-operational concerns (near caches, queries, etc), the application itself will determine how much load is placed on the cluster.
Applications intended for deployment in a pure unicast environment should be careful to avoid excessive dependence on these features. Even in an environment with multicast support, these operations may scale poorly since even with a constant request rate, the underlying workload will increase at roughly the same rate as the underlying resources are added. Unless there is an infrastructural requirement to the contrary, multicast should be enabled. If it can't be enabled, care should be taken to ensure the added overhead doesn't lead to performance or stability issues. This is particularly crucial in large clusters.
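    To make the fan-out arithmetic in the opening paragraph concrete, here is a small back-of-the-envelope sketch (a C# snippet using only the rough figures quoted above, not measured values):

        class WkaFanOutMath
        {
            static void Main()
            {
                const double packetsPerSecond = 70000;  // rough ceiling for a single gigabit NIC
                const int clusterMembers = 500;
                const int membersPerMachine = 10;

                // Without multicast, one cluster-wide message costs (clusterMembers - 1) unicast packets.
                double perServer = packetsPerSecond / (clusterMembers - 1);   // ~140 cluster-wide msgs/sec
                double perMember = perServer / membersPerMachine;             // ~14 msgs/sec when 10 members share the NIC

                System.Console.WriteLine("{0:F0} per server, {1:F0} per member", perServer, perMember);
            }
        }

    The point is simply that the budget for one-to-many operations collapses as the member count grows, which is why the features listed above need to be used sparingly in large WKA clusters.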

    Read the article

  • Multiple Zend application code organisation

    - by user966936
    For the past year I have been working on a series of applications, all based on the Zend Framework and centered on a complex business logic that all applications must have access to, even if they don't use all of it (easier than having multiple library folders for each application, as they are all linked together with a common center). Without going into much detail about what the project is specifically about, I am looking for some input (as I am working on the project alone) on how I have "grouped" my code. I have tried to split it all up in such a way that it removes dependencies as much as possible. I'm trying to keep it as decoupled as I logically can, so in 12 months' time when my time is up, anyone else coming in can have no problem extending on what I have produced. Example structure:
    applicationStorage\ (contains all applications and associated data)
    applicationStorage\Applications\ (contains the applications themselves)
    applicationStorage\Applications\external\ (application grouping folder; contains all external customer access applications)
    applicationStorage\Applications\external\site\ (main external customer access application)
    applicationStorage\Applications\external\site\Modules\
    applicationStorage\Applications\external\site\Config\
    applicationStorage\Applications\external\site\Layouts\
    applicationStorage\Applications\external\site\ZendExtended\ (contains extended Zend classes specific to this application, for example: ZendExtended_Controller_Action extends Zend_Controller_Action)
    applicationStorage\Applications\external\mobile\ (mobile external customer access application; different workflow, limited capabilities compared to the full site version)
    applicationStorage\Applications\internal\ (application grouping folder; contains all internal company applications)
    applicationStorage\Applications\internal\site\ (main internal application)
    applicationStorage\Applications\internal\mobile\ (mobile access has a different flow and limited abilities compared to the main site version)
    applicationStorage\Tests\ (contains PHP unit tests)
    applicationStorage\Library\
    applicationStorage\Library\Service\ (contains all business logic, services and servicelocator; these are completely decoupled from the Zend Framework and rely on the models' interfaces)
    applicationStorage\Library\Zend\ (Zend Framework)
    applicationStorage\Library\Models\ (doesn't know about services but is linked to the Zend Framework for DB operations; contains model interfaces and model datamappers for all business objects; examples include Iorder/IorderMapper, Iworksheet/IWorksheetMapper, Icustomer/IcustomerMapper)
    (Note: the Modules, Config, Layouts and ZendExtended folders are duplicated in each application folder, but I have omitted them as they are not required for my purposes.)
    The library contains all "universal" code. The Zend Framework is at the heart of all applications, but I wanted my business logic to be Zend-framework-independent. All model and mapper interfaces have no public references to Zend_Db but actually wrap around it in private. So my hope is that in the future I will be able to rewrite the mappers and dbtables (containing a Models_DbTable_Abstract that extends Zend_Db_Table_Abstract) in order to decouple my business logic from the Zend Framework, if I want to move my business logic (services) to a non-Zend framework environment (maybe some other PHP framework).
    Using a serviceLocator and registering the required services within the bootstrap of each application, I can use different versions of the same service depending on the request and which application is being accessed. For example: all external applications will have a Service_Auth_External implementing Service_Auth_Interface registered, and internal applications will have a Service_Auth_Internal implementing the same interface; both are then retrieved via Service_Locator::getService('Auth'). I'm concerned I may be missing some possible problems with this. One I'm half-thinking about is a config.ini file for all externals, then a separate application config.ini overriding or adding to the global external config.ini. If anyone has any suggestions I would greatly appreciate them. I have used context switching for AJAX functions within the individual applications, but there is a big chance both external and internal will get web services created for them. Again, these will be separated due to authorization and different available services.
    \applicationstorage\Applications\internal\webservice
    \applicationstorage\Applications\external\webservice

    Read the article

  • Multiple Base Addresses and Multiple Endpoints in WCF

    - by mnhab
    I'm using two bindings, TCP and HTTP, and I want to expose mex data on both. What I want is for the mexHttpBinding to expose only the HTTP services while the mexTcpBinding exposes only the TCP services. Or, is it possible to access the stats service only over the HTTP binding and the eventLogging service only over TCP? For example, for TCP I should only have:
    net.tcp://localhost:9001/ABC/mex
    net.tcp://localhost:9001/ABC/eventLogging
    And for HTTP:
    http://localhost:9002/ABC/stats
    http://localhost:9002/ABC/mex
    However, when I connect to either of the base addresses (using the WCF Test Client), I'm able to access all the services. For example, when I connect to net.tcp://localhost:9001/ABC I can use the services that are offered on the HTTP binding. Why is that so?
    <system.serviceModel>
      <services>
        <service behaviorConfiguration="ABCServiceBehavior" name="ABC.Data.DataServiceWCF">
          <endpoint address="eventLogging" binding="netTcpBinding" contract="ABC.Campaign.IEventLoggingService" />
          <endpoint address="mex" binding="mexTcpBinding" bindingConfiguration="" contract="IMetadataExchange" />
          <endpoint address="stats" binding="basicHttpBinding" contract="ABC.Data.IStatsService" />
          <endpoint address="mex" binding="mexHttpBinding" contract="IMetadataExchange" />
          <host>
            <baseAddresses>
              <add baseAddress="net.tcp://localhost:9001/ABC" />
              <add baseAddress="http://localhost:9002/ABC" />
            </baseAddresses>
          </host>
        </service>
      </services>
      <behaviors>
        <serviceBehaviors>
          <behavior name="ABCServiceBehavior">
            <serviceMetadata httpGetEnabled="false" />
            <serviceDebug includeExceptionDetailInFaults="true" />
          </behavior>
        </serviceBehaviors>
      </behaviors>
    </system.serviceModel>
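    As a side note on how WCF pairs relative endpoint addresses with base addresses, the following self-host sketch (hypothetical, not part of the question) prints which absolute address each endpoint ends up on; each relative address is resolved against the base address whose URI scheme matches the endpoint's binding:

        using System;
        using System.ServiceModel;

        class HostInspector
        {
            static void Main()
            {
                // Same two base addresses as in the config above; the endpoints themselves come from config.
                using (var host = new ServiceHost(typeof(ABC.Data.DataServiceWCF),
                    new Uri("net.tcp://localhost:9001/ABC"),
                    new Uri("http://localhost:9002/ABC")))
                {
                    host.Open();
                    foreach (var endpoint in host.Description.Endpoints)
                    {
                        // e.g. "NetTcpBinding -> net.tcp://localhost:9001/ABC/eventLogging"
                        Console.WriteLine("{0} -> {1}", endpoint.Binding.Name, endpoint.Address.Uri);
                    }
                    Console.ReadLine();
                    host.Close();
                }
            }
        }

    Note that the metadata returned by either mex endpoint describes the whole service, including all of its endpoints, which is likely why the WCF Test Client shows every operation no matter which metadata address it reads.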

    Read the article

  • Load Balance WCF and Share a Remote MSMQ for High Throughput

    - by BarDev
    After a ton of reading in books and on the web, I have noticed hints that WCF and MSMQ can be used to achieve high throughput. The information I have seen mentions using multiple WCF services in a farm that read from a single MSMQ queue. The problem is that I have found paragraphs here and there saying that high throughput can be done, but I cannot seem to find a document on how to implement it. The following excerpt is from the MSDN article Best Practices for Queued Communication (http://msdn.microsoft.com/en-us/library/ms731093.aspx): "To achieve higher throughput and availability, use a farm of WCF services that read from the queue. This requires that all of these services expose the same contract on the same endpoint. The farm approach works best for applications that have high production rates of messages because it enables a number of services to all read from the same queue." This is what I'm trying to solve. I have an intranet application where a client sends a request to a WCF service, but I want the ability to load balance the WCF services across multiple servers in a farm. I also want these WCF services in the farm to do transactional reads from a remote MSMQ queue when an item is available. If this is possible, the issue I have is that I do not understand WCF's activation process for retrieving messages from a remote queue. Does anyone know of any articles or webcasts that explain it in detail? BarDev
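    For reference, the kind of service shape the MSDN guidance implies is sketched below. This is only an illustrative outline under the usual netMsmqBinding conventions; the contract and type names are made up, and every server in the farm would host the same contract against the same queue address (e.g. net.msmq://queueserver/private/orders) with transactional, exactly-once delivery:

        using System.Runtime.Serialization;
        using System.ServiceModel;

        [DataContract]
        public class OrderRequest
        {
            [DataMember] public int OrderId { get; set; }
        }

        [ServiceContract]
        public interface IOrderSubmission
        {
            // Queued operations must be one-way; there is no synchronous reply channel over MSMQ.
            [OperationContract(IsOneWay = true)]
            void SubmitOrder(OrderRequest request);
        }

        public class OrderSubmissionService : IOrderSubmission
        {
            // The receive is transactional: if processing throws, the message rolls back
            // onto the queue and another farm member can pick it up.
            [OperationBehavior(TransactionScopeRequired = true, TransactionAutoComplete = true)]
            public void SubmitOrder(OrderRequest request)
            {
                // process the order here
            }
        }

    When hosted in IIS/WAS, message-triggered activation relies on the net.msmq listener adapter, and transactional reads from a remote queue require MSMQ 4.0 (Windows Server 2008) or later, which is likely the activation piece the question refers to.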

    Read the article

  • XML/PHP : Content is not allowed in prolog

    - by Tristan
    Hello, I get this error message and I don't know where the problem comes from:
    <?php
    include "DBconnection.class.php";
    $sql = DBConnection::getInstance();
    $requete = "SELECT g.siteweb, g.offreDedie, g.coupon, g.only_dedi, g.transparence, g.abonnement , s.GSP_nom as nom , COUNT(s.GSP_nom) as nb_votes, TRUNCATE(AVG(vote), 2) as qualite, TRUNCATE(AVG(prix), 2) as rapport, TRUNCATE(AVG(serviceClient), 2) as serviceCli, TRUNCATE(AVG(interface), 2) as interface, TRUNCATE(AVG(services), 2) as services FROM votes_serveur AS v INNER JOIN serveur AS s ON v.idServ = s.idServ INNER JOIN gsp AS g ON s.GSP_nom = g.nom WHERE s.valide = 1 GROUP BY s.GSP_nom";
    $sql->query($requete);
    $xml = '<?xml version="1.0" encoding="UTF-8" ?>';
    $xml .= '<GamerCertified>';
    while($row = $sql->fetchArray()){
        $moyenne_services = ($row['services'] + $row['serviceCli'] + $row['interface'] ) / 3 ;
        $moyenne_services = round( $moyenne_services, 2);
        $moyenne_ge = ($row['services'] + $row['serviceCli'] + $row['interface'] + $row['qualite'] + $row['rapport'] ) / 5 ;
        $moyenne_ge = round( $moyenne_ge, 2);
        $xml .= '<GSP>';
        $xml .= '<nom>'.$row["nom"].'</nom>';
        $xml .= '<nombre-votes>'.$row["nb_votes"].'</nombre-votes>';
        $xml .= '<services>'.$moyenne_services.'</services>';
        $xml .= '<qualite>'.$row["qualite"].'</qualite>';
        $xml .= '<prix>'.$row["rapport"].'</prix>';
        $xml .= '<label-transparence>'.$row["transparence"].'</label-transparence>';
        $xml .= '<moyenne-generale>'.$moyenne_ge.'</moyenne-generale>';
        $xml .= '<serveurs-dedies>'.$row["offreDedie"].'</serveurs-dedies>';
        $xml .= '</GSP>';
    }
    $xml .= '</GamerCertified>';
    echo $xml;
    Thanks

    Read the article

  • AspNetMembership provider with WCF service

    - by Sly
    I'm trying to configure the AspNetMembershipProvider to be used for authentication in my WCF service, which uses basicHttpBinding. I have the following configuration:
    <system.serviceModel>
      <serviceHostingEnvironment aspNetCompatibilityEnabled="true" />
      <bindings>
        <basicHttpBinding>
          <binding name="basicSecureBinding">
            <security mode="Message"></security>
          </binding>
        </basicHttpBinding>
      </bindings>
      <behaviors>
        <serviceBehaviors>
          <behavior name="MyApp.Services.ComputersServiceBehavior">
            <serviceAuthorization roleProviderName="AspNetSqlRoleProvider" principalPermissionMode="UseAspNetRoles" />
            <serviceCredentials>
              <userNameAuthentication userNamePasswordValidationMode="MembershipProvider" membershipProviderName="AspNetSqlMembershipProvider"/>
            </serviceCredentials>
            <serviceMetadata httpGetEnabled="true" />
            <serviceDebug includeExceptionDetailInFaults="true" />
          </behavior>
        </serviceBehaviors>
      </behaviors>
      <services>
        <service behaviorConfiguration="MyApp.Services.ComputersServiceBehavior" name="MyApp.Services.ComputersService">
          <endpoint binding="basicHttpBinding" contract="MyApp.Services.IComputersService" />
          <endpoint address="mex" binding="mexHttpBinding" contract="IMetadataExchange" />
        </service>
      </services>
    </system.serviceModel>
    Roles are enabled and the membership provider is configured (it's working for the web site), but the authentication process is not fired at all. There are no calls to the database during the request, and when I try to set the following attribute on a method:
    [PrincipalPermission(SecurityAction.Demand, Authenticated = true)]
    public bool Test()
    {
        return true;
    }
    I get an access denied exception. Any thoughts on how to fix it?

    Read the article

  • Struggling with a data modeling problem

    - by rpat
    I am struggling with a data model (I use MySQL for the database) and am uneasy about what I have come up with. If someone could suggest a better approach, or point me to some reference material, I would appreciate it. The data would have organizations of many types, and I am trying to do a 3-level classification (Class, Category, Type). Say I have 'Italian Restaurant'; it will have the classification Food Services > Restaurants > Italian. However, an organization may belong to multiple groups. A restaurant may serve both Chinese and Italian, so it will fit into 2 classifications: Food Services > Restaurants > Italian and Food Services > Restaurants > Chinese. The classification reference tables would be like the following:
    ORG_CLASS (RowId, ClassCode, ClassName)
    1, FOOD, Food Services
    ORG_CATEGORY (RowId, ClassCode, CategoryCode, CategoryName)
    1, FOOD, REST, Restaurants
    ORG_TYPE (RowId, ClassCode, CategoryCode, TypeCode, TypeName)
    100, FOOD, REST, ITAL, Italian
    101, FOOD, REST, CHIN, Chinese
    102, FOOD, REST, SPAN, Spanish
    103, FOOD, REST, MEXI, Mexican
    104, FOOD, REST, FREN, French
    105, FOOD, REST, MIDL, Middle Eastern
    The actual data tables would be like the following. I will allow an organization a maximum of 3 classifications, with 3 GroupIds each pointing to a row in ORG_TYPE. So I have my ORGANIZATION_TABLE:
    ORGANIZATION_TABLE (OrgGroupId1, OrgGroupId2, OrgGroupId3, OrgName, OrgAddres)
    100, 103, NULL, MyRestaurant1, MyAddr1
    100, 102, NULL, MyRestaurant2, MyAddr2
    100, 104, 105, MyRestaurant3, MyAddr3
    During data entry, a dialog could let the user choose the class, category and type, and the corresponding GroupId could be populated with the RowId from the ORG_TYPE table. During search, if all three classification levels are chosen, the query is more specific. For example, if Food Services > Restaurants > Italian is the criteria, the where clause would be 'where OrgGroupId1 = 100'. If only 2 levels are chosen (Food Services > Restaurants), I have to do 'where OrgGroupId1 in (100,101,102,103,104,105, .....)' - there could be a hundred values in that list. I will disallow class-level search; that is, I will force selection of a class and a category. The Ids would be integers. I am trying to anticipate performance issues and other problems. Overall, would this work, or do I need to throw this out and start from scratch?

    Read the article

  • XML/PHP : Content is not allowed in trailing section

    - by Tristan
    Hello, I get this error message and I don't know where the problem comes from:
    <?php
    include "DBconnection.class.php";
    $sql = DBConnection::getInstance();
    $requete = "SELECT g.siteweb, g.offreDedie, g.coupon, g.only_dedi, g.transparence, g.abonnement , s.GSP_nom as nom , COUNT(s.GSP_nom) as nb_votes, TRUNCATE(AVG(vote), 2) as qualite, TRUNCATE(AVG(prix), 2) as rapport, TRUNCATE(AVG(serviceClient), 2) as serviceCli, TRUNCATE(AVG(interface), 2) as interface, TRUNCATE(AVG(services), 2) as services FROM votes_serveur AS v INNER JOIN serveur AS s ON v.idServ = s.idServ INNER JOIN gsp AS g ON s.GSP_nom = g.nom WHERE s.valide = 1 GROUP BY s.GSP_nom";
    $sql->query($requete);
    $xml = '<?xml version="1.0" encoding="UTF-8" ?><GamerCertified>';
    while($row = $sql->fetchArray()){
        $moyenne_services = ($row['services'] + $row['serviceCli'] + $row['interface'] ) / 3 ;
        $moyenne_services = round( $moyenne_services, 2);
        $moyenne_ge = ($row['services'] + $row['serviceCli'] + $row['interface'] + $row['qualite'] + $row['rapport'] ) / 5 ;
        $moyenne_ge = round( $moyenne_ge, 2);
        $xml .= '<GSP>';
        $xml .= '<nom>'.$row["nom"].'</nom>';
        $xml .= '<nombre-votes>'.$row["nb_votes"].'</nombre-votes>';
        $xml .= '<services>'.$moyenne_services.'</services>';
        $xml .= '<qualite>'.$row["qualite"].'</qualite>';
        $xml .= '<prix>'.$row["rapport"].'</prix>';
        $xml .= '<label-transparence>'.$row["transparence"].'</label-transparence>';
        $xml .= '<moyenne-generale>'.$moyenne_ge.'</moyenne-generale>';
        $xml .= '<serveurs-dedies>'.$row["offreDedie"].'</serveurs-dedies>';
        $xml .= '</GSP>';
    }
    $xml .= '</GamerCertified>';
    echo $xml;
    Thanks

    Read the article

  • Dependency Injection Question - ASP.NET

    - by Paul
    I'm starting a web application that contains the following projects: Booking.Web, Booking.Services, Booking.DataObjects and Booking.Data. I'm using the repository pattern in my data project only. All services will be the same, no matter what happens. However, if a customer wants to use Access, it will use a different data repository than if the customer wants to use SQL Server. I have StructureMap, and want to be able to do the following: the web project is unaffected. It's a Web Forms application that will only know about the services project and the dataobjects project. When a service is called, it will use StructureMap (by looking up the bootstrapper.cs file) to see which data repository to use. An example of a services class is the error logging class:
    public class ErrorLog : IErrorLog
    {
        ILogging logger;

        public ErrorLog() { }

        public ErrorLog(ILogging logger)
        {
            this.logger = logger;
        }

        public void AddToLog(string errorMessage)
        {
            try
            {
                AddToDatabaseLog(errorMessage);
            }
            catch (Exception ex)
            {
                AddToFileLog(ex.Message);
            }
            finally
            {
                AddToFileLog(errorMessage);
            }
        }

        private void AddToDatabaseLog(string errorMessage)
        {
            ErrorObject error = new ErrorObject
            {
                ErrorDateTime = DateTime.Now,
                ErrorMessage = errorMessage
            };
            logger.Insert(error);
        }

        private void AddToFileLog(string errorMessage)
        {
            // TODO: Take this value from the web.config instead of hard coding it
            TextWriter writer = new StreamWriter(@"E:\Work\Booking\Booking\Booking.Web\Logs\ErrorLog.txt", true);
            writer.WriteLine(DateTime.Now.ToString() + " ---------- " + errorMessage);
            writer.Close();
        }
    }
    I want to be able to call this service from my web project, without defining which repository to use for the data access. My bootstrapper.cs file in the services project is defined as:
    public class Bootstrapper
    {
        public static void ConfigureStructureMap()
        {
            ObjectFactory.Initialize(x =>
            {
                x.AddRegistry(new ServiceRegistry());
            });
        }

        public class ServiceRegistry : Registry
        {
            protected override void configure()
            {
                ForRequestedType<IErrorLog>().TheDefaultIsConcreteType<Booking.Services.Logging.ErrorLog>();
                ForRequestedType<ILogging>().TheDefaultIsConcreteType<SqlServerLoggingProvider>();
            }
        }
    }
    What else do I need to get this to work? When I defined a test, the ILogging object was null. Thanks,
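    A minimal sketch of the missing wiring, assuming the StructureMap 2.x API used above (the Global.asax and factory class here are hypothetical, not from the question): the registry has to be run once at application start, and the service has to be resolved through the container so that StructureMap can pick the greediest constructor and inject the registered ILogging implementation.

        using StructureMap;

        public class Global : System.Web.HttpApplication
        {
            protected void Application_Start(object sender, System.EventArgs e)
            {
                // Runs the ServiceRegistry so the IErrorLog and ILogging mappings are known to the container.
                Bootstrapper.ConfigureStructureMap();
            }
        }

        public static class ServiceFactory
        {
            public static IErrorLog GetErrorLog()
            {
                // Resolves ErrorLog via its ILogging constructor rather than the parameterless one.
                return ObjectFactory.GetInstance<IErrorLog>();
            }
        }

    The same applies in a unit test: call Bootstrapper.ConfigureStructureMap() (or ObjectFactory.Initialize with a test registry) in the test setup; otherwise the logger field stays null because the parameterless constructor is used.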

    Read the article

  • ASP.NET MVC application architecture "guidelines"

    - by Joe Future
    I'm looking for some feedback on my ASP.NET MVC based CMS application architecture:
    Domain Model - depends on nothing but the System classes to define types. For now, mostly anemic.
    Repository Layer - abstracted data access, only called by the services layer.
    Services Layer - performs business logic on domain models. Exposes view models to the controllers.
    ViewModelMapper - service that translates back and forth between view models and domain models.
    Controllers - super thin "traffic cop" style functionality that interacts with the service layer and only talks in terms of view models, never domain models.
    My domain model is mostly used as data transfer (DTO) objects and has minimal logic at the moment. I'm finding this is nice because it depends on nothing (not even classes in the services layer). The services layer is a bit tricky... I only want the controllers to have access to view models for ease of GUI programming. However, some of the services need to talk to each other. For example, I have an eventing service that notifies other listener services when content is tagged, when blog posts are created, etc. Currently, the methods that take domain models as inputs or return them are marked internal so they can't be used by the controllers. Does that sound like overkill? Not enough abstraction? I'm mainly doing this as a learning exercise in being strict about architecture, not for an actual product, so please no feedback along the lines of ""right" depends on what you want to do". Thanks! Jason
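    To make the shape of those layers concrete, here is a rough, hypothetical sketch (type names invented, not the poster's code) of how a request would flow through them:

        using System.Web.Mvc;

        public class Page { public int Id { get; set; } public string Title { get; set; } }   // domain model (anemic)
        public class PageViewModel { public string Title { get; set; } }                      // what controllers and views see

        public interface IPageRepository        // repository layer: abstracted data access
        {
            Page GetById(int id);
        }

        public interface IViewModelMapper       // translates between domain models and view models
        {
            PageViewModel ToViewModel(Page page);
        }

        public class PageService                // services layer: business logic, exposes view models only
        {
            private readonly IPageRepository repository;
            private readonly IViewModelMapper mapper;

            public PageService(IPageRepository repository, IViewModelMapper mapper)
            {
                this.repository = repository;
                this.mapper = mapper;
            }

            public PageViewModel GetPage(int id)
            {
                return mapper.ToViewModel(repository.GetById(id));
            }
        }

        public class PagesController : Controller   // thin "traffic cop" controller
        {
            private readonly PageService service;

            public PagesController(PageService service)
            {
                this.service = service;
            }

            public ActionResult Details(int id)
            {
                // Only ever sees PageViewModel, never the Page domain model.
                return View(service.GetPage(id));
            }
        }

    Under this shape, the internal-methods question becomes: anything returning a domain model stays internal to the services assembly, while the public surface (PageService.GetPage here) deals strictly in view models.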

    Read the article

  • WCF service reference namespace differs from original

    - by Thorarin
    I'm having a problem regarding the namespaces used by my service references. I have a number of WCF services, say with the namespace MyCompany.Services.MyProduct (the actual namespaces are longer). As part of the product, I'm also providing a sample C# .NET website. This web application uses the namespace MyCompany.MyProduct. During initial development, the service was added as a project reference to the website and used directly. I used a factory pattern that returns an object instance that implements MyCompany.Services.MyProduct.IMyService. So far, so good. Now I want to change this to use an actual service reference. After adding the reference and typing MyCompany.Services.MyProduct in the namespace textbox, it generates classes in the namespace MyCompany.MyProduct.MyCompany.Services.MyProduct. BAD! I don't want to have to change using directives in several places just because I'm using a proxy class. So I tried prepending the namespace with global::, but that is not accepted. Note that I hadn't even deleted the original assembly references yet, and "reuse types" is enabled, but no reusing was done, apparently. However, I don't want to keep the assembly references around in my sample website for it to work anyway. The only solution I've come up with so far is setting the default namespace for my web application to MyCompany (because it cannot be empty), and adding the service reference as Services.MyProduct. Suppose that a customer wants to use my sample website as a starting point and changes the default namespace to OtherCompany.Whatever; this will obviously break my workaround. Is there a good solution to this problem? To summarize: I want to generate a service reference proxy in the original namespace, without referencing the assembly. Note: I have seen this question, but there was no solution provided that is acceptable for my use case. Edit: As John Saunders suggested, I've submitted some feedback to Microsoft about this: Feedback item @ Microsoft Connect

    Read the article

  • Implementation of MVC with SQLite and NSURLConnection, use cases?

    - by user324723
    I'm interested in knowing how others have implemented/designed database and web services in their iPhone apps and how they simplified it for the entire application. My application is dependent on these services and I can't figure out an efficient way to use them together, due to the (semi)complexity of my requirements. My past attempts at combining them haven't been completely successful, or at least not optimal in my mind. I'm building a database-driven iPhone app that uses a relational database in SQLite and consumes web services based on missing content or user interaction. Like this hasn't been done before...right? Since I am using a relational database, any web services consumed require normalization, parsing the result and persisting it to the database before it can be displayed in a table view controller. The application's UI consists of nested (nav controller) table views where a user can select a cell and be taken to the next table view, which attempts to populate its data source from the database. If nothing exists in the database then it will send a request via web services to download its content, thus download - parse - persist - query - display. Since the user has the ability to request a refresh of this data, a refresh still requires the same process. Quickly describing what I've implemented and tried to run with:
    1st attempt - Used a singleton web service class that handled sending web service requests, parsing the result and returning it to the table view controller via delegate protocols. Once the controller received that data it would then be responsible for persisting it to the database and re-returning the result. I didn't like that the only crash protection was checking whether the delegate selector still exists (i.e. the delegate hasn't been released) before calling it.
    2nd attempt - Used NSNotificationCenter for easy access to both database and web services, but later realized it was more complex due to adding and removing observers per view (which isn't advised anyway).

    Read the article

  • RIA Service/oData ... "Requests that attempt to access a single element using key values from a resu

    - by user327911
    I've recently started working up a sample project to play with an OData feed coming from a RIA service. I am able to view the feed and the metadata via any web browser; however, if I try to perform certain query operations on the feed I receive "unsupported" exceptions. Sample OData feed:
    ProductSet
    http://localhost:50880/Services/Rebirth-Web-Services-ProductService.svc/OData/ProductSet/
    2010-04-28T14:02:10Z
    http://localhost:50880/Services/Rebirth-Web-Services-ProductService.svc/OData/ProductSet(guid'b0a2b170-c6df-441f-ae2a-74dd19901128')
    2010-04-28T14:02:10Z
    b0a2b170-c6df-441f-ae2a-74dd19901128 Product 0 Type 1 Active
    Sample web.config entry:
    Sample service:
    [EnableClientAccess()]
    public class ProductService : DomainService
    {
        [Query(IsDefault = true)]
        public IQueryable<Product> GetProducts()
        {
            IList<Product> products = new List<Product>();
            for (int i = 0; i < 90; i++)
            {
                Product product = new Product
                {
                    Id = Guid.NewGuid(),
                    Name = "Product " + i.ToString(),
                    ProductType = i < 30 ? "Type 1" : ((i > 30 && i < 60) ? "Type 2" : "Type 3"),
                    Status = i % 2 == 0 ? "Active" : "NotActive"
                };
                products.Add(product);
            }
            return products.AsQueryable();
        }
    }
    If I provide the URL "http://localhost:50880/Services/Rebirth-Web-Services-ProductService.svc/OData/ProductSet(guid'b0a2b170-c6df-441f-ae2a-74dd19901128')" to my web browser I receive the following XML: Requests that attempt to access a single element using key values from a result set are not supported. I'm new to RIA and OData. Could this be something as simple as my web browsers not supporting this type of querying on the result set, or something else? Thanks ahead! Corey

    Read the article

  • Introducing the Earthquake Locator – A Bing Maps Silverlight Application, part 1

    - by Bobby Diaz
    Update: Live demo and source code now available!  The recent wave of earthquakes (no pun intended) being reported in the news got me wondering about the frequency and severity of earthquakes around the world. Since I’ve been doing a lot of Silverlight development lately, I decided to scratch my curiosity with a nice little Bing Maps application that will show the location and relative strength of recent seismic activity. Here is a list of technologies this application will utilize, so be sure to have everything downloaded and installed if you plan on following along. Silverlight 3 WCF RIA Services Bing Maps Silverlight Control * Managed Extensibility Framework (optional) MVVM Light Toolkit (optional) log4net (optional) * If you are new to Bing Maps or have not signed up for a Developer Account, you will need to visit www.bingmapsportal.com to request a Bing Maps key for your application. Getting Started We start out by creating a new Silverlight Application called EarthquakeLocator and specify that we want to automatically create the Web Application Project with RIA Services enabled. I cleaned up the web app by removing the Default.aspx and EarthquakeLocatorTestPage.html. Then I renamed the EarthquakeLocatorTestPage.aspx to Default.aspx and set it as my start page. I also set the development server to use a specific port, as shown below. RIA Services Next, I created a Services folder in the EarthquakeLocator.Web project and added a new Domain Service Class called EarthquakeService.cs. This is the RIA Services Domain Service that will provide earthquake data for our client application. I am not using LINQ to SQL or Entity Framework, so I will use the <empty domain service class> option. We will be pulling data from an external Atom feed, but this example could just as easily pull data from a database or another web service. This is an important distinction to point out because each scenario I just mentioned could potentially use a different Domain Service base class (i.e. LinqToSqlDomainService<TDataContext>). Now we can start adding Query methods to our EarthquakeService that pull data from the USGS web site. Here is the complete code for our service class: using System; using System.Collections.Generic; using System.IO; using System.Linq; using System.ServiceModel.Syndication; using System.Web.DomainServices; using System.Web.Ria; using System.Xml; using log4net; using EarthquakeLocator.Web.Model;   namespace EarthquakeLocator.Web.Services {     /// <summary>     /// Provides earthquake data to client applications.     /// </summary>     [EnableClientAccess()]     public class EarthquakeService : DomainService     {         private static readonly ILog log = LogManager.GetLogger(typeof(EarthquakeService));           // USGS Data Feeds: http://earthquake.usgs.gov/earthquakes/catalogs/         private const string FeedForPreviousDay =             "http://earthquake.usgs.gov/earthquakes/catalogs/1day-M2.5.xml";         private const string FeedForPreviousWeek =             "http://earthquake.usgs.gov/earthquakes/catalogs/7day-M2.5.xml";           /// <summary>         /// Gets the earthquake data for the previous week.         
/// </summary>         /// <returns>A queryable collection of <see cref="Earthquake"/> objects.</returns>         public IQueryable<Earthquake> GetEarthquakes()         {             var feed = GetFeed(FeedForPreviousWeek);             var list = new List<Earthquake>();               if ( feed != null )             {                 foreach ( var entry in feed.Items )                 {                     var quake = CreateEarthquake(entry);                     if ( quake != null )                     {                         list.Add(quake);                     }                 }             }               return list.AsQueryable();         }           /// <summary>         /// Creates an <see cref="Earthquake"/> object for each entry in the Atom feed.         /// </summary>         /// <param name="entry">The Atom entry.</param>         /// <returns></returns>         private Earthquake CreateEarthquake(SyndicationItem entry)         {             Earthquake quake = null;             string title = entry.Title.Text;             string summary = entry.Summary.Text;             string point = GetElementValue<String>(entry, "point");             string depth = GetElementValue<String>(entry, "elev");             string utcTime = null;             string localTime = null;             string depthDesc = null;             double? magnitude = null;             double? latitude = null;             double? longitude = null;             double? depthKm = null;               if ( !String.IsNullOrEmpty(title) && title.StartsWith("M") )             {                 title = title.Substring(2, title.IndexOf(',')-3).Trim();                 magnitude = TryParse(title);             }             if ( !String.IsNullOrEmpty(point) )             {                 var values = point.Split(' ');                 if ( values.Length == 2 )                 {                     latitude = TryParse(values[0]);                     longitude = TryParse(values[1]);                 }             }             if ( !String.IsNullOrEmpty(depth) )             {                 depthKm = TryParse(depth);                 if ( depthKm != null )                 {                     depthKm = Math.Round((-1 * depthKm.Value) / 100, 2);                 }             }             if ( !String.IsNullOrEmpty(summary) )             {                 summary = summary.Replace("</p>", "");                 var values = summary.Split(                     new string[] { "<p>" },                     StringSplitOptions.RemoveEmptyEntries);                   if ( values.Length == 3 )                 {                     var times = values[1].Split(                         new string[] { "<br>" },                         StringSplitOptions.RemoveEmptyEntries);                       if ( times.Length > 0 )                     {                         utcTime = times[0];                     }                     if ( times.Length > 1 )                     {                         localTime = times[1];                     }                       depthDesc = values[2];                     depthDesc = "Depth: " + depthDesc.Substring(depthDesc.IndexOf(":") + 2);                 }             }               if ( latitude != null && longitude != null )             {                 quake = new Earthquake()                 {                     Id = entry.Id,                     Title = entry.Title.Text,                     Summary = entry.Summary.Text,                     Date = entry.LastUpdatedTime.DateTime,                     Url = 
entry.Links.Select(l => Path.Combine(l.BaseUri.OriginalString,                         l.Uri.OriginalString)).FirstOrDefault(),                     Age = entry.Categories.Where(c => c.Label == "Age")                         .Select(c => c.Name).FirstOrDefault(),                     Magnitude = magnitude.GetValueOrDefault(),                     Latitude = latitude.GetValueOrDefault(),                     Longitude = longitude.GetValueOrDefault(),                     DepthInKm = depthKm.GetValueOrDefault(),                     DepthDesc = depthDesc,                     UtcTime = utcTime,                     LocalTime = localTime                 };             }               return quake;         }           private T GetElementValue<T>(SyndicationItem entry, String name)         {             var el = entry.ElementExtensions.Where(e => e.OuterName == name).FirstOrDefault();             T value = default(T);               if ( el != null )             {                 value = el.GetObject<T>();             }               return value;         }           private double? TryParse(String value)         {             double d;             if ( Double.TryParse(value, out d) )             {                 return d;             }             return null;         }           /// <summary>         /// Gets the feed at the specified URL.         /// </summary>         /// <param name="url">The URL.</param>         /// <returns>A <see cref="SyndicationFeed"/> object.</returns>         public static SyndicationFeed GetFeed(String url)         {             SyndicationFeed feed = null;               try             {                 log.Debug("Loading RSS feed: " + url);                   using ( var reader = XmlReader.Create(url) )                 {                     feed = SyndicationFeed.Load(reader);                 }             }             catch ( Exception ex )             {                 log.Error("Error occurred while loading RSS feed: " + url, ex);             }               return feed;         }     } }   The only method that will be generated in the client side proxy class, EarthquakeContext, will be the GetEarthquakes() method. The reason being that it is the only public instance method and it returns an IQueryable<Earthquake> collection that can be consumed by the client application. GetEarthquakes() calls the static GetFeed(String) method, which utilizes the built in SyndicationFeed API to load the external data feed. You will need to add a reference to the System.ServiceModel.Web library in order to take advantage of the RSS/Atom reader. The API will also allow you to create your own feeds to serve up in your applications. Model I have also created a Model folder and added a new class, Earthquake.cs. The Earthquake object will hold the various properties returned from the Atom feed. Here is a sample of the code for that class. Notice the [Key] attribute on the Id property, which is required by RIA Services to uniquely identify the entity. using System; using System.Collections.Generic; using System.Linq; using System.Runtime.Serialization; using System.ComponentModel.DataAnnotations;   namespace EarthquakeLocator.Web.Model {     /// <summary>     /// Represents an earthquake occurrence and related information.     /// </summary>     [DataContract]     public class Earthquake     {         /// <summary>         /// Gets or sets the id.         
/// </summary>         /// <value>The id.</value>         [Key]         [DataMember]         public string Id { get; set; }           /// <summary>         /// Gets or sets the title.         /// </summary>         /// <value>The title.</value>         [DataMember]         public string Title { get; set; }           /// <summary>         /// Gets or sets the summary.         /// </summary>         /// <value>The summary.</value>         [DataMember]         public string Summary { get; set; }           // additional properties omitted     } }   View Model The recent trend to use the MVVM pattern for WPF and Silverlight provides a great way to separate the data and behavior logic out of the user interface layer of your client applications. I have chosen to use the MVVM Light Toolkit for the Earthquake Locator, but there are other options out there if you prefer another library. That said, I went ahead and created a ViewModel folder in the Silverlight project and added a EarthquakeViewModel class that derives from ViewModelBase. Here is the code: using System; using System.Collections.ObjectModel; using System.ComponentModel.Composition; using System.ComponentModel.Composition.Hosting; using Microsoft.Maps.MapControl; using GalaSoft.MvvmLight; using EarthquakeLocator.Web.Model; using EarthquakeLocator.Web.Services;   namespace EarthquakeLocator.ViewModel {     /// <summary>     /// Provides data for views displaying earthquake information.     /// </summary>     public class EarthquakeViewModel : ViewModelBase     {         [Import]         public EarthquakeContext Context;           /// <summary>         /// Initializes a new instance of the <see cref="EarthquakeViewModel"/> class.         /// </summary>         public EarthquakeViewModel()         {             var catalog = new AssemblyCatalog(GetType().Assembly);             var container = new CompositionContainer(catalog);             container.ComposeParts(this);             Initialize();         }           /// <summary>         /// Initializes a new instance of the <see cref="EarthquakeViewModel"/> class.         /// </summary>         /// <param name="context">The context.</param>         public EarthquakeViewModel(EarthquakeContext context)         {             Context = context;             Initialize();         }           private void Initialize()         {             MapCenter = new Location(20, -170);             ZoomLevel = 2;         }           #region Private Methods           private void OnAutoLoadDataChanged()         {             LoadEarthquakes();         }           private void LoadEarthquakes()         {             var query = Context.GetEarthquakesQuery();             Context.Earthquakes.Clear();               Context.Load(query, (op) =>             {                 if ( !op.HasError )                 {                     foreach ( var item in op.Entities )                     {                         Earthquakes.Add(item);                     }                 }             }, null);         }           #endregion Private Methods           #region Properties           private bool autoLoadData;         /// <summary>         /// Gets or sets a value indicating whether to auto load data.         
/// </summary>         /// <value><c>true</c> if auto loading data; otherwise, <c>false</c>.</value>         public bool AutoLoadData         {             get { return autoLoadData; }             set             {                 if ( autoLoadData != value )                 {                     autoLoadData = value;                     RaisePropertyChanged("AutoLoadData");                     OnAutoLoadDataChanged();                 }             }         }           private ObservableCollection<Earthquake> earthquakes;         /// <summary>         /// Gets the collection of earthquakes to display.         /// </summary>         /// <value>The collection of earthquakes.</value>         public ObservableCollection<Earthquake> Earthquakes         {             get             {                 if ( earthquakes == null )                 {                     earthquakes = new ObservableCollection<Earthquake>();                 }                   return earthquakes;             }         }           private Location mapCenter;         /// <summary>         /// Gets or sets the map center.         /// </summary>         /// <value>The map center.</value>         public Location MapCenter         {             get { return mapCenter; }             set             {                 if ( mapCenter != value )                 {                     mapCenter = value;                     RaisePropertyChanged("MapCenter");                 }             }         }           private double zoomLevel;         /// <summary>         /// Gets or sets the zoom level.         /// </summary>         /// <value>The zoom level.</value>         public double ZoomLevel         {             get { return zoomLevel; }             set             {                 if ( zoomLevel != value )                 {                     zoomLevel = value;                     RaisePropertyChanged("ZoomLevel");                 }             }         }           #endregion Properties     } }   The EarthquakeViewModel class contains all of the properties that will be bound to by the various controls in our views. Be sure to read through the LoadEarthquakes() method, which handles calling the GetEarthquakes() method in our EarthquakeService via the EarthquakeContext proxy, and also transfers the loaded entities into the view model’s Earthquakes collection. Another thing to notice is what’s going on in the default constructor. I chose to use the Managed Extensibility Framework (MEF) for my composition needs, but you can use any dependency injection library or none at all. To allow the EarthquakeContext class to be discoverable by MEF, I added the following partial class so that I could supply the appropriate [Export] attribute: using System; using System.ComponentModel.Composition;   namespace EarthquakeLocator.Web.Services {     /// <summary>     /// The client side proxy for the EarthquakeService class.     /// </summary>     [Export]     public partial class EarthquakeContext     {     } }   One last piece I wanted to point out before moving on to the user interface, I added a client side partial class for the Earthquake entity that contains helper properties that we will bind to later: using System;   namespace EarthquakeLocator.Web.Model {     /// <summary>     /// Represents an earthquake occurrence and related information.     /// </summary>     public partial class Earthquake     {         /// <summary>         /// Gets the location based on the current Latitude/Longitude.         
/// </summary>         /// <value>The location.</value>         public string Location         {             get { return String.Format("{0},{1}", Latitude, Longitude); }         }           /// <summary>         /// Gets the size based on the Magnitude.         /// </summary>         /// <value>The size.</value>         public double Size         {             get { return (Magnitude * 3); }         }     } }   View Now the fun part! Usually, I would create a Views folder to place all of my View controls in, but I took the easy way out and added the following XAML code to the default MainPage.xaml file. Be sure to add the bing prefix associating the Microsoft.Maps.MapControl namespace after adding the assembly reference to your project. The MVVM Light Toolkit project templates come with a ViewModelLocator class that you can use via a static resource, but I am instantiating the EarthquakeViewModel directly in my user control. I am setting the AutoLoadData property to true as a way to trigger the LoadEarthquakes() method call. The MapItemsControl found within the <bing:Map> control binds its ItemsSource property to the Earthquakes collection of the view model, and since it is an ObservableCollection<T>, we get the automatic two way data binding via the INotifyCollectionChanged interface. <UserControl x:Class="EarthquakeLocator.MainPage"     xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"     xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"     xmlns:d="http://schemas.microsoft.com/expression/blend/2008"     xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"     xmlns:bing="clr-namespace:Microsoft.Maps.MapControl;assembly=Microsoft.Maps.MapControl"     xmlns:vm="clr-namespace:EarthquakeLocator.ViewModel"     mc:Ignorable="d" d:DesignWidth="640" d:DesignHeight="480" >     <UserControl.Resources>         <DataTemplate x:Key="EarthquakeTemplate">             <Ellipse Fill="Red" Stroke="Black" StrokeThickness="1"                      Width="{Binding Size}" Height="{Binding Size}"                      bing:MapLayer.Position="{Binding Location}"                      bing:MapLayer.PositionOrigin="Center">                 <ToolTipService.ToolTip>                     <StackPanel>                         <TextBlock Text="{Binding Title}" FontSize="14" FontWeight="Bold" />                         <TextBlock Text="{Binding UtcTime}" />                         <TextBlock Text="{Binding LocalTime}" />                         <TextBlock Text="{Binding DepthDesc}" />                     </StackPanel>                 </ToolTipService.ToolTip>             </Ellipse>         </DataTemplate>     </UserControl.Resources>       <UserControl.DataContext>         <vm:EarthquakeViewModel AutoLoadData="True" />     </UserControl.DataContext>       <Grid x:Name="LayoutRoot">           <bing:Map x:Name="map" CredentialsProvider="--Your-Bing-Maps-Key--"                   Center="{Binding MapCenter, Mode=TwoWay}"                   ZoomLevel="{Binding ZoomLevel, Mode=TwoWay}">             <bing:MapItemsControl ItemsSource="{Binding Earthquakes}"                                   ItemTemplate="{StaticResource EarthquakeTemplate}" />         </bing:Map>       </Grid> </UserControl>   The EarthquakeTemplate defines the Ellipse that will represent each earthquake, the Width and Height that are determined by the Magnitude, the Position on the map, and also the tooltip that will appear when we mouse over each data point. 
Running the application will give us the following result (shown with a tooltip example): That concludes this portion of our show but I plan on implementing additional functionality in later blog posts. Be sure to come back soon to see the next installments in this series. Enjoy!   Additional Resources USGS Earthquake Data Feeds Brad Abrams shows how RIA Services and MVVM can work together

    Read the article

  • Need help with software licensing? Read on…

    - by juanlarios
    Figuring out which software licensing options best suit your needs while being cost-effective can be confusing. Some businesses end up making their purchases through retail stores which means they miss out on volume licensing opportunities and others may unknowingly be using unlicensed software which means their business may be at risk. So let me help you make the best decision for your situation. You may want to review this blog post that lays out licensing basics for any organization that needs to license software for more than 5 or less than 250 devices or users. It details the different ways you can buy a license and what choices are available for volume licensing, which can give you pricing advantages and provide flexible options for your business. As technology evolves and more organizations move to online services such as Microsoft Office 365, Microsoft Dynamics CRM Online, Windows Azure Platform, Windows Intune and others, it’s important to understand how to purchase, activate and use online service subscriptions to get the most out of your investment. Once purchased through a volume licensing agreement or the Microsoft Online Subscription Program, these services can be managed through web portals: · Online Services Customer Portal (Microsoft Office 365, Microsoft Intune) · Dynamics CRM Online Customer Portal (Microsoft Dynamics CRM Online) · Windows Azure Customer Portal (Windows Azure Platform) · Volume Licensing Service Center (other services) Learn more >> Licensing Resources: The SMB How to Buy Portal – receive clear purchasing and licensing information that is easy to understand in order to help facilitate quick decision making. Microsoft License Advisor (MLA) – Use MLA to research Microsoft Volume Licensing products, programs and pricing. Volume Licensing Service Center (VLSC) – Already have a volume License? Use the VLSC to get you easy access to all your licensing information in one location. Online Services – licensing information for off-premise options. Windows 7 Comparison: – Compare versions of Windows and find out which one is right for you. Office 2010 Comparison: – Find out which Office suite is right for you. Licensing FAQs – Frequently Asked Questions About Product Licensing. Additional Resources You May Find Useful: · TechNet Evaluation Center Try some of our latest Microsoft products For free, Like System Center 2012 Pre-Release Products, and evaluate them before you buy. · Springboard Series Your destination for technical resources, free tools and expert guidance to ease the deployment and management of your Windows-based client infrastructure.   · AlignIT Manager Tech Talk Series A monthly streamed video series with a range of topics for both infrastructure and development managers.  Ask questions and participate real-time or watch the on-demand recording.

    Read the article

  • Silverlight Cream for May 04, 2010 -- #855

    - by Dave Campbell
    In this Issue: John Papa, Adam Kinney, Mike Taulty, Kirupa, Gunnar Peipman, Mike Snow (-2-, -3-), Jesse Liberty, and Lee.
    Shoutout: Jeff Wilcox announced Silverlight Unit Test Framework: New version in the April 2010 Silverlight Toolkit.
    From SilverlightCream.com:
    · Silverlight TV 23: MVP Q&A with WWW (Wildermuth, Wahlin and Ward) – John Papa has Silverlight TV 23 up, a panel discussion between Shawn Wildermuth, Dan Wahlin, Ward Bell and John... wow... what a crew!
    · Design-time Resources in Expression Blend 4 RC – Adam Kinney reports on the new feature of Expression Blend 4 RC that loads resources at design time. Adam also has a project available to demonstrate the concepts he's explaining.
    · Silverlight and WCF RIA Services (1 - Overview) – Mike Taulty is starting a series on WCF RIA Services. This first one is an overview and it looks to be a good series, as expected.
    · Introduction to Sample Data - Page 1 – Kirupa has a great 5-part post up about sample data in Expression Blend.
    · Windows Phone 7 development: Using WebBrowser control – Gunnar Peipman posted about using the web browser control in WP7 to display RSS data. Good stuff, and all the code too.
    · Silverlight Tip of the Day #10 – Converting Client IP to Geographical Location – Mike Snow's Tip #10 is about taking an IP address and getting a geographical location from it. Combine this with his Tip #9, which retrieves the IP address.
    · Silverlight Tip of the Day #11 – Deploying Silverlight Applications with WCF web services – Mike Snow's Tip #11 is much bigger than most... it's almost an end-to-end solution for creating and deploying a WCF service, including resolving problems.
    · Silverlight Tip of the Day #12 – Getting an Image's Source File Name – Mike Snow also has Tip #12 up, and it's a quick one on getting the original source file name for an image you've loaded.
    · Screen Scraping – When All You Have Is A Hammer… – Jesse Liberty posted his solution to a self-imposed problem and ended up writing a 'mini tutorial on using Silverlight for creating desk-top utilities'... all with source.
    · RIA services and combobox lookups – Lee has a post up about RIA Services and setting up comboboxes for lookups. Lots of source in the post and a full project download.
    Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream. Join me @ SilverlightCream | Phoenix Silverlight User Group.
    Technorati Tags: Silverlight, Silverlight 3, Silverlight 4, Windows Phone, MIX10

    Read the article

  • WebCenter Customer Spotlight: Instituto Mexicano de la Propiedad Industrial

    - by me
    Author: Peter Reiser - Social Business Evangelist, Oracle WebCenter
    Solution Summary
    Instituto Mexicano de la Propiedad Industrial (IMPI) is a decentralized federal agency with the goals of protecting and ensuring awareness of industrial property rights in Mexico. IMPI's business objectives were to increase efficiency, improve client service, accelerate services to the public, and reduce paper use by digitizing management of the documentation needed for patent and trademark submissions and approvals. IMPI implemented Oracle WebCenter Content to develop an electronic inquiry service by digitizing and managing documents, and a public web site that makes patent-related information easily available online. With the implemented solution, IMPI increased the number of monthly inquiries from 200 in-person consultations to 80,000 electronic consultations, and the number of trademark record inquiries from 30,000 to 300,000.
    Company Overview
    Instituto Mexicano de la Propiedad Industrial (IMPI) is a decentralized federal agency with the goals of protecting and ensuring awareness of industrial property rights in Mexico. IMPI is responsible for registering and publicizing inventions, distinctive signs, trademarks, and patents. In addition to its Mexico City headquarters, IMPI has five regional offices.
    Business Challenges
    IMPI's business objectives were to increase efficiency by automating internal operations and patent and trademark-related procedures and services, improve client service by simplifying patent and trademark procedures, accelerate services to the public, and reduce paper use by digitizing management of the documentation needed for patent and trademark submissions and approvals.
    Solution Deployed
    IMPI worked with Oracle Consulting to implement Oracle WebCenter Content to develop an electronic inquiry service - services that were previously provided in person only - by digitizing and managing documents. IMPI uses Oracle Database 11g, Enterprise Edition to manage data for all mission-critical systems, automating patent and trademark transactions and providing consistent, readily available, and accurate data. IMPI developed a web site to support the newly digitized information with simple and flexible interfaces, making patent-related information easily available online to the public.
    Business Results
    With the implemented solution, IMPI increased the number of monthly inquiries from 200 in-person consultations to 80,000 electronic consultations, and the number of trademark record inquiries from 30,000 to 300,000.
    “Oracle WebCenter Content structure is unique. It lets us separately manage communication with other applications and databases, and performs content management itself. It’s a stable tool, at an appropriate cost, that lets us develop and provide reliable electronic services.” - Eugenio Ponce de León, Divisional Director of Systems and Technology, Instituto Mexicano de la Propiedad Industrial
    Additional Information
    Instituto Mexicano de la Propiedad Industrial Customer Snapshot | Oracle WebCenter Content

    Read the article

  • Five things SSIS should drop

    - by jamiet
    There’s a current SQL Server meme going round entitled "Five things SQL Server should drop" and, whilst no-one tagged me to write anything, I couldn’t resist doing the same for SQL Server Integration Services. So, without further ado, here are five things that I think should be dropped from SSIS.
    Data source connections
    Seriously, does anyone use these? I know why they’re there. Someone sat in a meeting back in the early part of the last decade and said “Ooo, Reporting Services and Analysis Services have these things called Data Sources. If we used them in Integration Services then we’d have a really cool integration story.” Errr…. no.
    Web Service Task
    Ditto. If you want to do anything useful against anything but the simplest of SOAP web services, steer well clear of this peculiar SSIS addition.
    ActiveX Script Task
    Another task that I suspect has never seen the light of day in a SSIS package. It was billed as a way of running upgraded DTS 2000 ActiveX scripts in SSIS – sounds good except for one thing. Any time one of those scripts tries to talk to the DTS object model (which they all do – otherwise what’s the point) it will error out. This one has always been a real head scratcher.
    Slowly Changing Dimension wizard
    I suspect I may get some push back on this one but I’m mentioning it anyway. Some people like the SCD wizard; I am not one of those people! Everything that the SCD component does can easily be reproduced using other components, and from a performance point of view it’s much more beneficial to use those alternatives.
    Multifile Connection Manager
    Imagine buying a house that came with a set of keys that didn’t open any of the doors. Sounds ridiculous, right? How about a SSIS Connection Manager that doesn’t get used by any of the tasks or components? Ah, that’ll be the Multifile Connection Manager then!
    Comments are of course welcome. Diatribes are assumed :)
    @Jamiet

    Read the article

  • Oracle Utilities Application Framework V4.2.0.0.0 Released

    - by ACShorten
    The Oracle Utilities Application Framework V4.2.0.0.0 has been released with Oracle Utilities Customer Care and Billing V2.4. This release includes new functionality and updates to existing functionality and will be progressively released across the Oracle Utilities applications. The release is quite substantial, with lots of new and exciting changes. The release notes shipped with the product include a summary of the changes implemented in V4.2.0.0.0, including the following:
    · Configuration Migration Assistant (CMA) - a new data management capability that allows you to export and import configuration data from one environment to another, with support for approval/rejection of individual changes.
    · Database Connection Tagging - additional tags have been added to the database connection to allow database administrators, Oracle Enterprise Manager, and other Oracle technology to monitor and use individual database connection information.
    · Native Support for Oracle WebLogic - in the past the Oracle Utilities Application Framework used Oracle WebLogic in embedded mode; now, to support advanced configuration and the ExaLogic platform, we are adding native support for Oracle WebLogic as a configuration option.
    · Native Web Services Support - in the past the Oracle Utilities Application Framework supplied a servlet to handle Web Services calls; now we offer an alternative that uses the native Web Services capability of Oracle WebLogic. This allows for enhanced clustering, a greater level of Web Service standards support, enhanced security options, and the ability to use the Web Services management capabilities in Oracle WebLogic to implement higher levels of management, including defining additional security rules to control access to individual Web Services.
    · XML Data Type Support - Oracle Utilities Application Framework now allows implementors to use XML data types in Oracle in the definition of custom objects, to take advantage of XQuery and other XML features.
    · Fuzzy Operator Support - Oracle Utilities Application Framework supports the use of the fuzzy operator in conjunction with Oracle Text to take advantage of the fuzzy searching capabilities within the database.
    · Global Batch View - a new JMX based API has been implemented to allow JSR120 compliant consoles to view batch execution across all threadpools in the Coherence based named cache cluster.
    · Portal Personalization - it is now possible to store runtime customizations of query zones, such as preferred sorting, field order, and filters, and reuse them as personal preferences each time that zone is used.
    These are just the major changes, and there are quite a few more that have been delivered (and more to come in the service packs!!). Over the next few weeks we will be publishing new whitepapers and new entries in this blog outlining new facilities that you may want to take advantage of.

    Read the article

  • The value of money

    - by ambreesh
    A dictionary definition of money is "any circulating medium of exchange, including coins, paper money, and demand deposits". If you ask an economist for a definition of money, you will be introduced to terms like M1, M2, M3, all of which denote tangible assets - currency, and anything that is liquid enough to be used as currency; checks, stamps, and now mobile minutes being examples. The macroeconomic theory of money is fascinating - the effect of money supply on exchange rates and interest rates, and the concept of the "money multiplier": if I deposit $10 into a bank, the bank will likely loan $8 of it to someone else, who will then give it to someone else in exchange for goods and services, who will then likely deposit it again, which will result in the bank loaning it again, and so on - making that $10 of money supply worth a lot more ($10 + $8 + $x + ...). But all this depends on money supply - in other words, money that is printed by the mint. The Treasury Department spends a lot of time figuring out how much money to print; there is a lot being written on QE2 nowadays, which is intended to increase the money supply. Money is used to purchase goods and services, and yes, it is saved too, but that is so one can purchase goods and services later.
    Completely unrelated, there is a sea change occurring in the web world, dominated by, I believe, Facebook. With 500M active users and growing, FB has the ability to introduce a "money supply" which is completely unrelated to today's "money". Using today's money, a FB user can buy a certain number of FB$s, and then use the FB$s within FB to purchase goods and services - with the money multiplier kicking in. I remember talking with a colleague about this a few years ago; the true way to monetize the web is to introduce an alternative system to the existing one, and FB has the ability to do just that. There is enough momentum, enough mass, for FB to start to monetize its user base. And completely screw up the economists at the Treasury, not to mention disintermediating the banks completely.
    The only other ubiquitous asset is mobile minutes. People exchanging mobile minutes for tangible goods and services happens today; the big difference, however, is the demographic. While Safaricom offers this ability in Kenya today, FB has the 15-to-40-year-old middle-class user as its user. And the next generation is growing up with FB as a standard channel for communicating with their peers. Virtual flowers when going in for the kill? If your target is an avid FB user, why not? It certainly is a lot more green - no pun intended!
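    To make the multiplier concrete, here is the geometric-series arithmetic behind the $10/$8 example above, under the simplifying assumption that the bank re-lends 80% of every deposit indefinitely (real reserve requirements and leakage vary):

    \[ M = 10 + 10(0.8) + 10(0.8)^2 + \cdots = \sum_{k=0}^{\infty} 10\,(0.8)^k = \frac{10}{1 - 0.8} = 50 \]

    In other words, a $10 deposit can in principle support about $50 of total money supply - a multiplier of 1/(1 - 0.8) = 5.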

    Read the article

  • Oracle CRM For Public Sector, Commercial Business, Education

    - by michael.seback
    Chongqing Transport Commission Improves Management of Transport Projects
    The Chongqing Transport Commission is responsible for public passenger, road, and waterway transport in urban and rural areas of Chongqing. The commission administers the region's road and water industry; oversees the construction of transport infrastructure; and manages civil aviation, railroads, roads, waterways, ports, and wharves. "After studying the IT initiatives of other provincial transport commissions, we decided to use Siebel Public Sector to build our integrated transport service system. The Siebel software offers powerful functions that allow us to integrate information and improve the management of our road, rail, and waterway infrastructure projects." - Chen Xiaoming, Vice Director, Information Center, Chongqing Transport Commission. Read more here.
    Siemens Information Services Increases Productivity by 20%
    Siemens Information Services Pvt. Ltd. provides back-office account processing services to Siemens' vendors. The company works with Siemens' healthcare, energy, and industry divisions in Europe, the United States, and parts of the Asia-Pacific region. It handles financial services such as processing payroll, accounts data, purchase orders, invoices, and payments, and also creates service catalogs for customers and internal teams. "Oracle CRM On Demand provides us with a complete view of each customer's data from the moment they log a request to the time we close it. This has eliminated manual requests and improved the service we offer to our clients across the Asia-Pacific region." - Sunil Zutshi, General Manager, IT, Siemens Information Services Pvt. Ltd. Read more here.
    China Distance Education Holdings Improves Call Center Productivity by 24%
    China Distance Education Holdings Limited is a leading provider of online education. The organization offers 174 courses through 16 web sites, including accounting, healthcare, law, and engineering. In 2010, 215,000 students were enrolled. "Online education is a fast growing sector in China. To maintain our competitiveness, we implemented Oracle Contact Center Anywhere to make it easier and faster for our call center staff to respond to student enquiries. As a result, their productivity increased by 24%." - Qin Songjiang, Chief Technology Officer, China Distance Education Holdings Limited. Read more here.

    Read the article

  • SharePoint 2010 Hosting :: How to Enable Office Web Apps on SharePoint 2010

    - by mbridge
    Office Web Apps is the online version of Microsoft Office 2010, which is very helpful if you are going to use SharePoint 2010 in your organization as it allows you to do basic editing of Word documents without installing the Office suite on the client machine.
    Prerequisites:
    - Microsoft Windows Server 2008 R2
    - Microsoft SharePoint Server 2010 or Microsoft SharePoint Foundation 2010
    - Microsoft Office Web Apps
    If you have installed all the above products, just follow these steps:
    1. Go to Central Administration > click on Manage Service Applications.
    2. All the menus are now displayed in the ribbon menu format first introduced in Office 2007. Click on New > Word Viewing Service (you can also choose PowerPoint or Excel; the steps are the same). This will open a pop-up window.
    (Screenshot: Adding Services for Office Web Apps)
    3. Give it a proper name, which can include your company or project name.
    4. Under Application Pool select: SharePoint Web Services Default.
    5. Next, keep the check box checked which says "Add this service application's proxy to the farm's default proxy list." Click OK.
    (Screenshots: Adding Word Viewer as Service Application; Office Web Apps as Services in SharePoint 2010)
    6. This will install all the Office Web Apps services required. You can see the name you gave in the step above.
    How to activate Office Web Apps in a site collection? (A programmatic sketch follows at the end of this entry.)
    1. Go to the site for which you want to activate this feature.
    2. Click on Site Actions > Site Settings > Site Collection Administration > Site Collection Features.
    3. Activate Office Web Apps.
    (Screenshot: Activate Office Web Apps Feature in Site Collection)
    How to make sure Office Web Apps is working for your site collection?
    1. Locate any Office document you have and click on the smart menu which appears when you hover your mouse over it. Don't double-click, as this will launch the document in the Office client if it is installed (this behavior can be changed).
    2. If you see "View in Browser" or "Edit in Browser" as a menu item, your Office Web Apps is configured correctly.
    (Screenshots: View/Edit Office Document in Browser; Editing Office Document in Browser)
    Other posts related to SharePoint 2010:
    1. How to Configure SharePoint Foundation 2010 for SharePoint Workspace 2010
    2. Integrating SharePoint 2010 and SQL 2008 R2
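    As referenced above, here is a minimal sketch of activating the same site collection feature from code using the SharePoint server object model, for anyone scripting the setup rather than clicking through the UI. The site URL is an example and the feature GUID is a placeholder, not the real Office Web Apps feature ID; look the ID up on your own farm (for example under 14\TEMPLATE\FEATURES) before using anything like this.

    using System;
    using Microsoft.SharePoint;

    namespace OfficeWebAppsSetup
    {
        class Program
        {
            static void Main(string[] args)
            {
                // Placeholder GUID - replace with the actual Office Web Apps
                // site collection feature ID from your farm.
                Guid officeWebAppsFeatureId = new Guid("00000000-0000-0000-0000-000000000000");

                // Open the site collection where the feature should be activated.
                using (SPSite site = new SPSite("http://sharepoint/sites/team"))
                {
                    if (site.Features[officeWebAppsFeatureId] == null)
                    {
                        // Add() activates the feature; the second argument forces
                        // activation even if it was previously activated.
                        site.Features.Add(officeWebAppsFeatureId, true);
                        Console.WriteLine("Office Web Apps feature activated.");
                    }
                    else
                    {
                        Console.WriteLine("Feature is already active.");
                    }
                }
            }
        }
    }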

    Read the article

  • Pro SharePoint 2010 Business Intelligence Solutions

    - by Sahil Malik
    Oh yeah baby, it’s out finally! This book is what I wanted to write for so long now, but never really got a chance to. For SharePoint 2007, I authored the SharePoint section of “Smart BI Solutions with SQL Server 2008” for MS Press. But I never really got the time to author the full book that this topic deserved. With SharePoint 2010, we finally have a full book on this topic.
    So first things first, I didn’t actually write it. My role was limited to the overall concept, the outline, the layout, completion of it, code samples, identifying what we need in here, vouching for technical accuracy, identifying authors, etc. The real work was done by Srini (5 chapters) and Steve (1 chapter). So credit given where it is due. But, with that said, this is a pretty good book.
    It has always been a challenge to find the superman that knows both data warehousing concepts and SharePoint concepts. The data warehousing concepts include the basics you need to know to work in the BI area, such as cubes, MDX queries, etc. So chapter 1 covers that - and if you’re a hardcore DBA, feel free to skip chapter 1. Then beyond that, we take every single SharePoint 2010 BI topic and slice and dice it in detail. The topics we deal with are:
    · Visio Services
    · Reporting Services
    · Business Connectivity Services
    · Excel Services
    · PerformancePoint Services
    In covering each of these topics, we follow a general layout in every chapter to ensure completeness of content. We make sure we cover:
    · Setup-related issues and advice
    · Point-and-click usage
    · Code usage, i.e. extensibility using Visual Studio
    · A walkthrough of the administration side of things, including PowerShell (yes, I insisted on that being there in every chapter).
    Writing a book is always a lot of work, so we hope you find it useful. And it should go very well with the other book I just reviewed, which is Microsoft ADO.NET 4, Step by Step.

    Read the article

  • Enterprise Service Bus (ESB): Important architectural piece to a SOA or is it just vendor hype?

    Is an Enterprise Service Bus (ESB) an important architectural piece of a Service-Oriented Architecture (SOA), or is it just vendor hype in order to sell a particular product such as SOA-in-a-box? According to IBM.com, an ESB is a flexible connectivity infrastructure for integrating applications and services; it offers a flexible and manageable approach to service-oriented architecture implementation. With this being said, it is my personal belief that ESBs are an important architectural piece of any SOA. Additionally, generic design patterns have been created around the integration of web services into an ESB regardless of vendor. ESB design patterns, according to Philip Hartman, can be classified into the following categories:
    · Interaction Patterns: enable service interaction points to send and/or receive messages from the bus
    · Mediation Patterns: enable the altering of message exchanges
    · Deployment Patterns: support solution deployment into a federated infrastructure
    Examples of Interaction Patterns:
    · One-Way Message
    · Synchronous Interaction
    · Asynchronous Interaction
    · Asynchronous Interaction with Timeout
    · Asynchronous Interaction with a Notification Timer
    · One Request, Multiple Responses
    · One Request, One of Two Possible Responses
    · One Request, a Mandatory Response, and an Optional Response
    · Partial Processing
    · Multiple Application Interactions
    Benefits of the Mediator Pattern (which ESB mediation generalizes):
    · The mediator promotes loose coupling by keeping objects from referring to each other explicitly, and it lets you vary their interaction independently
    · Design an intermediary to decouple many peers
    · Promote the many-to-many relationships between interacting peers to “full object status”
    Examples of Deployment Patterns:
    · Global ESB: services share a single namespace and all service providers are visible to every service requester across an entire network
    · Directly Connected ESB: a global service registry enables independent ESB installations to be visible to each other
    · Brokered ESB: bridges services that are reluctant to expose requesters or providers to ESBs in other domains
    · Federated ESB: service consumers and providers connect to the master or to a dependent ESB to access services throughout the network
    References:
    · Mediator Design Pattern. (2011). Retrieved 2011 from SourceMaking.com: http://sourcemaking.com/design_patterns/mediator
    · Hartman, P. (2006, January 24). ESB Patterns that "Click". Retrieved 2011 from The Art and Science of Being an IT Architect: http://artsciita.blogspot.com/2006/01/esb-patterns-that-click.html
    · IBM. (2011). WebSphere DataPower XC10 Appliance Version 2.0. Retrieved 2011 from IBM.com: http://publib.boulder.ibm.com/infocenter/wdpxc/v2r0/index.jsp?topic=%2Fcom.ibm.websphere.help.glossary.doc%2Ftopics%2Fglossary.html
    · Oracle. (2005). 12 Interaction Patterns. Retrieved 2011 from Oracle® BPEL Process Manager Developer's Guide: http://docs.oracle.com/cd/B31017_01/integrate.1013/b28981/interact.htm#BABHHEHD
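    Since the Mediator benefits above can feel abstract, here is a minimal C# sketch of the same idea scaled down to objects: peers never reference each other directly, only the mediating "bus", which routes messages and fans one request out to multiple receivers. All names here are illustrative assumptions, not part of any ESB product or of the cited references.

    using System;
    using System.Collections.Generic;

    // The mediator contract: peers only know this interface, never each other.
    public interface IServiceBus
    {
        void Register(string topic, Action<string> handler);
        void Send(string topic, string message);
    }

    public class SimpleServiceBus : IServiceBus
    {
        private readonly Dictionary<string, List<Action<string>>> routes =
            new Dictionary<string, List<Action<string>>>();

        public void Register(string topic, Action<string> handler)
        {
            if (!routes.ContainsKey(topic))
                routes[topic] = new List<Action<string>>();
            routes[topic].Add(handler);
        }

        public void Send(string topic, string message)
        {
            // The bus, not the sender, decides who receives the message,
            // which is what keeps the peers loosely coupled.
            List<Action<string>> handlers;
            if (routes.TryGetValue(topic, out handlers))
                foreach (var handler in handlers)
                    handler(message);
        }
    }

    public static class Demo
    {
        public static void Main()
        {
            IServiceBus bus = new SimpleServiceBus();

            // Two "services" register interest in a topic without knowing the sender.
            bus.Register("order.created", msg => Console.WriteLine("Billing saw: " + msg));
            bus.Register("order.created", msg => Console.WriteLine("Shipping saw: " + msg));

            // The order service publishes once; the mediator fans the message out
            // (a toy version of the One Request, Multiple Responses interaction).
            bus.Send("order.created", "order #42");
        }
    }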

    Read the article
