Search Results

Search found 14829 results on 594 pages for 'sharepoint api'.

Page 84/594

  • Is it possible to use Sharepoint 2007 without installing it?

    - by foxtrot
    Hi there! My company wants to buy SharePoint 2007 and asked me for an opinion. I've already watched a lot of videos and read e-books, but I would like to use it for a while, especially the integration with SharePoint Designer 2007. Is the only way to do that to install it on a trial basis? Is there any other way? Is there any publicly available installation? I only have an old 32-bit laptop with Windows XP. Thanks in advance!

    Read the article

  • I don't get prices with Amazon Product Advertising API

    - by Xarem
    I try to get prices of an ASIN number with the Amazon Product Advertising API. Code: $artNr = "B003TKSD8E"; $base_url = "http://ecs.amazonaws.de/onca/xml"; $params = array( 'AWSAccessKeyId' => self::API_KEY, 'AssociateTag' => self::API_ASSOCIATE_TAG, 'Version' => "2010-11-01", 'Operation' => "ItemLookup", 'Service' => "AWSECommerceService", 'Condition' => "All", 'IdType' => 'ASIN', 'ItemId' => $artNr); $params['Timestamp'] = gmdate("Y-m-d\TH:i:s.\\0\\0\\0\\Z", time()); $url_parts = array(); foreach(array_keys($params) as $key) $url_parts[] = $key . "=" . str_replace('%7E', '~', rawurlencode($params[$key])); sort($url_parts); $url_string = implode("&", $url_parts); $string_to_sign = "GET\necs.amazonaws.de\n/onca/xml\n" . $url_string; $signature = hash_hmac("sha256", $string_to_sign, self::API_SECRET, TRUE); $signature = urlencode(base64_encode($signature)); $url = $base_url . '?' . $url_string . "&Signature=" . $signature; $response = file_get_contents($url); $parsed_xml = simplexml_load_string($response); I think this should be correct - but I don't get offers in the response: SimpleXMLElement Object ( [OperationRequest] => SimpleXMLElement Object ( [RequestId] => ************************* [Arguments] => SimpleXMLElement Object ( [Argument] => Array ( [0] => SimpleXMLElement Object ( [@attributes] => Array ( [Name] => Condition [Value] => All ) ) [1] => SimpleXMLElement Object ( [@attributes] => Array ( [Name] => Operation [Value] => ItemLookup ) ) [2] => SimpleXMLElement Object ( [@attributes] => Array ( [Name] => Service [Value] => AWSECommerceService ) ) [3] => SimpleXMLElement Object ( [@attributes] => Array ( [Name] => ItemId [Value] => B003TKSD8E ) ) [4] => SimpleXMLElement Object ( [@attributes] => Array ( [Name] => IdType [Value] => ASIN ) ) [5] => SimpleXMLElement Object ( [@attributes] => Array ( [Name] => AWSAccessKeyId [Value] => ************************* ) ) [6] => SimpleXMLElement Object ( [@attributes] => Array ( [Name] => Timestamp [Value] => 2011-11-29T01:32:12.000Z ) ) [7] => SimpleXMLElement Object ( [@attributes] => Array ( [Name] => Signature [Value] => ************************* ) ) [8] => SimpleXMLElement Object ( [@attributes] => Array ( [Name] => AssociateTag [Value] => ************************* ) ) [9] => SimpleXMLElement Object ( [@attributes] => Array ( [Name] => Version [Value] => 2010-11-01 ) ) ) ) [RequestProcessingTime] => 0.0091540000000000 ) [Items] => SimpleXMLElement Object ( [Request] => SimpleXMLElement Object ( [IsValid] => True [ItemLookupRequest] => SimpleXMLElement Object ( [Condition] => All [IdType] => ASIN [ItemId] => B003TKSD8E [ResponseGroup] => Small [VariationPage] => All ) ) [Item] => SimpleXMLElement Object ( [ASIN] => B003TKSD8E [DetailPageURL] => http://www.amazon.de/Apple-iPhone-4-32GB-schwarz/dp/B003TKSD8E%3FSubscriptionId%3DAKIAI6NFQHK2DQIPRUEQ%26tag%3Dbanholzerme-20%26linkCode%3Dxm2%26camp%3D2025%26creative%3D165953%26creativeASIN%3DB003TKSD8E [ItemLinks] => SimpleXMLElement Object ( [ItemLink] => Array ( [0] => SimpleXMLElement Object ( [Description] => Add To Wishlist [URL] => http://www.amazon.de/gp/registry/wishlist/add-item.html%3Fasin.0%3DB003TKSD8E%26SubscriptionId%3DAKIAI6NFQHK2DQIPRUEQ%26tag%3Dbanholzerme-20%26linkCode%3Dxm2%26camp%3D2025%26creative%3D12738%26creativeASIN%3DB003TKSD8E ) [1] => SimpleXMLElement Object ( [Description] => Tell A Friend [URL] => 
http://www.amazon.de/gp/pdp/taf/B003TKSD8E%3FSubscriptionId%3DAKIAI6NFQHK2DQIPRUEQ%26tag%3Dbanholzerme-20%26linkCode%3Dxm2%26camp%3D2025%26creative%3D12738%26creativeASIN%3DB003TKSD8E ) [2] => SimpleXMLElement Object ( [Description] => All Customer Reviews [URL] => http://www.amazon.de/review/product/B003TKSD8E%3FSubscriptionId%3DAKIAI6NFQHK2DQIPRUEQ%26tag%3Dbanholzerme-20%26linkCode%3Dxm2%26camp%3D2025%26creative%3D12738%26creativeASIN%3DB003TKSD8E ) [3] => SimpleXMLElement Object ( [Description] => All Offers [URL] => http://www.amazon.de/gp/offer-listing/B003TKSD8E%3FSubscriptionId%3DAKIAI6NFQHK2DQIPRUEQ%26tag%3Dbanholzerme-20%26linkCode%3Dxm2%26camp%3D2025%26creative%3D12738%26creativeASIN%3DB003TKSD8E ) ) ) [ItemAttributes] => SimpleXMLElement Object ( [Manufacturer] => Apple Computer [ProductGroup] => CE [Title] => Apple iPhone 4 32GB schwarz ) ) ) ) Can someone please explain me why I don't get any price-information? Thank you very much

    Read the article

  • How do I use the SPMember.ID property to get a User or Group?

    - by Jason
    I have to write a utility to enumerate and manage the owners of groups within a SharePoint site. I know I can use the Groups property of the SPWeb object to retrieve a collection of groups, and I know I can use the Owner property of each group to get back the owner. My problem is that I do not know what to do next. The SPGroup.Owner property returns an SPMember object, and that object has a single ID property that returns the member's unique integer ID. What I cannot seem to find is how to use that integer value to determine whether the member is a User or a Group, and how to get back additional details (say, the name). Any ideas? Thanks.
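
    One possible approach, sketched below as an untested example against the SharePoint server object model (it assumes using Microsoft.SharePoint; and a hypothetical helper name), relies on the fact that SPMember is the common base class of SPUser and SPGroup, so the group's Owner reference can simply be type-tested and cast to reach the name and login details:

        // Hypothetical helper: describe the owner of an SPGroup.
        private static string DescribeOwner(SPWeb web, SPGroup group)
        {
            SPMember owner = group.Owner;

            // SPUser and SPGroup both derive from SPMember, so a type test
            // reveals which kind of principal the owner actually is.
            SPUser user = owner as SPUser;
            if (user != null)
            {
                return "User: " + user.Name + " (" + user.LoginName + ")";
            }

            SPGroup ownerGroup = owner as SPGroup;
            if (ownerGroup != null)
            {
                return "Group: " + ownerGroup.Name;
            }

            // If only the integer ID is available, it can also be looked up via
            // web.SiteUsers.GetByID(id) or web.SiteGroups.GetByID(id).
            return "Unknown member with ID " + owner.ID;
        }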

    Read the article

  • JMSContext, @JMSDestinationDefinition, DefaultJMSConnectionFactory with simplified JMS API: TOTD #213

    - by arungupta
    "What's New in JMS 2.0" Part 1 and Part 2 provide comprehensive introduction to new messaging features introduced in JMS 2.0. The biggest improvement in JMS 2.0 is introduction of the "new simplified API". This was explained in the Java EE 7 Launch Technical Keynote. You can watch a complete replay here. Sending and Receiving a JMS message using JMS 1.1 requires lot of boilerplate code, primarily because the API was designed 10+ years ago. Here is a code that shows how to send a message using JMS 1.1 API: @Statelesspublic class ClassicMessageSender { @Resource(lookup = "java:comp/DefaultJMSConnectionFactory") ConnectionFactory connectionFactory; @Resource(mappedName = "java:global/jms/myQueue") Queue demoQueue; public void sendMessage(String payload) { Connection connection = null; try { connection = connectionFactory.createConnection(); connection.start(); Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE); MessageProducer messageProducer = session.createProducer(demoQueue); TextMessage textMessage = session.createTextMessage(payload); messageProducer.send(textMessage); } catch (JMSException ex) { ex.printStackTrace(); } finally { if (connection != null) { try { connection.close(); } catch (JMSException ex) { ex.printStackTrace(); } } } }} There are several issues with this code: A JMS ConnectionFactory needs to be created in a application server-specific way before this application can run. Application-specific destination needs to be created in an application server-specific way before this application can run. Several intermediate objects need to be created to honor the JMS 1.1 API, e.g. ConnectionFactory -> Connection -> Session -> MessageProducer -> TextMessage. Everything is a checked exception and so try/catch block must be specified. Connection need to be explicitly started and closed, and that bloats even the finally block. The new JMS 2.0 simplified API code looks like: @Statelesspublic class SimplifiedMessageSender { @Inject JMSContext context; @Resource(mappedName="java:global/jms/myQueue") Queue myQueue; public void sendMessage(String message) { context.createProducer().send(myQueue, message); }} The code is significantly improved from the previous version in the following ways: The JMSContext interface combines in a single object the functionality of both the Connection and the Session in the earlier JMS APIs.  You can obtain a JMSContext object by simply injecting it with the @Inject annotation.  No need to explicitly specify a ConnectionFactory. A default ConnectionFactory under the JNDI name of java:comp/DefaultJMSConnectionFactory is used if no explicit ConnectionFactory is specified. The destination can be easily created using newly introduced @JMSDestinationDefinition as: @JMSDestinationDefinition(name = "java:global/jms/myQueue",        interfaceName = "javax.jms.Queue") It can be specified on any Java EE component and the destination is created during deployment. JMSContext, Session, Connection, JMSProducer and JMSConsumer objects are now AutoCloseable. This means that these resources are automatically closed when they go out of scope. This also obviates the need to explicitly start the connection JMSException is now a runtime exception. Method chaining on JMSProducers allows to use builder patterns. No need to create separate Message object, you can specify the message body as an argument to the send() method instead. Want to try this code ? Download source code! Download Java EE 7 SDK and install. 
Start GlassFish: bin/asadmin start-domain
Build the WAR (in the unzipped source code directory): mvn package
Deploy the WAR: bin/asadmin deploy <source-code>/jms/target/jms-1.0-SNAPSHOT.war
Then access the application at http://localhost:8080/jms-1.0-SNAPSHOT/index.jsp to send and receive a message using the classic and simplified APIs. A replay of the JMS 2.0 session from the Java EE 7 Launch Webinar provides complete details on what's new in this specification. Enjoy!

    Read the article

  • Android: restful API service

    - by Martyn
    Hey, I'm looking to make a service which I can use to make calls to a web based rest api. I've spent a couple of days looking through stackoverflow.com, reading books and looking at articles whilst playing about with some code and I can't get anything which I'm happy with. Basically I want to start a service on app init then I want to be able to ask that service to request a url and return the results. In the meantime I want to be able to display a progress window or something similar. I've created a service currently which uses IDL, I've read somewhere that you only really need this for cross app communication, so think these needs stripping out but unsure how to do callbacks without it. Also when I hit the post(Config.getURL("login"), values) the app seems to pause for a while (seems weird - thought the idea behind a service was that it runs on a different thread!) Currently I have a service with post and get http methods inside, a couple of AIDL files (for two way communication), a ServiceManager which deals with starting, stopping, binding etc to the service and I'm dynamically creating a Handler with specific code for the callbacks as needed. I don't want anyone to give me a complete code base to work on, but some pointers would be greatly appreciated; even if it's to say I'm doing it completely wrong. I'm pretty new to Android and Java dev so if there are any blindingly obvious mistakes here - please don't think I'm a rubbish developer, I'm just wet behind the ears and would appreciate being told exactly where I'm going wrong. Anyway, code in (mostly) full (really didn't want to put this much code here, but I don't know where I'm going wrong - apologies in advance): public class RestfulAPIService extends Service { final RemoteCallbackList<IRemoteServiceCallback> mCallbacks = new RemoteCallbackList<IRemoteServiceCallback>(); public void onStart(Intent intent, int startId) { super.onStart(intent, startId); } public IBinder onBind(Intent intent) { return binder; } public void onCreate() { super.onCreate(); } public void onDestroy() { super.onDestroy(); mCallbacks.kill(); } private final IRestfulService.Stub binder = new IRestfulService.Stub() { public void doLogin(String username, String password) { Message msg = new Message(); Bundle data = new Bundle(); HashMap<String, String> values = new HashMap<String, String>(); values.put("username", username); values.put("password", password); String result = post(Config.getURL("login"), values); data.putString("response", result); msg.setData(data); msg.what = Config.ACTION_LOGIN; mHandler.sendMessage(msg); } public void registerCallback(IRemoteServiceCallback cb) { if (cb != null) mCallbacks.register(cb); } }; private final Handler mHandler = new Handler() { public void handleMessage(Message msg) { // Broadcast to all clients the new value. 
final int N = mCallbacks.beginBroadcast(); for (int i = 0; i < N; i++) { try { switch (msg.what) { case Config.ACTION_LOGIN: mCallbacks.getBroadcastItem(i).userLogIn( msg.getData().getString("response")); break; default: super.handleMessage(msg); return; } } catch (RemoteException e) { } } mCallbacks.finishBroadcast(); } public String post(String url, HashMap<String, String> namePairs) {...} public String get(String url) {...} }; A couple of AIDL files: package com.something.android oneway interface IRemoteServiceCallback { void userLogIn(String result); } and package com.something.android import com.something.android.IRemoteServiceCallback; interface IRestfulService { void doLogin(in String username, in String password); void registerCallback(IRemoteServiceCallback cb); } and the service manager: public class ServiceManager { final RemoteCallbackList<IRemoteServiceCallback> mCallbacks = new RemoteCallbackList<IRemoteServiceCallback>(); public IRestfulService restfulService; private RestfulServiceConnection conn; private boolean started = false; private Context context; public ServiceManager(Context context) { this.context = context; } public void startService() { if (started) { Toast.makeText(context, "Service already started", Toast.LENGTH_SHORT).show(); } else { Intent i = new Intent(); i.setClassName("com.something.android", "com.something.android.RestfulAPIService"); context.startService(i); started = true; } } public void stopService() { if (!started) { Toast.makeText(context, "Service not yet started", Toast.LENGTH_SHORT).show(); } else { Intent i = new Intent(); i.setClassName("com.something.android", "com.something.android.RestfulAPIService"); context.stopService(i); started = false; } } public void bindService() { if (conn == null) { conn = new RestfulServiceConnection(); Intent i = new Intent(); i.setClassName("com.something.android", "com.something.android.RestfulAPIService"); context.bindService(i, conn, Context.BIND_AUTO_CREATE); } else { Toast.makeText(context, "Cannot bind - service already bound", Toast.LENGTH_SHORT).show(); } } protected void destroy() { releaseService(); } private void releaseService() { if (conn != null) { context.unbindService(conn); conn = null; Log.d(LOG_TAG, "unbindService()"); } else { Toast.makeText(context, "Cannot unbind - service not bound", Toast.LENGTH_SHORT).show(); } } class RestfulServiceConnection implements ServiceConnection { public void onServiceConnected(ComponentName className, IBinder boundService) { restfulService = IRestfulService.Stub.asInterface((IBinder) boundService); try { restfulService.registerCallback(mCallback); } catch (RemoteException e) {} } public void onServiceDisconnected(ComponentName className) { restfulService = null; } }; private IRemoteServiceCallback mCallback = new IRemoteServiceCallback.Stub() { public void userLogIn(String result) throws RemoteException { mHandler.sendMessage(mHandler.obtainMessage(Config.ACTION_LOGIN, result)); } }; private Handler mHandler; public void setHandler(Handler handler) { mHandler = handler; } } Service init and bind: // this I'm calling on app onCreate servicemanager = new ServiceManager(this); servicemanager.startService(); servicemanager.bindService(); application = (ApplicationState)this.getApplication(); application.setServiceManager(servicemanager); service function call: // this lot i'm calling as required - in this example for login progressDialog = new ProgressDialog(Login.this); progressDialog.setMessage("Logging you in..."); progressDialog.show(); application = 
(ApplicationState) getApplication(); servicemanager = application.getServiceManager(); servicemanager.setHandler(mHandler); try { servicemanager.restfulService.doLogin(args[0], args[1]); } catch (RemoteException e) { e.printStackTrace(); } ...later in the same file... Handler mHandler = new Handler() { public void handleMessage(Message msg) { switch (msg.what) { case Config.ACTION_LOGIN: if (progressDialog.isShowing()) { progressDialog.dismiss(); } try { ...process login results... } } catch (JSONException e) { Log.e("JSON", "There was an error parsing the JSON", e); } break; default: super.handleMessage(msg); } } }; Any and all help is greatly appreciated and I'll even buy you a coffee or a beer if you fancy :D Martyn

    Read the article

  • Google Maps API v3 - Different markers/labels on different zoom levels

    - by krikara
    I was wondering if it is possible that Google has a feature to view different markers on different zoom levels. For example, on zoom level 1, I want one marker over China with the label saying "5". And as the user zooms in, lets say on zoom level 4, I want the previous marker and label to disappear. And I want to have 5 new markers/labels, each on a different city in China all saying "1". Thus China will say a number and all the cities in China will say numbers adding up to China's number. The key concept I am trying to figure out here is how to hide markers and labels based on zoom levels. A constraint for me is that I am living in China currently where google is censored, so a lot of online documents are censored for me, including many of google's documentations. Here is my code thus far <!DOCTYPE html> <html> <head> <meta name="viewport" content="initial-scale=1.0, user-scalable=no" /> <title>TM China</title> <style type="text/css"> html, body, #map_canvas { margin: 0; padding: 0; height: 100% } .labels { color: red; background-color: white; font-family: "Lucida Grande", "Arial", sans-serif; font-size: 10px; font-weight: bold; text-align: center; width: 60px; border: 2px solid black; white-space: nowrap; } </style> <script type="text/javascript" src="http://maps.googleapis.com/maps/api/js?key=AIzaSyDV0lcdK7C2GHbQAmdkBID70Uppuf-D030&sensor=true"> </script> <script type="text/javascript"> eval(function(p,a,c,k,e,r){e=function(c){return(c<a?'':e(parseInt(c/a)))+((c=c%a)>35?String.fromCharCode(c+29):c.toString(36))};if(!''.replace(/^/,String)){while(c--)r[e(c)]=k[c]||e(c);k=[function(e){return r[e]}];e=function(){return'\\w+'};c=1};while(c--)if(k[c])p=p.replace(new RegExp('\\b'+e(c)+'\\b','g'),k[c]);return p}('7 m(a){2.3=a;2.8=V.1E("1u");2.8.4.C="I: 1m; J: 1g;";2.k=V.1E("1u");2.k.4.C=2.8.4.C}m.l=E 6.5.22();m.l.1Y=7(){n c=2;n h=t;n f=t;n j;n b;n d,K;n i;n g=7(e){p(e.1v){e.1v()}e.2b=u;p(e.1t){e.1t()}};2.1s().24.G(2.8);2.1s().20.G(2.k);2.11=[6.5.9.w(V,"1o",7(a){p(f){a.s=j;i=u;6.5.9.r(c.3,"1n",a)}h=t;6.5.9.r(c.3,"1o",a)}),6.5.9.o(c.3.1P(),"1N",7(a){p(h&&c.3.1M()){a.s=E 6.5.1J(a.s.U()-d,a.s.T()-K);j=a.s;p(f){6.5.9.r(c.3,"1i",a)}F{d=a.s.U()-c.3.Z().U();K=a.s.T()-c.3.Z().T();6.5.9.r(c.3,"1e",a)}}}),6.5.9.w(2.k,"1d",7(e){c.k.4.1c="2i";6.5.9.r(c.3,"1d",e)}),6.5.9.w(2.k,"1D",7(e){c.k.4.1c=c.3.2g();6.5.9.r(c.3,"1D",e)}),6.5.9.w(2.k,"1C",7(e){p(i){i=t}F{g(e);6.5.9.r(c.3,"1C",e)}}),6.5.9.w(2.k,"1A",7(e){g(e);6.5.9.r(c.3,"1A",e)}),6.5.9.w(2.k,"1z",7(e){h=u;f=t;d=0;K=0;g(e);6.5.9.r(c.3,"1z",e)}),6.5.9.o(2.3,"1e",7(a){f=u;b=c.3.1b()}),6.5.9.o(2.3,"1i",7(a){c.3.O(a.s);c.3.D(2a)}),6.5.9.o(2.3,"1n",7(a){f=t;c.3.D(b)}),6.5.9.o(2.3,"29",7(){c.O()}),6.5.9.o(2.3,"28",7(){c.D()}),6.5.9.o(2.3,"27",7(){c.N()}),6.5.9.o(2.3,"26",7(){c.N()}),6.5.9.o(2.3,"25",7(){c.16()}),6.5.9.o(2.3,"23",7(){c.15()}),6.5.9.o(2.3,"21",7(){c.13()}),6.5.9.o(2.3,"1Z",7(){c.L()}),6.5.9.o(2.3,"1X",7(){c.L()})]};m.l.1W=7(){n i;2.8.1r.1q(2.8);2.k.1r.1q(2.k);1p(i=0;i<2.11.1V;i++){6.5.9.1U(2.11[i])}};m.l.1T=7(){2.15();2.16();2.L()};m.l.15=7(){n a=2.3.z("Y");p(H a.1S==="P"){2.8.W=a;2.k.W=2.8.W}F{2.8.G(a);a=a.1R(u);2.k.G(a)}};m.l.16=7(){2.k.1Q=2.3.1O()||""};m.l.L=7(){n i,q;2.8.S=2.3.z("R");2.k.S=2.8.S;2.8.4.C="";2.k.4.C="";q=2.3.z("q");1p(i 1L q){p(q.1K(i)){2.8.4[i]=q[i];2.k.4[i]=q[i]}}2.1l()};m.l.1l=7(){2.8.4.I="1m";2.8.4.J="1g";p(H 2.8.4.B!=="P"){2.8.4.1k="1j(B="+(2.8.4.B*1I)+")"}2.k.4.I=2.8.4.I;2.k.4.J=2.8.4.J;2.k.4.B=0.1H;2.k.4.1k="1j(B=1)";2.13();2.O();2.N()};m.l.13=7(){n 
a=2.3.z("X");2.8.4.1h=-a.x+"v";2.8.4.1f=-a.y+"v";2.k.4.1h=-a.x+"v";2.k.4.1f=-a.y+"v"};m.l.O=7(){n a=2.1G().1F(2.3.Z());2.8.4.12=a.x+"v";2.8.4.M=a.y+"v";2.k.4.12=2.8.4.12;2.k.4.M=2.8.4.M;2.D()};m.l.D=7(){n a=(2.3.z("14")?-1:+1);p(H 2.3.1b()==="P"){2.8.4.A=2h(2.8.4.M,10)+a;2.k.4.A=2.8.4.A}F{2.8.4.A=2.3.1b()+a;2.k.4.A=2.8.4.A}};m.l.N=7(){p(2.3.z("1a")){2.8.4.Q=2.3.2f()?"2e":"1B"}F{2.8.4.Q="1B"}2.k.4.Q=2.8.4.Q};7 19(a){a=a||{};a.Y=a.Y||"";a.X=a.X||E 6.5.2d(0,0);a.R=a.R||"2c";a.q=a.q||{};a.14=a.14||t;p(H a.1a==="P"){a.1a=u}2.1y=E m(2);6.5.18.1x(2,1w)}19.l=E 6.5.18();19.l.17=7(a){6.5.18.l.17.1x(2,1w);2.1y.17(a)};',62,143,'||this|marker_|style|maps|google|function|labelDiv_|event|||||||||||eventDiv_|prototype|MarkerLabel_|var|addListener|if|labelStyle|trigger|latLng|false|true|px|addDomListener|||get|zIndex|opacity|cssText|setZIndex|new|else|appendChild|typeof|position|overflow|cLngOffset|setStyles|top|setVisible|setPosition|undefined|display|labelClass|className|lng|lat|document|innerHTML|labelAnchor|labelContent|getPosition||listeners_|left|setAnchor|labelInBackground|setContent|setTitle|setMap|Marker|MarkerWithLabel|labelVisible|getZIndex|cursor|mouseover|dragstart|marginTop|hidden|marginLeft|drag|alpha|filter|setMandatoryStyles|absolute|dragend|mouseup|for|removeChild|parentNode|getPanes|stopPropagation|div|preventDefault|arguments|apply|label|mousedown|dblclick|none|click|mouseout|createElement|fromLatLngToDivPixel|getProjection|01|100|LatLng|hasOwnProperty|in|getDraggable|mousemove|getTitle|getMap|title|cloneNode|nodeType|draw|removeListener|length|onRemove|labelstyle_changed|onAdd|labelclass_changed|overlayMouseTarget|labelanchor_changed|OverlayView|labelcontent_changed|overlayImage|title_changed|labelvisible_changed|visible_changed|zindex_changed|position_changed|1000000|cancelBubble|markerLabels|Point|block|getVisible|getCursor|parseInt|pointer'.split('|'),0,{})) var map; var mapOptions = { center: new google.maps.LatLng(35, 105), zoom: 3, mapTypeId: google.maps.MapTypeId.ROADMAP }; var locations = [ ['Hong Kong', 22.39, 114.10, 1885], ['Shanghai', 31.232, 121.47, 5885], ['Beijing', 39.88, 116.40, 6426], ['Guangzhou', 23.129, 113.264, 4067], ['Shenzhen', 22.54, 114.05, 3089], ['Hangzhou', 30.27, 120.15, 954] ]; var infowindow = new google.maps.InfoWindow(); var i; /* for (i = 0; i < locations.length; i++) { marker = new google.maps.Marker({ position: new google.maps.LatLng(locations[i][1], locations[i][2]), map: map }); google.maps.event.addListener(marker, 'click', (function(marker, i) { return function() { infowindow.setContent(locations[i][0]); infowindow.open(map, marker); } })(marker, i)); } */ function myMarker(options) { if(!options.labelAnchor) { options.labelAnchor = new google.maps.Point(30, 50); } if(!options.labelClass) { options.labelClass = "labels"; } options.map = map; return new MarkerWithLabel(options); } function initialize() { map = new google.maps.Map(document.getElementById("map_canvas"), mapOptions); for (i = 0; i < locations.length; i++) { var marker = new MarkerWithLabel({ position: new google.maps.LatLng(locations[i][1], locations[i][2]), draggable: false, map: map, labelContent: locations[i][3], labelAnchor: new google.maps.Point(30, 0), labelClass: "labels", // the CSS class for the label labelStyle: {opacity: 0.75} }); } /* var marker2 = new myMarker({ position: new google.maps.LatLng(20,20), draggable: true, labelContent: "second" }); */ } google.maps.event.addDomListener(window, 'load', initialize); </script> </head> <body onload="initialize()"> <div 
id="map_canvas" style="width:85%; height:85%"></div> <script type="text/javascript"> </script> </body> </html> EDIT I have been trying to experiment with the MarkerManager, but I can't get the markers to create successfully on different zoom levels. First, I changed my default zoom level to 1, and then I changed my code to what is shown below. function initialize() { map = new google.maps.Map(document.getElementById("map_canvas"), mapOptions); /* for (i = 0; i < locations.length; i++) { var marker = new MarkerWithLabel({ position: new google.maps.LatLng(locations[i][1], locations[i][2]), draggable: false, map: map, labelContent: locations[i][3], labelAnchor: new google.maps.Point(30, 0), labelClass: "labels", // the CSS class for the label labelStyle: {opacity: 0.75} }); } */ var listener = google.maps.event.addListener(map, 'bounds_changed', function(){ setupMarkers(); google.maps.event.removeListener(listener); }); } function createCityMarkers() { for (i = 0; i < locations.length; i++) { var marker = new MarkerWithLabel({ position: new google.maps.LatLng(locations[i][1], locations[i][2]), draggable: false, map: map, labelContent: locations[i][3], labelAnchor: new google.maps.Point(30, 0), labelClass: "labels", // the CSS class for the label labelStyle: {opacity: 0.75} }); } } function setupMarkers() { mgr = new MarkerManager(map); google.maps.event.addListener(mgr, 'loaded', function(){ mgr.addMarkers(createCityMarkers(), 4); mgr.refresh(); }); } I have also tried applying the source code of this link as well, but nothing is working out. And when I copy the source code directly to my computer and replace all the icons with markers, the markers still don't appear. I can't seem to figure how to make markers appear using the marker Manager. http://google-maps-utility-library-v3.googlecode.com/svn/tags/markermanager/1.0/examples/weather_map.html

    Read the article

  • Issue with Sharepoint 2010 application page

    - by Matt Moriarty
    I am relatively new to Sharepoint and am using version 2010. I am having a problem with the following code in an application page I am trying to build: using System; using Microsoft.SharePoint; using Microsoft.SharePoint.WebControls; using System.Text; using Microsoft.SharePoint.Administration; using Microsoft.Office.Server; using Microsoft.Office.Server.UserProfiles; using Microsoft.SharePoint.Utilities; namespace SharePointProject5.Layouts.SharePointProject5 { public partial class ApplicationPage1 : LayoutsPageBase { protected void Page_Load(object sender, EventArgs e) { SPContext context = SPContext.Current; StringBuilder output = new StringBuilder(); using(SPSite site = context.Site) using (SPWeb web = site.AllWebs["BDC_SQL"]) { UserProfileManager upmanager = new UserProfileManager(ServerContext.GetContext(site)); string ListMgr = ""; string ADMgr = ""; bool allowUpdates = web.AllowUnsafeUpdates; web.AllowUnsafeUpdates = true; web.Update(); SPListCollection listcollection = web.Lists; SPList list = listcollection["BDC_SQL"]; foreach (SPListItem item in list.Items) { output.AppendFormat("<br>From List - Name & manager: {0} , {1}", item["ADName"], item["Manager_ADName"]); UserProfile uProfile = upmanager.GetUserProfile(item["ADName"].ToString()); output.AppendFormat("<br>From Prof - Name & manager: {0} , {1}", uProfile[PropertyConstants.DistinguishedName], uProfile[PropertyConstants.Manager]); ListMgr = item["Manager_ADName"].ToString(); ADMgr = Convert.ToString(uProfile[PropertyConstants.Manager]); if (ListMgr != ADMgr) { output.AppendFormat("<br>This record requires updating from {0} to {1}", uProfile[PropertyConstants.Manager], item["Manager_ADName"]); uProfile[PropertyConstants.Manager].Value = ListMgr; uProfile.Commit(); output.AppendFormat("<br>This record has had its manager updated"); } else { output.AppendFormat("<br>This record does not need to be updated"); } } web.AllowUnsafeUpdates = allowUpdates; web.Update(); } Label1.Text = output.ToString(); } } } Everything worked fine up until I added in the 'uProfile.Commit();' line. Now I am getting the following error message: Microsoft.SharePoint.SPException was unhandled by user code Message=Updates are currently disallowed on GET requests. To allow updates on a GET, set the 'AllowUnsafeUpdates' property on SPWeb. 
Source=Microsoft.SharePoint ErrorCode=-2130243945 NativeErrorMessage=FAILED hr detected (hr = 0x80004005) NativeStackTrace="" StackTrace: at Microsoft.SharePoint.SPGlobal.HandleComException(COMException comEx) at Microsoft.SharePoint.Library.SPRequest.ValidateFormDigest(String bstrUrl, String bstrListName) at Microsoft.SharePoint.SPWeb.ValidateFormDigest() at Microsoft.Office.Server.UserProfiles.UserProfile.UpdateBlobProfile() at Microsoft.Office.Server.UserProfiles.UserProfile.Commit() at SharePointProject5.Layouts.SharePointProject5.ApplicationPage1.Page_Load(Object sender, EventArgs e) at System.Web.Util.CalliHelper.EventArgFunctionCaller(IntPtr fp, Object o, Object t, EventArgs e) at System.Web.Util.CalliEventHandlerDelegateProxy.Callback(Object sender, EventArgs e) at System.Web.UI.Control.OnLoad(EventArgs e) at Microsoft.SharePoint.WebControls.UnsecuredLayoutsPageBase.OnLoad(EventArgs e) at Microsoft.SharePoint.WebControls.LayoutsPageBase.OnLoad(EventArgs e) at System.Web.UI.Control.LoadRecursive() at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) InnerException: System.Runtime.InteropServices.COMException Message=<nativehr>0x80004005</nativehr><nativestack></nativestack>Updates are currently disallowed on GET requests. To allow updates on a GET, set the 'AllowUnsafeUpdates' property on SPWeb. Source="" ErrorCode=-2130243945 StackTrace: at Microsoft.SharePoint.Library.SPRequestInternalClass.ValidateFormDigest(String bstrUrl, String bstrListName) at Microsoft.SharePoint.Library.SPRequest.ValidateFormDigest(String bstrUrl, String bstrListName) InnerException: I have tried to rectify this by adding in code to allow the unsafe updates but I still get this error. Does anyone have any guidance for me? It would be much appreciated. Thanks in advance, Matt.
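
    One common explanation for this, offered here as a suggestion rather than a confirmed diagnosis, is that UserProfile.Commit() validates the form digest against the web of the current request rather than the "BDC_SQL" web opened in the using block, so AllowUnsafeUpdates also has to be set on SPContext.Current.Web (and the context site should not be wrapped in a using statement, because disposing SPContext.Current.Site is unsafe). A minimal sketch of that shape:

        protected void Page_Load(object sender, EventArgs e)
        {
            SPWeb contextWeb = SPContext.Current.Web;      // do not dispose context-owned objects
            bool previous = contextWeb.AllowUnsafeUpdates;
            try
            {
                // Let the profile commit pass form-digest validation on this GET request.
                contextWeb.AllowUnsafeUpdates = true;

                // ... open the "BDC_SQL" web, compare the list values, call uProfile.Commit() ...
            }
            finally
            {
                contextWeb.AllowUnsafeUpdates = previous;
            }
        }

    Alternatively, moving the update into a POST handler (for example a button click on a page that contains a FormDigest control) avoids relaxing the check at all.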

    Read the article

  • How do I get the user domain\username and display name from the SharePoint People Picker internal ID?

    - by Benjamin Wong
    I have been trying to use SharePoint Designer to retrieve the user's full name, and even the domain\username, from the people picker field, but the problem is I tend to get numbers back. It looks like the internal ID. In my case I am getting 14 instead of domain\username from the list form's field. Any idea where I can get the domain\username, or better yet the full name? Thanks! I am using SharePoint Designer and Windows SharePoint Services 3/SharePoint 2007. Update: I have managed to get the domain\username now. I just had to change the people picker's show field to ID. However, now I am getting a weird prefix on my domain\username: I am getting -1;#domain\username instead of just domain\username. Does anybody know how I can overcome this? Thanks!
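
    The "id;#value" prefix is SharePoint's standard lookup-field encoding. If the server object model is available (rather than Designer alone), a hedged sketch of parsing it with SPFieldUserValue is below; the site URL, list name, and field name are placeholders only:

        using (SPSite site = new SPSite("http://server/site"))          // placeholder URL
        using (SPWeb web = site.OpenWeb())
        {
            SPListItem item = web.Lists["Tasks"].Items[0];              // placeholder list

            // Raw value looks like "14;#DOMAIN\\username" (or "-1;#DOMAIN\\username").
            string raw = Convert.ToString(item["AssignedTo"]);          // placeholder field name

            // SPFieldUserValue splits the "id;#value" pair and, where possible, resolves the SPUser.
            SPFieldUserValue userValue = new SPFieldUserValue(web, raw);
            string login = userValue.User != null ? userValue.User.LoginName : userValue.LookupValue;
            string displayName = userValue.User != null ? userValue.User.Name : userValue.LookupValue;
        }

    In SharePoint Designer's XSLT data views, the usual equivalent is substring-after(., ';#') to strip everything up to and including the ";#".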

    Read the article

  • ASP.NET Frameworks and Raw Throughput Performance

    - by Rick Strahl
    A few days ago I had a curious thought: With all these different technologies that the ASP.NET stack has to offer, what's the most efficient technology overall to return data for a server request? When I started this it was mere curiosity rather than a real practical need or result. Different tools are used for different problems and so performance differences are to be expected. But still I was curious to see how the various technologies performed relative to each just for raw throughput of the request getting to the endpoint and back out to the client with as little processing in the actual endpoint logic as possible (aka Hello World!). I want to clarify that this is merely an informal test for my own curiosity and I'm sharing the results and process here because I thought it was interesting. It's been a long while since I've done any sort of perf testing on ASP.NET, mainly because I've not had extremely heavy load requirements and because overall ASP.NET performs very well even for fairly high loads so that often it's not that critical to test load performance. This post is not meant to make a point  or even come to a conclusion which tech is better, but just to act as a reference to help understand some of the differences in perf and give a starting point to play around with this yourself. I've included the code for this simple project, so you can play with it and maybe add a few additional tests for different things if you like. Source Code on GitHub I looked at this data for these technologies: ASP.NET Web API ASP.NET MVC WebForms ASP.NET WebPages ASMX AJAX Services  (couldn't get AJAX/JSON to run on IIS8 ) WCF Rest Raw ASP.NET HttpHandlers It's quite a mixed bag, of course and the technologies target different types of development. What started out as mere curiosity turned into a bit of a head scratcher as the results were sometimes surprising. What I describe here is more to satisfy my curiosity more than anything and I thought it interesting enough to discuss on the blog :-) First test: Raw Throughput The first thing I did is test raw throughput for the various technologies. This is the least practical test of course since you're unlikely to ever create the equivalent of a 'Hello World' request in a real life application. The idea here is to measure how much time a 'NOP' request takes to return data to the client. So for this request I create the simplest Hello World request that I could come up for each tech. Http Handler The first is the lowest level approach which is an HTTP handler. public class Handler : IHttpHandler { public void ProcessRequest(HttpContext context) { context.Response.ContentType = "text/plain"; context.Response.Write("Hello World. Time is: " + DateTime.Now.ToString()); } public bool IsReusable { get { return true; } } } WebForms Next I added a couple of ASPX pages - one using CodeBehind and one using only a markup page. The CodeBehind page simple does this in CodeBehind without any markup in the ASPX page: public partial class HelloWorld_CodeBehind : System.Web.UI.Page { protected void Page_Load(object sender, EventArgs e) { Response.Write("Hello World. Time is: " + DateTime.Now.ToString() ); Response.End(); } } while the Markup page only contains some static output via an expression:<%@ Page Language="C#" AutoEventWireup="false" CodeBehind="HelloWorld_Markup.aspx.cs" Inherits="AspNetFrameworksPerformance.HelloWorld_Markup" %> Hello World. Time is <%= DateTime.Now %> ASP.NET WebPages WebPages is the freestanding Razor implementation of ASP.NET. 
Here's the simple HelloWorld.cshtml page:Hello World @DateTime.Now WCF REST WCF REST was the token REST implementation for ASP.NET before WebAPI and the inbetween step from ASP.NET AJAX. I'd like to forget that this technology was ever considered for production use, but I'll include it here. Here's an OperationContract class: [ServiceContract(Namespace = "")] [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)] public class WcfService { [OperationContract] [WebGet] public Stream HelloWorld() { var data = Encoding.Unicode.GetBytes("Hello World" + DateTime.Now.ToString()); var ms = new MemoryStream(data); // Add your operation implementation here return ms; } } WCF REST can return arbitrary results by returning a Stream object and a content type. The code above turns the string result into a stream and returns that back to the client. ASP.NET AJAX (ASMX Services) I also wanted to test ASP.NET AJAX services because prior to WebAPI this is probably still the most widely used AJAX technology for the ASP.NET stack today. Unfortunately I was completely unable to get this running on my Windows 8 machine. Visual Studio 2012  removed adding of ASP.NET AJAX services, and when I tried to manually add the service and configure the script handler references it simply did not work - I always got a SOAP response for GET and POST operations. No matter what I tried I always ended up getting XML results even when explicitly adding the ScriptHandler. So, I didn't test this (but the code is there - you might be able to test this on a Windows 7 box). ASP.NET MVC Next up is probably the most popular ASP.NET technology at the moment: MVC. Here's the small controller: public class MvcPerformanceController : Controller { public ActionResult Index() { return View(); } public ActionResult HelloWorldCode() { return new ContentResult() { Content = "Hello World. Time is: " + DateTime.Now.ToString() }; } } ASP.NET WebAPI Next up is WebAPI which looks kind of similar to MVC. Except here I have to use a StringContent result to return the response: public class WebApiPerformanceController : ApiController { [HttpGet] public HttpResponseMessage HelloWorldCode() { return new HttpResponseMessage() { Content = new StringContent("Hello World. Time is: " + DateTime.Now.ToString(), Encoding.UTF8, "text/plain") }; } } Testing Take a minute to think about each of the technologies… and take a guess which you think is most efficient in raw throughput. The fastest should be pretty obvious, but the others - maybe not so much. The testing I did is pretty informal since it was mainly to satisfy my curiosity - here's how I did this: I used Apache Bench (ab.exe) from a full Apache HTTP installation to run and log the test results of hitting the server. ab.exe is a small executable that lets you hit a URL repeatedly and provides counter information about the number of requests, requests per second etc. ab.exe and the batch file are located in the \LoadTests folder of the project. An ab.exe command line  looks like this: ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorld which hits the specified URL 100,000 times with a load factor of 20 concurrent requests. This results in output like this:   It's a great way to get a quick and dirty performance summary. Run it a few times to make sure there's not a large amount of varience. You might also want to do an IISRESET to clear the Web Server. 
Just make sure you do a short test run to warm up the server first - otherwise your first run is likely to be skewed downwards. ab.exe also allows you to specify headers and provide POST data and many other things if you want to get a little more fancy. Here all tests are GET requests to keep it simple. I ran each test: 100,000 iterations Load factor of 20 concurrent connections IISReset before starting A short warm up run for API and MVC to make sure startup cost is mitigated Here is the batch file I used for the test: IISRESET REM make sure you add REM C:\Program Files (x86)\Apache Software Foundation\Apache2.2\bin REM to your path so ab.exe can be found REM Warm up ab.exe -n100 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldJsonab.exe -n100 -c20 http://localhost/aspnetperf/api/HelloWorldJson ab.exe -n100 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorld ab.exe -n100000 -c20 http://localhost/aspnetperf/handler.ashx > handler.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/HelloWorld_CodeBehind.aspx > AspxCodeBehind.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/HelloWorld_Markup.aspx > AspxMarkup.txt ab.exe -n100000 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorld > Wcf.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldCode > Mvc.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorld > WebApi.txt I ran each of these tests 3 times and took the average score for Requests/second, with the machine otherwise idle. I did see a bit of variance when running many tests but the values used here are the medians. Part of this has to do with the fact I ran the tests on my local machine - result would probably more consistent running the load test on a separate machine hitting across the network. I ran these tests locally on my laptop which is a Dell XPS with quad core Sandibridge I7-2720QM @ 2.20ghz and a fast SSD drive on Windows 8. CPU load during tests ran to about 70% max across all 4 cores (IOW, it wasn't overloading the machine). Ideally you can try running these tests on a separate machine hitting the local machine. If I remember correctly IIS 7 and 8 on client OSs don't throttle so the performance here should be Results Ok, let's cut straight to the chase. Below are the results from the tests… It's not surprising that the handler was fastest. But it was a bit surprising to me that the next fastest was WebForms and especially Web Forms with markup over a CodeBehind page. WebPages also fared fairly well. MVC and WebAPI are a little slower and the slowest by far is WCF REST (which again I find surprising). As mentioned at the start the raw throughput tests are not overly practical as they don't test scripting performance for the HTML generation engines or serialization performances of the data engines. All it really does is give you an idea of the raw throughput for the technology from time of request to reaching the endpoint and returning minimal text data back to the client which indicates full round trip performance. But it's still interesting to see that Web Forms performs better in throughput than either MVC, WebAPI or WebPages. It'd be interesting to try this with a few pages that actually have some parsing logic on it, but that's beyond the scope of this throughput test. But what's also amazing about this test is the sheer amount of traffic that a laptop computer is handling. 
Even the slowest tech managed 5700 requests a second, which is one hell of a lot of requests if you extrapolate that out over a 24 hour period. Remember these are not static pages, but dynamic requests that are being served. Another test - JSON Data Service Results The second test I used a JSON result from several of the technologies. I didn't bother running WebForms and WebPages through this test since that doesn't make a ton of sense to return data from the them (OTOH, returning text from the APIs didn't make a ton of sense either :-) In these tests I have a small Person class that gets serialized and then returned to the client. The Person class looks like this: public class Person { public Person() { Id = 10; Name = "Rick"; Entered = DateTime.Now; } public int Id { get; set; } public string Name { get; set; } public DateTime Entered { get; set; } } Here are the updated handler classes that use Person: Handler public class Handler : IHttpHandler { public void ProcessRequest(HttpContext context) { var action = context.Request.QueryString["action"]; if (action == "json") JsonRequest(context); else TextRequest(context); } public void TextRequest(HttpContext context) { context.Response.ContentType = "text/plain"; context.Response.Write("Hello World. Time is: " + DateTime.Now.ToString()); } public void JsonRequest(HttpContext context) { var json = JsonConvert.SerializeObject(new Person(), Formatting.None); context.Response.ContentType = "application/json"; context.Response.Write(json); } public bool IsReusable { get { return true; } } } This code adds a little logic to check for a action query string and route the request to an optional JSON result method. To generate JSON, I'm using the same JSON.NET serializer (JsonConvert.SerializeObject) used in Web API to create the JSON response. WCF REST   [ServiceContract(Namespace = "")] [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)] public class WcfService { [OperationContract] [WebGet] public Stream HelloWorld() { var data = Encoding.Unicode.GetBytes("Hello World " + DateTime.Now.ToString()); var ms = new MemoryStream(data); // Add your operation implementation here return ms; } [OperationContract] [WebGet(ResponseFormat=WebMessageFormat.Json,BodyStyle=WebMessageBodyStyle.WrappedRequest)] public Person HelloWorldJson() { // Add your operation implementation here return new Person(); } } For WCF REST all I have to do is add a method with the Person result type.   ASP.NET MVC public class MvcPerformanceController : Controller { // // GET: /MvcPerformance/ public ActionResult Index() { return View(); } public ActionResult HelloWorldCode() { return new ContentResult() { Content = "Hello World. Time is: " + DateTime.Now.ToString() }; } public JsonResult HelloWorldJson() { return Json(new Person(), JsonRequestBehavior.AllowGet); } } For MVC all I have to do for a JSON response is return a JSON result. ASP.NET internally uses JavaScriptSerializer. ASP.NET WebAPI public class WebApiPerformanceController : ApiController { [HttpGet] public HttpResponseMessage HelloWorldCode() { return new HttpResponseMessage() { Content = new StringContent("Hello World. 
Time is: " + DateTime.Now.ToString(), Encoding.UTF8, "text/plain") }; } [HttpGet] public Person HelloWorldJson() { return new Person(); } [HttpGet] public HttpResponseMessage HelloWorldJson2() { var response = new HttpResponseMessage(HttpStatusCode.OK); response.Content = new ObjectContent<Person>(new Person(), GlobalConfiguration.Configuration.Formatters.JsonFormatter); return response; } } Testing and Results To run these data requests I used the following ab.exe commands:REM JSON RESPONSES ab.exe -n100000 -c20 http://localhost/aspnetperf/Handler.ashx?action=json > HandlerJson.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/MvcPerformance/HelloWorldJson > MvcJson.txt ab.exe -n100000 -c20 http://localhost/aspnetperf/api/HelloWorldJson > WebApiJson.txt ab.exe -n100000 -c20 http://localhost/AspNetPerf/WcfService.svc/HelloWorldJson > WcfJson.txt The results from this test run are a bit interesting in that the WebAPI test improved performance significantly over returning plain string content. Here are the results:   The performance for each technology drops a little bit except for WebAPI which is up quite a bit! From this test it appears that WebAPI is actually significantly better performing returning a JSON response, rather than a plain string response. Snag with Apache Benchmark and 'Length Failures' I ran into a little snag with Apache Benchmark, which was reporting failures for my Web API requests when serializing. As the graph shows performance improved significantly from with JSON results from 5580 to 6530 or so which is a 15% improvement (while all others slowed down by 3-8%). However, I was skeptical at first because the WebAPI test reports showed a bunch of errors on about 10% of the requests. Check out this report: Notice the Failed Request count. What the hey? Is WebAPI failing on roughly 10% of requests when sending JSON? Turns out: No it's not! But it took some sleuthing to figure out why it reports these failures. At first I thought that Web API was failing, and so to make sure I re-ran the test with Fiddler attached and runiisning the ab.exe test by using the -X switch: ab.exe -n100 -c10 -X localhost:8888 http://localhost/aspnetperf/api/HelloWorldJson which showed that indeed all requests where returning proper HTTP 200 results with full content. However ab.exe was reporting the errors. After some closer inspection it turned out that the dates varying in size altered the response length in dynamic output. For example: these two results: {"Id":10,"Name":"Rick","Entered":"2012-09-04T10:57:24.841926-10:00"} {"Id":10,"Name":"Rick","Entered":"2012-09-04T10:57:24.8519262-10:00"} are different in length for the number which results in 68 and 69 bytes respectively. The same URL produces different result lengths which is what ab.exe reports. I didn't notice at first bit the same is happening when running the ASHX handler with JSON.NET result since it uses the same serializer that varies the milliseconds. Moral: You can typically ignore Length failures in Apache Benchmark and when in doubt check the actual output with Fiddler. Note that the other failure values are accurate though. Another interesting Side Note: Perf drops over Time As I was running these tests repeatedly I was finding that performance steadily dropped from a startup peak to a 10-15% lower stable level. IOW, with Web API I'd start out with around 6500 req/sec and in subsequent runs it keeps dropping until it would stabalize somewhere around 5900 req/sec occasionally jumping lower. 
For these tests this is why I did the IISRESET and warm-up for individual tests. This is a little puzzling. Looking at Process Monitor while the tests are running, memory very quickly levels out, as do handles and threads, on the first test run. On subsequent runs everything stays stable, but the performance starts going downwards. This applies to all the technologies - Handlers, Web Forms, MVC, Web API - and I'm curious to see if others test this and see similar results. Doing an IISRESET then resets everything and performance starts off at peak again… Summary: As I stated at the outset, these were informal tests to satisfy my curiosity, not to prove that any technology is better or even faster than another. While there clearly are differences in performance, the differences (other than WCF REST, which was by far the slowest, and the raw handler, which was by far the fastest) are relatively minor, so there is no need to feel that any one technology is a runaway standout in raw performance. Choosing a technology is about more than pure performance; it is also about the adequacy for the job and the ease of implementation. The strengths of each technology will make up for any minor performance difference we see in these tests. However, to me it's important to get an occasional reality check and compare where new technologies are heading. Oftentimes old stuff that's been optimized and designed for a time of less horsepower can utterly blow the doors off newer tech, and simple checks like this let you compare. Luckily we're seeing that much of the new stuff performs well even in V1.0, which is great. To me it was very interesting to see Web API perform relatively badly with plain string content, which originally led me to think that Web API might not be properly optimized just yet. For those that caught my tweets late last week regarding WebAPI's slow responses: that was with string content, which is in fact considerably slower. Luckily, where it counts - with serialized JSON and XML - WebAPI actually performs better. But I do wonder what would make generic string content slower than serialized code? This stresses another point: don't take a single test as the final gospel, and don't extrapolate out from a single set of tests. Certainly Twitter can make you feel like a fool when you post something immediately that hasn't been fleshed out a little more <blush>. Egg on my face. As a result I ended up screwing around with this for a few hours today to compare different scenarios. Well worth the time… I hope you found this useful, if not for the results, maybe for the process of quickly testing a few requests for performance and charting out a comparison. Now onwards with more serious stuff… Resources: Source Code on GitHub; Apache HTTP Server Project (ab.exe is part of the binary distribution). © Rick Strahl, West Wind Technologies, 2005-2012. Posted in ASP.NET, Web API.

    Read the article

  • BizTalk Server Monitoring – SharePoint Web Part

    - by SURESH GIRIRAJAN
    I have worked with customers that use BizTalk as shared infrastructure in the enterprise, where two or more BizTalk apps run on it for different business groups. These customers are also not using the BizTalk ESB portal, even though they are using the BizTalk ESB exception framework. So the main issue for all these business groups is that they don't have visibility into the BizTalk apps running in production, even though they have SCOM and other monitoring in place. I am trying to address a few issues listed below and how I try to mitigate them: first on the list is how to get visibility into production, then how to provision access to the BizTalk resources with minimal effort, and how we can take advantage of the resources we have today. I was working on creating REST data services for BizTalk RFID a year ago (available on CodePlex), and I thought I would extend that idea to take advantage of the BizTalk Data Services also available on CodePlex. I have extended the BizTalk Data Services and will upload the updated service soon. Let me walk through how my solution works. As the first step, I use the BizTalk data service (a REST service), which exposes most of the BizTalk artifacts as resources, such as Applications, Orchestrations, Send ports, Receive ports, Host instances, in-process instances, etc. (figure: BizTalk Server Monitoring – SharePoint Web Part). I host the BizTalk data service in IIS, with the application pool configured to run under BizTalk administrator credentials. With this setup I make the service accessible anonymously. As the next step of this solution, I created a SharePoint Visual Web Part which consumes the BizTalk data service and displays all the BizTalk application and platform settings in read-only mode, even though the BizTalk data services also offer browsing of resources as well as performing actions like starting and stopping Orchestrations, Send ports, Receive locations, Host instances, etc. (figures: Host Instances, BizTalk Applications, BizTalk Running / Suspended Instances). This BizTalk monitoring SharePoint web part is then added to SharePoint. This eliminates the need for granting access to the BizTalk users explicitly: when a BizTalk contractor or a BizTalk application user needs access to the BizTalk environment, all they need is access to the SharePoint website. You can configure the web part to point to a different endpoint based on your environment. I am making this read-only to make it easier for the users and in terms of provisioning. This removes the dependency on the BizTalk admin, at least for viewing BizTalk application status, errors, etc. If we need to make any changes to a BizTalk application, then it is the application owner's responsibility to coordinate with the BizTalk admins. There are options like the BizTalk ESB portal, BizTalk360, etc., but this is one approach to reduce the number of steps required to give access to BizTalk application users and also to maximize the resources we have in the enterprise today. You can also expose this data service through the Azure Service Bus and access it from other apps like mobile devices, or create a web site hosted in Azure, etc. One last thing: I have tested this only with BizTalk Server 2010 on an x64 VM, but it should work on other versions. I will try to upload the code shortly with instructions on how to set it up. I welcome thoughts and suggestions. Hope this helps.
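
    As a rough illustration of the consuming side, and only a sketch (the service URL, the "Application" element name, and the outputLiteral control are assumed placeholders, not the actual BizTalk Data Services contract), the web part's user control could pull the anonymously exposed REST feed with a plain HTTP GET and render it read-only:

        using System;
        using System.Net;
        using System.Web.UI;
        using System.Xml.Linq;

        public partial class BizTalkMonitorUserControl : UserControl
        {
            // Placeholder endpoint - point this at wherever the BizTalk data service is hosted.
            private const string ServiceUrl = "http://biztalkmon/BizTalkDataService/Applications";

            protected void Page_Load(object sender, EventArgs e)
            {
                using (var client = new WebClient())
                {
                    // The service is exposed anonymously in this setup, so a simple GET suffices.
                    string xml = client.DownloadString(ServiceUrl);
                    XDocument doc = XDocument.Parse(xml);

                    // Render application names read-only; element name is illustrative.
                    foreach (XElement app in doc.Descendants("Application"))
                    {
                        outputLiteral.Text += Server.HtmlEncode(app.Value) + "<br/>";
                    }
                }
            }
        }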

    Read the article

  • WSPBuilder questions for first-time webpart

    - by dotnetdev
    Hi, I have made my first web part using WSPBuilder. When I try to deploy it using STSADM, I get an error stating access is denied, even though I am an admin on the machine (well, it's a VM). Also, with WSPBuilder, do I need to change the config files myself (I assume not, as the point of the tool is to automate this)? Thanks

    Read the article

  • WCF Service w/ SharePoint Error: Could not find default endpoint element that references contract...

    - by Brian Clark
    Full error message: Could not find default endpoint element that references contract 'PublicationServices.IPublicationService' in the ServiceModel client configuration section. This might be because no configuration file was found for your application, or because no endpoint element matching this contract could be found in the client element. I have a SharePoint site for which I have already opened a project in Visual Studio 2010. I also created a project that contains a WCF Service Application and added it to the same solution that contains the project for my SharePoint site. I have created a visual web part in my SharePoint project that I am trying to use to consume the WCF service. I am doing so like this, from within the user control for my web part: PublicationServiceClient proxy = new PublicationServiceClient(); Having this line alone in OnPreRender, Page_Load, etc. will generate the above error. I've read previous posts about having to copy items from the config file of the WCF service into the config file of the consuming application. I have done this: I copied that section from the Web.config file of my WCF service and placed it inside the system.serviceModel tags of my SharePoint project's app.config file. In other words, this is in both of my config files. When I add this web part to the front page of my SharePoint site, though, I get the above error every time. I should also note that I have created a console app that I was able to use with no problems to consume data from this very same WCF service. Any help would be appreciated!
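
    A likely cause, offered as a suggestion rather than a confirmed diagnosis: a web part executes inside the SharePoint web application, so a parameterless PublicationServiceClient() looks for its endpoint in that web application's web.config, not in the Visual Studio project's app.config. One workaround is to supply the binding and address in code so no configuration lookup is needed; the URL below is a placeholder:

        using System.ServiceModel;

        // Build the endpoint explicitly instead of relying on configuration.
        var binding = new BasicHttpBinding();                                    // assumes the service uses basicHttpBinding
        var address = new EndpointAddress("http://server:1234/PublicationService.svc");  // placeholder address

        PublicationServiceClient proxy = new PublicationServiceClient(binding, address);

    The alternative is to copy the client endpoint section into the web.config of the SharePoint web application itself (the one under its IIS virtual directory), since that is the configuration file the web part actually reads.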

    Read the article

  • Tellago releases a RESTful API for BizTalk Server business rules

    - by Charles Young
    Jesus Rodriguez has blogged recently on Tellago DevLabs' release of an open source RESTful API for BizTalk Server Business Rules. This is an excellent addition to the BizTalk ecosystem and I congratulate Tellago on their work. See http://weblogs.asp.net/gsusx/archive/2011/02/08/tellago-devlabs-a-restful-api-for-biztalk-server-business-rules.aspx The Microsoft BRE was originally designed to be used as an embedded library in .NET applications. This is reflected in the implementation of the Rules Engine Update (REU) Service, which is a TCP/IP service hosted by a Windows service running locally on each BizTalk box. The job of the REU is to distribute rules, managed and held in a central database repository, across the various servers in a BizTalk group. The engine is therefore distributed on each box, rather than exploited behind a central rules service. This model is all very well, but proves quite restrictive in enterprise environments. The problem is that the BRE can only run legally on licensed BizTalk boxes. Increasingly we need to deliver rules capabilities across a more widely distributed environment. For example, in the project I am working on currently, we need to surface decisioning capabilities for use within WF workflow services running under AppFabric on non-BTS boxes. The BRE does not, currently, offer any centralised rule service facilities out of the box, and hence you have to roll your own (and then run your rules services on BTS boxes, which has raised a few eyebrows on my current project, as all other WCF services run on a dedicated server farm). Tellago's API addresses this by providing a RESTful API for querying the rules repository and executing rule sets against XML passed in the request payload. As Jesus points out in his post, using a RESTful approach hugely increases the reach of BRE-based decisioning, allowing simple invocation from code written in dynamic languages, mobile devices, etc. We developed our own SOAP-based general-purpose rules service to handle scenarios such as the one we face on my current project. SOAP is arguably better suited to enterprise service bus environments (please don't 'flame' me - I refuse to engage in the RESTful vs. SOAP war). For example, on my current project we use claims-based authorisation across the entire service bus and use WIF and WS-Federation for this purpose. We have extended this to the rules service. I can't release the code for commercial reasons :-( but this approach allows us to legally extend the reach of BRE far beyond the confines of the BizTalk boxes on which it runs and to provide general-purpose decisioning capabilities on the bus. So, well done Tellago. I haven't had a chance to play with the API yet, but am looking forward to doing so.
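
    To give a feel for what calling a RESTful rules service from .NET can look like, here is a rough sketch of posting an XML fact document to a rules endpoint. The URL scheme and resource names below are invented purely for illustration and are not the actual Tellago DevLabs API surface.

        using System;
        using System.IO;
        using System.Net;
        using System.Text;

        class RuleServiceClient
        {
            static string ExecutePolicy(string policyName, string factsXml)
            {
                // Hypothetical endpoint layout: POST the facts to a policy-specific resource.
                string url = "http://rulesserver/api/policies/" + policyName + "/execute";
                var request = (HttpWebRequest)WebRequest.Create(url);
                request.Method = "POST";
                request.ContentType = "application/xml";

                byte[] body = Encoding.UTF8.GetBytes(factsXml);
                request.ContentLength = body.Length;
                using (Stream stream = request.GetRequestStream())
                {
                    stream.Write(body, 0, body.Length);
                }

                using (var response = (HttpWebResponse)request.GetResponse())
                using (var reader = new StreamReader(response.GetResponseStream()))
                {
                    // The response would carry the facts after rule execution.
                    return reader.ReadToEnd();
                }
            }
        }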

    Read the article

  • object reference is not set to an instance of an object in sharepoint while deploying the webpart us

    - by girish
    Hello, I'm new to SharePoint and I'm trying to create a "hello world" web part in Visual Studio 2008 using the SharePoint extensions. The solution builds successfully, but as soon as I start to deploy it I get the error "Object reference not set to an instance of an object". I have specified the SharePoint site URL in the project's properties. Can you please suggest why I am getting this error?

    Read the article

  • Copy folders when copying list items from source to destination

    - by iHeartDucks
    Hi, This is my code to copy files in a list from source to destination. Using the code below I am only able to copy files but not folders. Any ideas on how I can copy the folders and the files within those folders?

        using (SPSite objSite = new SPSite(URL))
        {
            using (SPWeb objWeb = objSite.OpenWeb())
            {
                SPList objSourceList = null;
                SPList objDestinationList = null;
                try
                {
                    objSourceList = objWeb.Lists["Source"];
                }
                catch (Exception ex)
                {
                    Console.WriteLine("Error opening source list");
                    Console.WriteLine(ex.Message);
                }
                try
                {
                    objDestinationList = objWeb.Lists["Destination"];
                }
                catch (Exception ex)
                {
                    Console.WriteLine("Error opening destination list");
                    Console.WriteLine(ex.Message);
                }

                string ItemURL = string.Empty;
                if (objSourceList != null && objDestinationList != null)
                {
                    foreach (SPListItem objSourceItem in objSourceList.Items)
                    {
                        ItemURL = string.Format(@"{0}/Destination/{1}", objDestinationList.ParentWeb.Url, objSourceItem.Name);
                        objSourceItem.CopyTo(ItemURL);
                        objSourceItem.UnlinkFromCopySource();
                    }
                }
            }
        }

    Thanks
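
    For the folder part of the question, one possible direction (an untested sketch, not a verified answer) is to walk the library's folder tree with SPFolder instead of enumerating list items. The helper below assumes the same Source and Destination libraries and skips the hidden Forms folder that document libraries carry.

        using System;
        using Microsoft.SharePoint;

        static class FolderCopier
        {
            // Recursively copies every file and subfolder from one SPFolder to another.
            public static void CopyFolder(SPFolder source, SPFolder destination)
            {
                // Copy the files at the current level, overwriting if they already exist.
                foreach (SPFile file in source.Files)
                {
                    destination.Files.Add(file.Name, file.OpenBinary(), true);
                }

                // Recurse into subfolders, creating them on the destination side first.
                foreach (SPFolder child in source.SubFolders)
                {
                    if (child.Name.Equals("Forms", StringComparison.OrdinalIgnoreCase))
                    {
                        continue; // skip the hidden Forms folder of a document library
                    }
                    SPFolder target = destination.SubFolders.Add(child.Name);
                    CopyFolder(child, target);
                }
            }
        }

        // Example call, reusing the lists already opened in the question:
        // FolderCopier.CopyFolder(objSourceList.RootFolder, objDestinationList.RootFolder);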

    Read the article

  • How to call webservice using same credentials as Sharepoint?

    - by Saab
    Is it possible to do a webservice call from within an Excel sheet that has been downloaded from a SharePoint server, using the same credentials as the ones used for accessing the SharePoint server? We're currently developing an Excel solution which makes webservice requests from within the Excel sheet. This works fine, but the user has to log in at least twice: once to download/open the Excel sheet from SharePoint, and once to execute the webservice with the right credentials. The SharePoint server and the client machine are not in the same Active Directory domain, so "System.Security.Principal.WindowsIdentity.GetCurrent()" is not an option, since it would return a user that doesn't exist on the server.
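
    One possible direction, sketched under assumptions: since the client machine is not in the server's domain, the locally logged-on identity cannot be reused, so the credential the user typed when opening the SharePoint site could be captured once and attached explicitly to the service call. The account, password, domain and authentication scheme below are placeholders; a generated ASMX proxy exposes the same idea through its Credentials property, and a WCF client through ClientCredentials.Windows.ClientCredential.

        using System;
        using System.Net;

        static class PublishedServiceCaller
        {
            // Calls the web service with an explicitly supplied Windows credential rather
            // than the identity of the locally logged-on user.
            public static string Call(string serviceUrl, string user, string password, string domain)
            {
                var credential = new NetworkCredential(user, password, domain);

                // Register the credential for NTLM authentication against the service URL.
                var cache = new CredentialCache();
                cache.Add(new Uri(serviceUrl), "NTLM", credential);

                using (var client = new WebClient())
                {
                    client.Credentials = cache;
                    return client.DownloadString(serviceUrl);
                }
            }
        }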

    Read the article

  • Backup Azure Tables with the Enzo Backup API

    - by Herve Roggero
    In case you missed it, you can now backup (and restore) Azure Tables and SQL Databases using an API directly. The features available through the API can be found here: http://www.bluesyntax.net/backup20api.aspx and the online help for the API is here: http://www.bluesyntax.net/EnzoCloudBackup20/APIIntro.aspx. Backing up Azure Tables can’t be any easier than with the Enzo Backup API. Here is a sample code that does the trick:

        // Create the backup helper class. The constructor automatically sets the SourceStorageAccount property
        StorageBackupHelper backup = new StorageBackupHelper("storageaccountname", "storageaccountkey", "sourceStorageaccountname", "sourceStorageaccountkey", true, "apilicensekey");

        // Now set some properties…
        backup.UseCloudAgent = false;                       // backup locally
        backup.DeviceURI = @"c:\TMP\azuretablebackup.bkp";  // to this file
        backup.Override = true;
        backup.Location = DeviceLocation.LocalFile;

        // Set optional performance options
        backup.PKTableStrategy.Mode = BSC.Backup.API.TableStrategyMode.GUID; // Set GUID strategy by default
        backup.MaxRESTPerSec = 200; // Attempt to stay below 200 REST calls per second

        // Start the backup now…
        string taskId = backup.Backup();

        // Use the Environment class to get the final status of the operation
        EnvironmentHelper env = new EnvironmentHelper("storageaccountname", "storageaccountkey", "apilicensekey");
        string status = env.GetOperationStatus(taskId);

    As you can see above, the code is straightforward. You provide connection settings in the constructor, set a few options indicating where the backup device will be located, set optional performance parameters and start the backup. The performance options are designed to help you back up your Azure Tables quickly, while attempting to stay under a specific threshold to prevent Storage Account throttling. For example, the MaxRESTPerSec property will attempt to keep the overall backup operation under 200 REST calls per second. Another performance option is the backup strategy for Azure Tables. By default, all tables are simply scanned. While this works best for smaller Azure Tables, larger tables can use the GUID strategy, which will issue requests against an Azure Table in parallel assuming the PartitionKey stores GUID values. This doesn’t mean that your PartitionKey must contain GUIDs for this strategy to work, but the backup algorithm is tuned for that condition. Other options are available as well, such as filtering which columns, entities or tables are being backed up. Check out more on the Blue Syntax website at http://www.bluesyntax.net.

    Read the article

  • Problem in implementing IAlertUpdateHandler interface

    - by TheVillageIdiot
    I've implemented the IAlertUpdateHandler interface in a class and used it for handling the creation and updating of alerts. The code fires, but it goes into an endless loop by calling itself again and again. What I actually want is to suppress the email notification, so I'm calling a.Update(false); but this in turn calls PreUpdate or PostUpdate again and results in a StackOverflowException :( I've tried returning true/false from both methods but nothing helps.

    Read the article

  • moss 2007 workflows

    - by nav
    Hi, I'm new to MOSS 2007. I need to create a workflow that looks at a document's review date (a select list predefined to values of 3, 6 or 12 months) and sends an email if the review date has passed. So the workflow needs to get the document's review period, convert it to a date/time, add it to the created date, and send an email if that date has already passed. Can anyone tell me if this is possible using SharePoint Designer to create the workflow? I'd be grateful for any pointers. Many Thanks, Nav
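
    If a code-based check (for example in a custom workflow activity or timer job) ever becomes an option alongside the Designer workflow, the logic itself is small. A hedged sketch follows; the "ReviewPeriodMonths" field name and the recipient address are assumptions, not values from the question.

        using System;
        using Microsoft.SharePoint;
        using Microsoft.SharePoint.Utilities;

        static class ReviewDateChecker
        {
            // Sends a notification if the document's review window (3, 6 or 12 months
            // after creation) has already passed.
            public static void CheckReviewDate(SPListItem document)
            {
                int reviewMonths = Convert.ToInt32(document["ReviewPeriodMonths"]); // assumed field name
                DateTime created = (DateTime)document["Created"];
                DateTime reviewDue = created.AddMonths(reviewMonths);

                if (DateTime.Now > reviewDue)
                {
                    SPUtility.SendEmail(
                        document.Web,
                        false,                                          // appendHtmlTag
                        false,                                          // htmlEncode
                        "document.owner@example.com",                   // placeholder recipient
                        "Review date passed: " + document.Title,        // subject
                        "The review date for this document has passed."); // body
                }
            }
        }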

    Read the article

  • Issues with SharePoint 2010 Development

    - by Rahul Soni
    I am planning to use my laptop for SharePoint 2010 development and I have only 4 GB RAM, which is not even upgradable. Because of this RAM constraint, VS 2010 keeps crawling if I try to run it along with SharePoint 2010 on the same machine. Hence, I've reformatted my machine and am looking for alternate solutions until I get a new laptop. Currently, I have installed VS 2010 ONLY on my laptop and wanted to create an empty SharePoint project. Once done with my project, I want to deploy it on a different machine (which is a 4 GB RAM machine as well, but contains only SharePoint 2010). I thought this would work and give me a bit of breathing room if everything is configured well. Unfortunately, when I tried creating a new SharePoint Empty Project in VS 2010, it says... A SharePoint server is not installed on this computer. A SharePoint server must be installed to work with SharePoint projects. Is there a way out?

    Read the article

  • Is a SharePoint developer technically "equipped" to do custom app dev and vice versa?

    - by Amir
    This may be an opinion-based question, but it's something that I wanted to ask (even if it does end up getting closed or deleted). I do custom app dev (asp.net/aspMVC) and have absolutely no knowledge about SharePoint, and I was wondering: If you have a "rock solid" custom app dev asp.net/aspMVC web developer, can he jump into SharePoint development fairly easily? What about the other way around? Does a seasoned SharePoint developer have the "chops" to do custom app dev using asp.net/aspMVC? By no means do I want to offend any SharePoint developers or any custom app dev developers. I'm merely trying to see how much knowledge you can take with you when going from one type of development to another.

    Read the article

  • SharePoint Designer: how do I disable auto insertion of image size attributes?

    - by David Högberg
    I'm hand-editing HTML files in a plain text editor (vim) via SharePoint Designer. Problem is, as soon as I save the files, SharePoint automatically adds width and height attributes to all the img-tags. Anyone know if it's possible to disable this "feature"? I don't want it to mess around with my code. Yeah, shouldn't be using SharePoint Designer then, I know - problem is that's not an option.

    Read the article

  • How can I tell if a candidate is a good Sharepoint Architect / Developer ?

    - by driis
    I need to interview some people for a position as a SharePoint Architect / Developer. While I am proficient in .NET, I have worked very little with SharePoint, so I am unsure how to test the candidates' SharePoint skills. Do you have any suggestions for tests I can throw at the candidates? Please suggest questions I can ask the candidates, and please specify whether your question is "must know" knowledge for a SharePoint developer. Please include the answer to your question.

    Read the article

  • Pop-up and font colour problems in a form designed in SharePoint.

    - by ephieste
    I am designing a system in SharePoint via SharePoint Designer. We have a form on my SharePoint site. Users have to fill in some fields in the form and send it to the approval committee. We cannot upload anything to the servers; the design is site based. Our problems are: 1- I want to add small (?) icons for the descriptions of each field. When the user clicks the (?) icon for the "brief description" field, a pop-up or another window should open and say something like: "Enter a description of the requested thing. Be as specific as possible." 2- I want to change the font colours of the fields in the form. SharePoint renders them black by default. For example, I want to see "Brief description:" and "Status:" in purple instead of black. Brief description: ..... Status: ..... 3- I want to add an agreement pop-up that opens just after clicking the "send" button in the form. The pop-up should say: "Are you sure that you read the procedure?" The user has to click "Yes" to continue sending the form; otherwise it will return to the previous screen. I would be glad if you could kindly help me.

    Read the article
