  1. Community - Microsoft Azure

    Powered by Dynamics 365 Customer Service. Learn more here. Privacy Terms of use © Microsoft 2021

  2. Storage · Community - feedback.azure.com

    Using Azure Search, you can index and search Table Storage data (using full text search, filters, facets, custom scoring, etc.) and capture incremental changes in the data on a schedule, all …

  3. Sign in to your account - Microsoft Azure

    No account? Create one! Can’t access your account?

  4. Bing Search only has fuzzy responses - social.msdn.microsoft.com

    Jul 11, 2018 · We're trying to use the Bing Search API, but it seems that there is no way to do an exact search. Even using quotes returns fuzzy results, not only exact matches. Is there some …

  5. Community - feedback.azure.com

    You’re offline. This is a read-only version of the page. Microsoft Azure | Share your Ideas Community

  6. Top Search Keywords/Views in Azure Web Application

    Jul 20, 2011 · How can such functionality be developed in Azure? At search time, should I save records and then prepare an analysis record using a Worker Role?

  7. How to validate users on AD in azure application

    Apr 14, 2013 · In my application, the alias that users enter in the UI is validated directly against AD. After moving the application to Azure, it no longer runs in our corp domain; is it still possible to …

  8. B2B Guest user collaboration between two tenants

    Aug 10, 2017 · I am trying to test federation between two Office 365 tenants, i.e., users from one company should be able to easily see users from the other in the Global Address Book and vice versa. Users …

  9. Need an on premise alternative to Azure Blob storage for …

    Jan 7, 2015 · My team is planning to write an application which stores BLOB data on Azure BLOB storage. For this, initially we are planning to look for an on-premise BLOB storage application/tools …

  10. Is it possible to append data for file in ADLS using ADF copy activity?

    Apr 13, 2020 · I have a pipeline which runs every hour; for each run it should read a file from the source directory and merge it into the existing file in Azure Data Lake Store rather …