Delta Lake is the default for all reads, writes, and table creation commands in Databricks Runtime 8.0 and above. On Databricks Runtime 7.3 LTS, you can use the `delta` keyword to specify the format explicitly.

Python

```python
# Load the data from its source.
df = spark.read.load("/databricks-datasets/learning-spark-v2/people/lta")

# Save the DataFrame as a managed table (table name taken from the
# DESCRIBE DETAIL example below).
df.write.saveAsTable("people_10m")
```

R

```r
df = read.df(path = "/databricks-datasets/learning-spark-v2/people/lta")
saveAsTable(df, "people_10m")
```

Scala

```scala
val people = spark.read.load("/databricks-datasets/learning-spark-v2/people/lta")
people.write.saveAsTable("people_10m")
```

SQL

```sql
CREATE TABLE people_10m
AS SELECT * FROM delta.`/databricks-datasets/learning-spark-v2/people/lta`
```

The preceding operations create a new managed table by using the schema that was inferred from the data. For information about the available options when you create a Delta table, see CREATE TABLE.

For managed tables, Azure Databricks determines the location for the data. To get the location, you can use the DESCRIBE DETAIL statement, for example:

Python

```python
display(spark.sql("DESCRIBE DETAIL people_10m"))
```

R

```r
display(sql("DESCRIBE DETAIL people_10m"))
```
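As a sketch of how the DESCRIBE DETAIL output can be consumed programmatically rather than just displayed, the following assumes a running SparkSession bound to `spark` and an existing `people_10m` Delta table; `location` is one of the columns in the DESCRIBE DETAIL result schema:

```python
# Environment-dependent sketch: assumes `spark` and the `people_10m`
# Delta table exist on the cluster.
detail_row = spark.sql("DESCRIBE DETAIL people_10m").collect()[0]

# The `location` column holds the storage path Azure Databricks chose
# for this managed table (e.g. an abfss:// or dbfs:/ path).
print(detail_row["location"])
```

Because DESCRIBE DETAIL returns a one-row DataFrame, `collect()[0]` is enough to get at any of its columns (`format`, `numFiles`, `sizeInBytes`, and so on) from driver-side code.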